Trips to virtual worlds are not always pleasant for women who try them for the first time
The metaverse is still in the concept stage, but the latest attempts to create virtual worlds already face an age-old problem: harassment.
Bloomberg technology columnist Parmy Olson told the BBC's Tech Tent program about her own "creepy" experiences.
And a woman compared her own traumatic experience in virtual reality with sexual abuse.
Meta, Mark Zuckerberg's company, has announced a new feature called Personal Boundary, which began rolling out on February 4. It prevents avatars from coming within a set distance of one another, creating more personal space for people and making it easier to avoid unwanted interactions.
This feature prevents others from “invading your avatar’s personal space,” Meta said.
"If someone tries to enter your Personal Boundary, the system will halt their forward movement when they reach that limit."
It is available in Meta's Horizon Worlds and Horizon Venues software.
The firm said it was a “powerful example of how virtual reality has the potential to help people interact comfortably,” but acknowledged that there is more work to be done.
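Meta has not published how Personal Boundary is implemented; a minimal sketch of the mechanic as described (forward movement halted at a fixed radius around another avatar) might look like the following. The function name, the 2D simplification, and the 1.2-metre radius are illustrative assumptions, not Meta's actual code or parameters.

```python
import math

BOUNDARY_RADIUS = 1.2  # metres of personal space (assumed value, not Meta's)

def clamp_approach(mover, target, radius=BOUNDARY_RADIUS):
    """Return the mover's allowed (x, y) position: if a step would cross the
    target's personal boundary, stop the mover on the boundary instead."""
    dx, dy = mover[0] - target[0], mover[1] - target[1]
    dist = math.hypot(dx, dy)
    if dist >= radius or dist == 0:
        return mover  # outside the boundary (or degenerate overlap): unchanged
    # Push the mover back out to the boundary, along the same direction.
    scale = radius / dist
    return (target[0] + dx * scale, target[1] + dy * scale)
```

A position outside the radius passes through untouched; one inside is projected back onto the boundary circle, which matches the described behaviour of stopping forward movement at the limit.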
For some, the news will be welcome.
"I had some moments where it was uncomfortable for me as a woman," Olson said of her VR interactions.
I was visiting Meta's Horizon Worlds, its virtual reality platform where anyone over the age of 18 can create an avatar and hang out.
To do so, users need one of Meta's virtual reality headsets, and the space lets them play and chat with other avatars, none of which has legs.
"I could see right away that I was the only woman, the only female avatar. And these men surrounded me and watched me in silence," Olson told Tech Tent.
"Then they started taking pictures of me and handing them to me, and at one point a guy came up to me and said something," she continued.
"In virtual reality, if someone is close to you, their voice sounds as if they were literally talking in your ear. And it took me by surprise," she added.
Olson experienced similar discomfort on Microsoft's social virtual reality platform.
"I was talking to another woman, and within a few minutes of chatting a guy came up, started talking to us, and kept saying inappropriate things, and we had to block him," she said.
“Since then I’ve heard of other women who have had similar experiences,” she said.
The tech columnist said that although she would not describe it as harassment, it was "creepy and uncomfortable".
Sexual abuse
Nina Patel and her avatar at Horizon Venues, where she was attacked by a group of participants in the Facebook metaverse
Nina Jane Patel went much further, telling the Daily Mail days ago that she was abused at Horizon Venues, comparing the experience to sexual assault.
Patel described how a group of male avatars "touched" her and subjected her to a string of sexual advances. They photographed her and sent her a message that said, "Don't pretend you didn't love it."
Meta responded to the newspaper, apologizing: "We want everyone to have a positive experience and easily find the safety tools that can help in a situation like this, and help us investigate and take action."
Metaverse opportunities and threats
Meta CTO Andrew Bosworth says there can be a balance between privacy and security in virtual spaces
Moderating content in the nascent metaverse will be a challenge, and Meta's chief technology officer, Andrew Bosworth, has admitted it will offer "greater opportunities and greater threats".
“It could seem a lot more real to me if you were abusing me, because it feels a lot more like a physical space,” he said in an interview with the BBC late last year.
But he assured that people in virtual roles would have “much more power” over their environments.
“If I were to silence you, you would cease to exist for me and your ability to harm me would be immediately nullified,” he said.
Bosworth questioned whether people would want the kind of moderation that exists on platforms like Facebook when they have virtual reality chats.
"Do you really want the system, or a person, listening in on you? Probably not," he said, answering his own question.
"So I think we have a privacy trade-off: if you want a high degree of content safety, security, or what we would call integrity, well, that is traded off against privacy."
And in Meta's vision of the metaverse, where different rooms are run by different companies, the trade-off becomes even more complex as people move from the Meta-controlled virtual world into others.
"I cannot guarantee the privacy or integrity of that conversation," he said.
Olson agreed that it was going to be “a very difficult thing to fix for Facebook, Microsoft and others.”
"When you're scanning text for hate speech, it's difficult but it can be done; you can use machine learning algorithms," she said.
"Processing visual information about an avatar, or how close avatars are to each other, by contrast, is going to be very computationally expensive; it will consume a lot of computing power. I don't know what technology can do that," she reasoned.
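Olson's scaling concern can be illustrated with a toy sketch (not any platform's real code): a naive proximity monitor must compare every pair of avatars in a room, so the number of checks grows quadratically with the number of users. The function name and threshold below are hypothetical.

```python
import math
from itertools import combinations

def too_close_pairs(positions, threshold):
    """Return index pairs of avatars closer than `threshold` units.

    With n avatars this performs n*(n-1)/2 distance checks, which is the
    quadratic cost Olson alludes to; real systems would need spatial
    indexing (grids, trees) to keep this tractable at scale.
    """
    pairs = []
    for (i, a), (j, b) in combinations(enumerate(positions), 2):
        if math.dist(a, b) < threshold:
            pairs.append((i, j))
    return pairs
```

Even this only detects proximity; judging whether behaviour is abusive would require interpreting gestures, audio, and context on top of it, which is the harder part of the problem.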
The ethics of virtual worlds
Meta's legless avatars can have all kinds of experiences, private or communal, in the metaverse
Facebook is investing US$10 billion in its metaverse plans, and part of that will need to go toward building new ways to moderate content.
"We've learned a lot in the last 15 years of Internet discourse, so we're going to bring all that knowledge to do the best we can to build these things from scratch, to give people a lot of control over their own experience," Bosworth told the BBC.
Beth Singler, an anthropologist at the University of Cambridge who has studied the ethics of virtual worlds, said: “Facebook has already failed to learn about what happens in spaces on the Internet. Yes, they have changed some of their policies, but there is still material out there that shouldn’t be there.”
There is more to learn from games, she thinks: Second Life and World of Warcraft have offered virtual worlds for years, limiting whom players can talk to and the names they can choose for themselves.
Meta's decision to use legless avatars may also be deliberate, she says: most likely a technical choice driven by the lack of leg sensors, but possibly also a way to limit "waist-down" problems that could arise with a full physical presence.
However, having strict rules about how avatars look can bring its own problems for those who "try to express a certain identity," she added.