When the virtual world becomes real

When online encounters start to feel physical, safety and trust become more urgent. How can we make sure virtual worlds remain safe for everyone? This question was one of the topics explored during the Spring School on Social XR, organized by CWI’s Distributed and Interactive Systems group.

Social XR - extended reality in a social setting - allows physical and virtual worlds to blend, making distances between people almost disappear. Imagine a family member on the other side of the world, someone you now talk to over a video call. As the technology develops, that person could one day appear to be sitting next to you at the kitchen table in a virtual 3D form, or you could walk through a virtual museum together.

That scenario is not quite within reach yet. But for many years, it has been possible to step into the body of an avatar and enter an online world where you can meet like-minded people. During the Spring School on Social XR, four researchers gave public lectures on both the promise and the risks of these developments.

Inside the content

In her talk, Professor Yvette Wohn explored safety and trust in embodied online spaces. Wohn leads the Social Interaction Lab at the New Jersey Institute of Technology and studies human-computer interaction from both psychological and sociological perspectives.

“Some people talk about the virtual world and the ‘real’ world. But what happens in virtual reality is also real,” she says. “It is not the same as reading a book or watching a film. In those cases, you separate reality from content. In XR, you are inside the content. Because of that embodied aspect, many of the issues you encounter outside the virtual world are almost the same inside it.”

Wohn’s own introduction to VR began in the early 2000s with Second Life, the 3D virtual world where users could socialize, build and trade through an avatar. “As with almost any technology, you can do good and creative things with it, but also bad things,” says Wohn, who has a home there herself. She and the creators of Second Life soon found that rules were needed: next to her virtual home, someone built a tower covered in advertisements for escort and other sexual services.


Harassment gets a body

As in the 2D online world, problems such as harassment, bullying, and illegal or harmful content began to appear. “Harassment already existed in the early days of the internet,” says Wohn. “Back then, it took the form of text messages. In Social XR, it becomes embodied.”

She gives the example of a female executive who was harassed in a virtual world. A group of men surrounded her avatar so that she could no longer move, while subjecting her to sexually explicit comments. Meta subsequently introduced a four-foot personal boundary around avatars, designed to prevent this kind of blocking.

But many online users are more vulnerable than someone in a position to speak out publicly, Wohn notes. This includes children, minorities, people from the LGBTQI+ community, and people who feel socially awkward and may be more comfortable interacting online.

“What makes them especially vulnerable is the data linked to their personal profile, which is not visible to others: when and where they log in, and other details in their profile.” There are plenty of examples, she says, of malicious actors gaining access to such information.

Haptics adds another layer. If the technology becomes widely available, users may eventually be able to feel and touch in virtual worlds through wearable devices. That also creates new risks. “It means you could touch someone, push someone, or even hit someone. Or someone could take over your haptic devices and cause discomfort or unwanted sensations.”

Moderation is not enough

“How much freedom will you allow?” Wohn asked during her lecture. Restricting users to pre-written sentences may seem safe, she noted, but often creates frustration rather than meaningful protection.

Moderation and AI detection can help, but only up to a point. Platforms generate more content than humans can check, while users quickly find ways around automated filters, for example through alternative spellings or emojis.

Part of the answer, Wohn argues, lies in the design of the online environment. Platforms are starting to label AI-generated content, so that users know what is real and what is not. They can also introduce parental supervision settings and age verification before certain content can be accessed. “It is important that adults can do adult things in XR, but with a safe space for minors.”


The designers of technologies such as haptics should also think about these issues in advance, she says. “Technology always comes with a risk. Developers should ask themselves: how will people use this?”

Education

Better detection algorithms, clearer rules, design mechanisms that discourage harmful behaviour, and consequences for misconduct - such as banning someone from a platform - can all contribute to safer online spaces. But they are not enough on their own.

As Wohn puts it: “Companies do not fix humans.” Her research shows that education is the most effective approach. All parties need to understand the technology, she argues: the company that sells it, the factory that produces it, the users, the parents. “A more concentrated effort is needed to educate people online.”

Together with her colleagues, Wohn is looking at the role of parents and teachers. How can adults talk to young people about harmful online behaviour without unintentionally giving them new ideas?

Small communities

She also argues for micro-communities: smaller online spaces where people can connect with peers and agree on their own norms. In very broad online environments, she says, it is almost impossible to create rules that feel right to everyone.

A word of advice

“If people think of a virtual world as a world that is not real, they are wrong. VR is very real,” says Wohn. “One of my first studies was about people having socially intimate relationships in the virtual world. For them, it started as something virtual, but the dating was real, with real emotions. If you look at it that way, you may be more careful in VR.”

Her work deals with both the harmful and the hopeful sides of online interaction. “My research is about love and hate,” she says. “Technology can be used for bad and for good. It is about how people use it.”

Header photo: Minnie Middelberg

