
The (very) dark side of virtual reality

It can be used for games, for empathy, for therapy. But it could also be used for torture.

By Heather Kapplow

A virtual reality headset covers my eyes, the view in all directions is entirely black, and I hear guns being cocked and a voice giving instructions to attack. Weapons fire at a distance, followed by the sound of a door being broken down.

A moment later, I can see again. I find myself across a table from a bearded, blindfolded man with blood on his shirt. A gun and a camera lie on the table between us. Someone else is in the room too, whimpering.

This scene, which I’m experiencing all too vividly, is set in Evin Prison in Tehran, Iran. It’s part of Blindfold, an unsettling virtual reality experience created by New York-based indie gaming studio Ink Stories. Blindfold is a highly realistic simulation of interrogations that journalists working in conflict regions often undergo when detained.

Ink Stories worked closely with the Center for Human Rights in Iran and the Committee to Protect Journalists to make the simulation this harrowing, using techniques from documentary filmmaking and video game development, along with details and textures from the real-life experiences of journalists and others who’ve been freed from Evin. When an interrogator grills you in Blindfold — you’re commanded to nod your head yes or no — it feels genuinely chilling.

But the goal isn’t just to build realism, says Ink Stories’ founder, Navid Khonsari. It’s to create a visceral sensation of empathy, and shed light on the kinds of decisions that Blindfold forces its players to make.

The game has been used to de-glamorize war correspondence for journalism students, giving them a realistic taste of the situations they could face in the field. But the Blindfold experience also offers a wider warning. More and more, ethicists and VR practitioners are asking whether the techniques that make a program like Blindfold so immersive and realistic could someday be used not just to depict psychological torture, but to inflict it.


Virtual-reality technology — like the internet — emerged out of research by military organizations, for warfare-related purposes. Flight simulation, used to train American fighter pilots for battle, goes all the way back to World War I, with motion and computer graphics first integrated into military simulations in the early 1970s. Attempts at immersive entertainment, such as the ambitious but unsuccessful Sensorama Simulator, date back to the mid-1950s, but VR really began thriving as computer graphics were refined. By the time Virtual Reality got its name — from two former Atari programmers who left to start the first VR company in the mid-1980s — all of the functions of current VR were possible, if ungraceful.

Now, the applications are broad and varied, ranging from virtual real estate tours to research and development in manufacturing, training for surgeons, and entertainment. As the graphics get smoother and the interfaces ever more seamless, the line between “virtual reality” and actual reality is starting to grow thinner.

For instance, one cutting-edge area of exploration in VR is a field known as “embodiment technology.” In the mid-2000s, the European Union put almost 10 million euros toward academic research into “dissolving the boundary between the human body and surrogate representations in immersive virtual reality and physical reality.”

Around the same time, BeAnotherLab, an international collective of artists and scientists, created a low-budget interactive installation called Machine to Be Another, often displayed at galleries and festivals. The viewer wears a VR headset while an actor wearing a 3D camera transmits images back, so the viewer literally sees the world through someone else’s eyes. The physical boundaries between the two people blur as well: the actor mirrors the viewer’s movements, touching items, for instance, at the same moment the viewer touches them.

Like Blindfold, Machine to Be Another was designed for altruistic purposes. The program has been used to help viewers experience someone else’s perspective — often, a victim of prejudice, such as a migrant, an asylum-seeker, or a transgender person. But at a certain point in their work, BeAnother members began to raise concerns about the technology’s potential to trigger more frightening neurological reactions. In 2016, after discovering scientific studies about “body transfer illusion” and “vicarious pain” — phenomena that cause us to perceive that we’re feeling another body’s sensations, including its physical suffering — some BeAnother members told the magazine Kill Screen that virtual reality held “the potential to induce severe pain or suffering, whether physically or mentally.”

Today, BeAnother doesn’t talk as much about those concerns. Christian Cherene, one of the group’s founders, says members decided that even speculating about misuse of VR technology could give bad actors nefarious ideas.

But Cherene notes that BeAnother’s basic approach of “hacking sensory motor perception” has already been used by people looking to inflict torture while adhering to the letter of the law. He points to waterboarding — a technique of simulated drowning, classified internationally as torture, used by the U.S. at Guantanamo Bay in the early 2000s.

“Waterboarding is hacking your sensory motor perception as well,” he says. “You’re not, legally speaking, drowning someone, and even the person being waterboarded knows intellectually that they aren’t drowning. But from the body’s perspective, it’s drowning.”

Virtual reality can be used to trick the body in similar ways, Cherene says.

“You can create experiences that aren’t necessarily really what is happening,” he says. “That allows you to maybe skirt around certain legal situations that might inhibit you otherwise.”


Within the engineering field, there is growing urgency around setting ethical standards for the development of VR technology. Albert “Skip” Rizzo, director for medical virtual reality at the University of Southern California’s Institute for Creative Technologies, is part of a working group that has written ethical guidelines for artificial intelligence and virtual reality products for the Institute of Electrical and Electronics Engineers (IEEE). (The institute bills itself as “a recognized leader in the development of virtual humans who look, think, and behave like real people.”)

Rizzo’s own work proves that VR, used ethically, has great potential to do good — even when it’s simulating traumatic experiences. His immersive video games, developed in collaboration with the U.S. Army Research Laboratory, provide exposure therapy — the process of slowly re-introducing and acclimating patients to traumatic stimuli — to help veterans recover from battlefield-induced PTSD.

IEEE expects its members to uphold a code of ethics that echoes medicine’s Hippocratic Oath, or risk damaging their professional reputations. The group’s guide, Ethically Aligned Design: A Vision for Prioritizing Human Well-being with Autonomous and Intelligent Systems, suggests requiring all VR interface designers to include an always-present way for a user to “opt out” of any immersive environment, as well as new ways for people to include their preferences for data sharing in their user profiles. (As these products evolve, they are expected to become increasingly tailored to individual users, in much the same way that social media or Netflix are.)  

In Rizzo’s games, patients sit in physical “jeeps” that resemble amusement park rides, watching animated videos that simulate battlefield experiences. Therapists can adjust a scenario’s time of day, landscape, setting, and the languages the programs’ virtual humans speak, recreating the traumatic moment that still troubles the veteran.

While the VR technology is a bit clunky, Rizzo says extensive research has proven that it’s effective in reducing PTSD symptoms. “It doesn’t have to be an exact replica of reality for people to have an emotional response to it,” he explains.

Today’s technology is still primitive enough to feel demonstrably “virtual,” Rizzo says. As realistic as the details of a game like Blindfold are, the physical nature of the apparatus is enough of a hurdle that the brain’s frontal lobe, which keeps people oriented in space and time, is never tricked into believing it is actually in Evin Prison.

But he can foresee a future when the boundaries are less clear.

“Say, for example, instead of having to strap a headset on you, they knock you out and implant a certain type of contact lens onto your eyes. And you don’t know what’s real and what isn’t,” he says. “Now you’ve got a system for torture.”

It’s hard to predict when technology this immersive will be possible. And perhaps that’s not the point, Rizzo suggests.

“The whole issue is in who has control over the virtual reality,” Rizzo says. “If you put someone in a virtual reality gas chamber, you could do that to torture someone, or you could do it as an exercise in empathy-building.”

The potential for harm raises questions that many casual VR users haven’t even begun to confront, about who will regulate and drive the technology going forward, and who — governments, developers, investors, consumers — will be responsible for setting ethical boundaries.

As Ink Stories asks Blindfold’s players, so early adopters of virtual reality must ask: “Will you accept the full weight of your decisions?” 

Heather Kapplow is a writer based in Boston.

 

Photograph by Andrzej Wojcicki/Science Photo Library
