Columbia Engineering researchers develop way to combat VR sickness

Columbia Engineering Professor Steven K. Feiner and Ajoy Fernandes MS’16 say they have developed a method of combating virtual reality (VR) sickness that can be applied to consumer head-worn VR displays, such as the Oculus Rift, HTC Vive, Sony PlayStation VR, and Google Cardboard.

Their approach dynamically changes the user's field of view (FOV) in response to visually perceived motion as the user virtually traverses an environment while remaining physically stationary. By strategically and automatically manipulating FOV, the researchers significantly reduced the degree of VR sickness that participants experienced. They say they accomplished this without decreasing the participants' sense of presence in the virtual environment, and without most participants even being aware of the intervention.

The team specifically targeted scenarios in which users move in the virtual environment in a way that intentionally differs from how they move in the real world. These include games in which they are physically standing or sitting on a couch in their living room, while walking, running, driving, or flying in the virtual world. In these scenarios, the visual motion cues that users see are at odds with the physical motion cues that they receive from their inner ears’ vestibular system, the cues that provide us with our sense of motion, equilibrium, and spatial orientation. When the visual and vestibular cues conflict, users can feel quite uncomfortable, even nauseated.

In many cases, decreasing the field of view can decrease these symptoms. But decreasing FOV can also decrease the user's sense of presence in the virtual environment, making the experience less compelling. Feiner and Fernandes therefore focused on subtly decreasing FOV in situations where a larger FOV would be likely to cause VR sickness (when the mismatch between physical and virtual motion increases) and restoring it when VR sickness is less likely to occur (when the mismatch decreases). They developed software that functions as a pair of "dynamic FOV restrictors," partially obscuring each eye's view with a virtual soft-edged cutout. They then determined how much the user's field of view should be reduced, how quickly it should be reduced and restored, and tested the system.
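The control loop described above can be sketched in a few lines: estimate the mismatch between virtual and physical motion, map it to a target FOV, and ease the current FOV toward that target, narrowing quickly but restoring slowly so the change stays subtle. This is a minimal illustration of the general idea; all names, parameter values, and rates below are assumptions for the sketch, not the values or algorithm used by Fernandes and Feiner.

```python
# Illustrative dynamic FOV restrictor. FOV limits, rates, and the
# mismatch-to-FOV mapping are assumed values, not the published ones.

FOV_MAX = 110.0      # degrees: unrestricted field of view
FOV_MIN = 80.0       # degrees: most restricted field of view
NARROW_RATE = 40.0   # deg/s when narrowing (react quickly to mismatch)
RESTORE_RATE = 10.0  # deg/s when restoring (widen slowly, subtly)

def target_fov(virtual_speed, physical_speed, max_mismatch=3.0):
    """Map the virtual/physical motion mismatch (m/s) to a target FOV."""
    mismatch = abs(virtual_speed - physical_speed)
    t = min(mismatch / max_mismatch, 1.0)   # normalize to [0, 1]
    return FOV_MAX - t * (FOV_MAX - FOV_MIN)

def update_fov(current_fov, virtual_speed, physical_speed, dt):
    """Advance the FOV one frame toward the target, rate-limited."""
    target = target_fov(virtual_speed, physical_speed)
    rate = NARROW_RATE if target < current_fov else RESTORE_RATE
    step = rate * dt
    if abs(target - current_fov) <= step:
        return target
    return current_fov + step if target > current_fov else current_fov - step
```

In use, a renderer would call `update_fov` once per frame and size each eye's soft-edged cutout from the result, so the restriction tightens during fast virtual locomotion and relaxes when the user slows or stops.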

This research was funded in part by the National Science Foundation under Grants IIS-0905569 and IIS-1514429.

[Video courtesy: Ajoy Fernandes and Steve Feiner/Columbia Engineering]
