Mar 01 2024
Virtual Walking
When I use my virtual reality gear I do practically zero virtual walking – meaning that I don’t have my avatar walk while I am not walking. I generally play standing up, which means I can move around the space in my office mapped by my VR software – so I am physically walking to move in the game. If I need to move beyond the limits of my physical space, I teleport – point to where I want to go and instantly move there. The reason for this is that virtual walking creates severe motion sickness for me, especially if there is even the slightest up-and-down movement.
But researchers are working on ways to make virtual walking a more compelling, realistic, and less nausea-inducing experience. A team from the Toyohashi University of Technology and the University of Tokyo studied virtual walking and introduced two new variables – they added a shadow to the avatar, and they added a vibration sensation to the feet. An avatar is a virtual representation of the user in the virtual space. Most applications allow some level of user control over how the avatar is viewed, but it is typically either first person (you are looking through the avatar’s eyes) or third person (your perspective floats above and behind the avatar). In this study they used only the first-person perspective, which makes sense since they were trying to see how realistic an experience they could create.
The shadow was always placed in front of the avatar and moved with it. This may seem like a little thing, but it provides visual feedback connecting the desired movements of the user with the movements of the avatar. As weird as this sounds, this is often all it takes for the user to feel not only that they control the avatar but that they are embodied within it. (More on this below.) They also added four vibration pads to the bottoms of the feet, two on each foot – one on the toe-pad and one on the heel. These vibrated in coordination with the virtual avatar’s foot strikes. How did these two types of sensory feedback affect user perception?
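To make the timing concrete, here is a minimal sketch of how vibration pads might be driven in sync with a gait cycle. This is purely illustrative – the study does not publish its implementation, and the function names, phase convention, and pulse windows here are all my own assumptions:

```python
# Illustrative sketch (NOT the study's actual implementation):
# fire heel and toe pads in sync with an avatar's gait cycle.

def pads_for_phase(phase: float) -> dict:
    """Return which pads to vibrate at a given gait phase in [0, 1).

    Assumed convention: the left heel strikes at phase 0.0 and the
    left toe pushes off near 0.4; the right foot runs half a cycle
    out of phase, as in normal alternating gait.
    """
    phase %= 1.0

    def foot_pads(p: float) -> set:
        pads = set()
        if p < 0.05:            # brief pulse at heel strike
            pads.add("heel")
        if 0.35 <= p < 0.40:    # brief pulse at toe-off
            pads.add("toe")
        return pads

    return {
        "left": foot_pads(phase),
        "right": foot_pads((phase + 0.5) % 1.0),
    }

# At phase 0.0 the left heel pad fires while the right foot is mid-swing.
print(pads_for_phase(0.0))
```

A real system would advance the phase from the avatar’s animation each frame and send the pad sets to the haptic hardware; the point is simply that the vibration is slaved to the virtual foot strikes, not to anything the user’s real legs are doing.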
They found:
“Our findings indicate that the synchronized foot vibrations enhanced telepresence as well as self-motion, walking, and leg-action sensations, while also reducing instances of nausea and disorientation sickness. The avatar’s cast shadow was found to improve telepresence and leg-action sensation, but had no impact on self-motion and walking sensation. These results suggest that observation of the self-body cast shadow does not directly improve walking sensation, but is effective in enhancing telepresence and leg-action sensation, while foot vibrations are effective in improving telepresence and walking experience and reducing instances of cybersickness.”
So the shadow made people feel more like they were in the virtual world (telepresence) and that they were moving their legs, even when they weren’t. But the shadow did not seem to enhance the sensation of walking. Meanwhile the foot vibrations improved not only the sense of telepresence and leg movement but also the sense that the user was actually walking. Further (and this is of keen interest to me), the foot vibrations also reduced motion sickness and nausea. Keep in mind, the entire time the user is sitting in a chair.
I do not find the telepresence or sense of movement surprising. It is now well established that this is how the brain usually works to create the sensation that we occupy our bodies and own and control the parts of our bodies. These sensations do not flow automatically from the fact that we are our bodies and do control them. There are specific circuits in the brain that create these sensations, and if those circuits are disrupted people can have out-of-body sensations or even feel disconnected from parts of their body. These circuits depend on sensory feedback.
What is happening is that our brains are comparing various information streams in real time – the movements we intend to make, visual feedback on whether our body is moving the way we intend, and physical sensations such as proprioception (feeling where your body is in three-dimensional space) and tactile sensation. When everything lines up, we feel as if we occupy and control our bodies. When the streams don’t line up, weird stuff happens.
The same is true for motion sickness. Our brains compare several streams of information at once – visual information, proprioception, and vestibular information (sensing gravity and acceleration). When these sensory streams do not match up, we feel vertigo (a spinning sensation) or motion sickness. Sometimes people have just a vague sense of “dizziness” without overt spinning – they just feel off.
In VR there can be a complete mismatch between visual input and vestibular input. My eyes are telling me that I am running over a landscape, while my vestibular system is telling me I am not moving. The main way this is currently addressed is by not having virtual movement, hence the teleporting (which does not count as movement visually). Another potential way to deal with this is to have physical movement match the virtual movement, but this requires a large and expensive rig, which is currently not ready for consumer use. This is the Ready Player One scenario – a harness and an omnidirectional treadmill. This would probably be the best solution, and I suspect you would need only a little bit of movement to significantly reduce motion sickness, as long as it was properly synchronized.
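The reason teleporting sidesteps the mismatch can be shown with a toy comparison – smooth locomotion sweeps the camera through many intermediate positions (continuous optic flow with no matching vestibular signal), while teleporting produces a single discrete jump. The function names here are mine, not from any particular VR engine:

```python
# Toy comparison of smooth locomotion vs. teleportation.
# Smooth movement generates a stream of intermediate camera
# positions (optic flow the vestibular system contradicts);
# teleporting jumps straight to the destination in one frame.

def smooth_move(start, target, steps):
    """Interpolate camera positions frame by frame (causes optic flow)."""
    return [
        tuple(s + (t - s) * i / steps for s, t in zip(start, target))
        for i in range(1, steps + 1)
    ]

def teleport(start, target):
    """Jump directly to the target: one frame, no intermediate flow."""
    return [target]

frames_smooth = smooth_move((0.0, 0.0), (10.0, 0.0), steps=90)
frames_teleport = teleport((0.0, 0.0), (10.0, 0.0))

print(len(frames_smooth))    # many intermediate positions of visual motion
print(len(frames_teleport))  # a single jump – no perceived self-motion
```

Both paths end at the same place; the difference is entirely in how many frames of contradictory visual motion the brain has to reconcile along the way.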
There has also been speculation that perhaps motion sickness can be reduced by leveraging other sensory inputs, such as haptic feedback. There has also been research into using brain stimulation to reduce the effect. A 2023 study looked at “transcranial alternating current stimulation (tACS) at 10 Hz, biophysically modelled to reach the vestibular cortex bilaterally.” I look at this as a proof of concept, not a likely practical solution. But perhaps some lower tech stimulation might be effective.
I am a little surprised, although pleased, that in the current study a little haptic feedback of the feet lowered motion sickness. My hope is that as the virtual experience gets more multi-modal, with several sensory streams all synchronized, the motion sickness problem will be mostly resolved. In the current study, if the provided picture (see above) is any indication, the users were walking through virtual streets. This would not provide a lot of up and down movement, which is the killer. So perhaps haptic feedback might work for situations that would create mild motion sickness, but I doubt it would be enough for me to survive a virtual roller coaster.
All of this bodes well for a Ready Player One future – with mature VR including haptic feedback with some physical motion. I do wonder if the brain hacking (brain stimulation) component will be necessary or practical in the near future.
One last aside – the other solution to the motion sickness problem is AR – augmented reality. With AR you can see the physical world around you through the goggles, which overlay virtual information. This way you are moving through the physical world, which can be skinned to look very different or have virtual objects added. This does not work for every VR application, however, and is limited because you need the physical space to move around in. But applications and games built around what AR can do have the added benefit of no motion sickness.