Redirected Walking – Playing with your perception’s limits

“Redirected walking allows users to walk through large-scale immersive Virtual Environments (IVEs) while physically remaining in a reasonably small workspace.”

I have already talked a bit about this technique, which I discovered back at 3DUI 2008 but which was introduced by UNC Chapel Hill. The principle is quite simple: suppose you have a potentially infinite virtual world, and you want to physically walk through this world (with an HMD for example). If you directly apply your real movements to your virtual self, you’ll run into the walls of your small room quite soon.

What’s nice about VR is that it allows you to fool your senses. You’re already fooling your visual sense with incredible graphics (tell me you’re not using a 3DFX anymore!), so why not cheat your sense of movement, which relies a lot on visual cues?

So instead of having a 1:1 mapping from real to virtual, we could modify the translation and rotation gains applied to the avatar. For example, a 90° rotation of the user could result in a rotation of 100° or 80° in the VE. The same goes for translation: 1 m in reality could result in 1.2 m or 0.8 m in the VE.
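To make this concrete, here is a minimal Python sketch of such a remapping. It is my own illustration, not code from either paper, and the gain values are arbitrary examples.

```python
# Minimal sketch of translation and rotation gains (illustrative values only).

def apply_gains(real_step_m, real_turn_deg,
                translation_gain=1.2, rotation_gain=0.9):
    """Remap the user's tracked motion onto the virtual viewpoint.

    translation_gain > 1 lets 1 m of real walking cover more virtual ground;
    rotation_gain < 1 makes the user turn further in reality than in the VE.
    """
    virtual_step_m = real_step_m * translation_gain
    virtual_turn_deg = real_turn_deg * rotation_gain
    return virtual_step_m, virtual_turn_deg

# The user physically walks 1 m and turns 90°...
print(apply_gains(1.0, 90.0))  # ...but covers 1.2 m and turns 81° in the VE.
```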

rdw-4.jpg

What’s more, the opposite is also possible: suppose you walk in a straight line in the VE; we could have you walk along a curve in reality!
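Here is a rough sketch of that idea, again my own illustration rather than code from the papers: while you walk what you perceive as a straight line in the VE, the system keeps rotating the scene slightly, and you unconsciously compensate by bending your real path into an arc. The radius is just an example value.

```python
import math

# Sketch of a curvature gain: inject a small scene rotation for every metre
# walked, so the compensating user ends up on a circular arc in the real room.

REAL_ARC_RADIUS_M = 20.0                       # example radius of the real arc
CURVATURE_RAD_PER_M = 1.0 / REAL_ARC_RADIUS_M  # injected rotation per metre

def injected_scene_yaw(distance_walked_m):
    """Total extra rotation (radians) added to the scene after walking
    distance_walked_m along a virtually straight line."""
    return CURVATURE_RAD_PER_M * distance_walked_m

# After 10 m of "straight" virtual walking the scene has been rotated by about
# 28.6°, which the user cancels out by curving onto a 20 m-radius circle.
print(math.degrees(injected_scene_yaw(10.0)))
```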

At VRST 2008, Frank Steinicke from the University of Muenster and David Engel from the Max Planck Institute in Tuebingen each presented a paper on this topic: “Analyses of Human Sensitivity to Redirected Walking” and “A Psychophysically Calibrated Controller for Navigating through Large Environments in a Limited Free-Walking Space”.

rdw-5b.jpg

Dr Steinicke’s presentation focused on evaluating the limits of redirected walking, whereas Mr Engel’s work is about how to dynamically change the translation and rotation gains to fit the user’s path in a known environment.

I asked them both a few questions about their work:

> What are your background, interests and goals with this work?

Frank Steinicke: My research interests include human-computer interaction with a special focus on VR, and perception and cognition in computer-generated environments and visualizations. The goal of this project is to provide intuitive interfaces for exploring VEs. As a matter of fact, the most natural way of locomotion in the real world is walking. So, from a computer graphics perspective, I am very interested in the question of how real walking through CG environments can be realized when only a limited laboratory environment is available. I’m fascinated by the psychophysical phenomena which allow us to trick users, for example, in such a way that they unknowingly move on a path in the real world that differs from the path they perceive in the VE. Of course, we want to know how much we can trick users without them noticing discrepancies.

David Engel: I studied computer science in Tübingen, and one of my main interests has always been virtual environments. I first came into contact with redirected walking during a presentation at APGV 2007. I found the idea of being able to explore infinite virtual worlds, like in the Holodeck from Star Trek, very compelling.

> Where does this idea come from?

Frank: Redirected walking (RDW) techniques have been used for a few years now. The main idea of redirected walking was introduced by Sharif Razzaque et al. from UNC Chapel Hill. RDW is based on the fact that the visual sense dominates the proprioceptive and vestibular senses. This has been shown much earlier, for example, by Alain Berthoz (who gave the keynote talk at VRST).

David: The idea to determine the redirection factors dynamically emerged from the problems I encountered during my early implementations of redirected walking. As soon as the paths through the virtual environment got longer, the errors in the predicted positions accumulated to a point where users kept running into the walls.

rdw-1.jpg

rdw-2.jpg

rdw-3.jpg

> What are, for you, the most interesting conclusions of your paper?

Frank: We have identified detection thresholds up to which humans can be redirected in such a way that they do not perceive any discrepancies. For example, we know that we can guide them on a circle with a radius of approximately 23 meters while they believe they are walking straight in the VE. This guidance can be realized by injecting small rotations to one side, which cause users to unknowingly walk on a circular arc in the opposite direction.

Until now, we have only considered basic walking techniques such as forward movements and rotations. I think these concepts can be adapted to any kind of motion, such as walking on slopes and strafing motions. Furthermore, we have not addressed adaptation, which involves the question of how far humans might adapt to redirected walking. Such adaptation has been considered before, for example with left–right reversed vision, but not in the context of VR-based environments.
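To put Frank’s 23-meter figure in perspective, here is a quick back-of-the-envelope calculation (my own arithmetic; the walking speed is an assumed typical value): at roughly 1.4 m/s, staying on such a circle only requires injecting a few degrees of rotation per second.

```python
import math

radius_m = 23.0            # radius of the real-world arc quoted above
walking_speed_mps = 1.4    # assumed typical walking speed

rotation_per_metre = math.degrees(1.0 / radius_m)             # ≈ 2.5° per metre walked
rotation_per_second = rotation_per_metre * walking_speed_mps  # ≈ 3.5° per second
print(rotation_per_metre, rotation_per_second)
```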

David: With a dynamic optimization approach based on the minimization of a cost function, we gain a large amount of robustness against deviations from the predefined path and can introduce much flexibility into the redirected walking approach. By adjusting the terms of the cost function, we can adapt to new boundary conditions such as the current user’s sensitivity profile to redirection factors, route choices and multiple users.

On the psychophysical side, our next step will be to evaluate user performance in navigation tasks when redirection techniques are applied. On the technical side, we plan to support a more natural exploration of the virtual environment by allowing more complex and branching paths.
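To illustrate the kind of cost-function-driven gain selection David describes, here is a minimal sketch. It is my own simplified reading of the idea, not the paper’s controller; the weights, the candidate gains and the predict_clearance() callback are all hypothetical.

```python
import itertools

WALL_WEIGHT = 1.0     # penalize predicted proximity to the workspace boundary
NOTICE_WEIGHT = 0.5   # penalize gains that drift far from a 1:1 mapping

def pick_gains(predict_clearance):
    """Choose (translation_gain, rotation_gain) for the next path segment.

    predict_clearance(translation_gain, rotation_gain) must return the
    predicted distance in metres to the nearest wall if those gains were
    applied along the planned path.
    """
    candidates = itertools.product([0.8, 1.0, 1.2],   # candidate translation gains
                                   [0.8, 1.0, 1.2])   # candidate rotation gains

    def cost(gains):
        t_gain, r_gain = gains
        wall_term = WALL_WEIGHT / max(predict_clearance(t_gain, r_gain), 0.1)
        notice_term = NOTICE_WEIGHT * (abs(t_gain - 1.0) + abs(r_gain - 1.0))
        return wall_term + notice_term

    return min(candidates, key=cost)
```

A real controller would also optimize curvature and use the known environment to predict the path, but the structure is the same idea: pick, at each step, the least noticeable redirection that keeps the user clear of the walls.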

> To what extent do you think we will be able to fool the body by taking advantage of its perceptual limitations (including senses other than proprioception)?

Frank: The same concepts can be applied, for instance, to haptic feedback. Anatole Lecuyer from INRIA has shown that haptic feedback can be induced by visual stimuli, and Luv Kohli has examined how much the haptic feedback may deviate from the visual stimuli. Another topic might be the question of how far time can be compressed or stretched in the virtual world, and how the perception of time can be changed.

David: The sensitivity of the perceptual system to sensory conflicts seems to depend largely on the amount of congruency between the modalities (e.g. whether auditory cues are collocated with the real or the virtual world) and the attention given to the different modalities. The extent to which perception can be fooled might therefore be very task-dependent. In the short term, the use of congruent multi-modal input and distractor tasks should allow for a much wider range in which the user can be fooled. In the long term, redirected walking techniques could become more sophisticated by taking into account the user’s sensitivity to cue conflicts at different points during the gait cycle. Combining such ideas, we should be able to overcome the spatial limitations of the available facilities.

Thank you both for your time and interesting work!
