I’ve had the great honor of being invited by Betty Mohler, who is currently working at the Max Planck Institute in Tuebingen, Germany, to come and visit their research projects that use VR.
Part of what they do there is to try and understand the biological basis for our perception, especially movement perception.
The Biological Cybernetics department is headed by Professor Heinrich Buelthoff, who is a strong believer in high-level VR tools as a way to focus on experiments and not lose time reprogramming basic components.
Read after the jump if you want to know more about your body and the crazy VR systems they have there!
The first thing that strikes you when you arrive is that many of the buildings are quite new, and that they have some really impressive systems, as you’ll see. Also, the working language is English, as the teams are pretty international (American, English, French, Italian, Bulgarian…). Does this even exist in France? I’d say if you want to work in France without speaking French it would be quite difficult, but please prove me wrong =)
The two latest (and, for me anyway, most impressive) systems at the MPI are the omnidirectional treadmill and the Kuka robot, but they are doing amazing research with the other systems too.
The Kuka robot
The Kuka robot is a huge robotic arm with a seat at the end which allows the researchers to do some impressive movement simulation (or rollercoaster simulations on the coffee breaks).
The movements are really smooth and precise. I was able to test it for 10 minutes with a helicopter simulation (designed by Jean-Pierre Bresciani, Michael Kerger, Harald Teufel and Paolo Robuffo Giordano), and the feeling is terrific.
Depending on the experiment they can put a screen in front of the user. I “simply” had to align the seat with one of four bullseyes, and I could only control the horizontal movement.
As a helicopter is quite unstable, you have to learn how to control it. My first attempts had me circling around the room pretty fast!
In real life I would probably have crashed 10 helicopters by the time I could begin to understand how to handle it.
But after my second session I could pretty much go where I wanted quite fast, and that was a really powerful sensation.
I felt like the first time that I came out of a CAVE, like a kid in a candy store =)
The black room
Then I also got some time with the huge omnidirectional treadmill, which allows you to move indefinitely in any direction. I’ll write another post specifically about that.
The room is actually even bigger, painted entirely black, with 12 Vicon cameras mounted at a height of 3 meters, which allows full-body tracking in the whole room! Thanks to the black walls they can achieve complete darkness, so the only visual information you get comes from your HMD.
(They also have a Moven inertial tracking suit)
This is the room in which David Engel runs his experiments on motion redirection, but it is used for other experiments as well.
For example, I was able to experience the famous pit demo, by Cengiz Terzibas, Betty and Stephan Streuber. Wearing an HMD and walking freely around the room, you have to cross a pit by walking over a wooden bridge. The wooden piece also exists in reality (think “passive haptics”), so when you step onto the bridge in VR, you really feel it under your feet.
This creates a very strong illusion of presence, which also makes you feel the height of the pit. The more senses you fool, the more immersion you get.
Betty Mohler is also running experiments on locomotion and gait parameters in VEs, and on how visual motion influences locomotion.
She also likes to have her guests wear a very tight costume with markers on it to do full body motion capture.
Then she maps those data to an avatar in realtime, and you can actually see your virtual self in the virtual environment.
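To give an idea of what “mapping the data to an avatar” means, here is a minimal, hypothetical sketch: each frame, captured joint angles are copied onto an avatar skeleton and forward kinematics gives the avatar’s pose. The names, the 2D arm chain, and the angle values are my own illustration, not the MPI’s actual pipeline.

```python
import math

def forward_kinematics(bone_lengths, joint_angles):
    """2D forward kinematics: accumulate angles along a bone chain
    and return each joint's world position."""
    x = y = 0.0
    angle = 0.0
    positions = [(x, y)]
    for length, theta in zip(bone_lengths, joint_angles):
        angle += theta
        x += length * math.cos(angle)
        y += length * math.sin(angle)
        positions.append((x, y))
    return positions

class Avatar:
    def __init__(self, bone_lengths):
        self.bone_lengths = bone_lengths
        self.joint_angles = [0.0] * len(bone_lengths)

    def apply_capture_frame(self, captured_angles):
        # Retargeting at its simplest: copy the tracked joint angles
        # onto the avatar every frame, then recompute its pose.
        self.joint_angles = list(captured_angles)
        return forward_kinematics(self.bone_lengths, self.joint_angles)

# One made-up captured frame: shoulder raised 90 degrees, elbow bent 90 degrees back.
arm = Avatar(bone_lengths=[0.3, 0.25])
joints = arm.apply_capture_frame([math.pi / 2, -math.pi / 2])
hand_x, hand_y = joints[-1]  # hand ends up at roughly (0.25, 0.3)
```

In a real system this loop runs at the tracker’s frame rate, which is why the avatar appears to move with you in realtime.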
One application will be to find out whether seeing yourself in a VE makes you better at evaluating distances.
For the moment, people have a real tendency to underestimate distances in VEs, and we still don’t know why, although some clues are being debated.
The motion platform
I could also test a Motion Platform like the ones they have in theme parks for rides, but for a single user.
John Butler, Frank Nieuwenhuizen and Cengiz Terzibas are conducting research on movement perception.
I tested an application where they were trying to understand how we integrate visual motion information with vestibular information.
The platform would move in a direction, and the screen would show a corresponding visual movement, or a movement in the opposite direction.
Then they would ask people what they thought the real movement was. The result in this case is that people trust their inner ear more than their eyes.
This surprised me, because the CRVM has another experiment showing that visual information is stronger than vestibular information when they conflict: they put you in their CAVE and rotate a virtual room so that the perceived vertical rotates too… until you fall down. Since you are physically still, the vestibular information remains the same, so in that case you believe your eyes.
John explained to me that in this case the horizon is a really strong visual cue that sort of overrides the vestibular information. So the question is still open =)
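A standard model for this kind of cue conflict is reliability-weighted averaging (maximum-likelihood cue combination): each sense’s estimate is weighted by the inverse of its variance, so the less noisy cue dominates. The sketch below is illustrative only; the numbers are invented and not from the MPI experiments.

```python
def combine_cues(estimates, variances):
    """Fuse independent noisy estimates of the same quantity,
    weighting each by the inverse of its variance."""
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    fused = sum(w * e for w, e in zip(weights, estimates)) / total
    fused_variance = 1.0 / total  # the fused estimate is less noisy than either cue
    return fused, fused_variance

# Suppose vision reports the platform moved +10 cm while the inner ear
# reports -2 cm. If the vestibular cue is more reliable (lower variance),
# the combined percept lands close to the vestibular estimate.
fused, var = combine_cues(estimates=[10.0, -2.0], variances=[9.0, 1.0])
```

Under this model, which sense “wins” is not fixed: it depends on how reliable each cue is in the given situation, which fits John’s point about the horizon being an unusually strong visual cue.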
The Panolab is used by Paolo Pretto and Jean-Pierre Bresciani to study perception of speed, for example when adding fog. It seems that when there’s fog, you think you’re going faster than you actually are.
It’s funny, because there I met Joachim Tesch, their new technical guy, who used to work at iCinema with Matthew McGinity, who visited us 2 months ago and used to work in France with Remy Deslignes, who is now working with us at Virtools. Small world, eh?
Also, Joachim and Stephan Streuber are competing in the “VR Experience” contest.
So this was a really incredible visit, and I learned a lot of fundamental things about how our body perceives its environment. This is one aspect I really love about VR: use VR to learn about your body, then use this knowledge to improve your VR experience!
So again, I want to thank Betty for her invitation, and the whole team for taking some of their precious time to explain their research to me!
I tried to keep this article short but understandable, so I hope I didn’t distort their teachings, and I hope I didn’t forget to cite too many people =)
As they really like to share, they also have a YouTube channel if you want to follow their next experiments.