• Pages

  • Recent Posts

  • Recent Comments

  • Archives

  • Mon 29 Mar 2010

    IEEE VR 2010 – Perceptive illusions

    Published at 17:24   Category Virtual Reality  

    This year the major conference of our field has been held in Waltham, USA, in the Boston area.

I could only be there for three days (two days of 3DUI and one day of IEEE VR), and as I was chairing the 3DUI contest I couldn’t attend as many talks as I would have liked to.

    So I’ll first talk about an interesting topic that was widely discussed there and later try to talk about the rest.

    Perceptive Illusions

I don’t know if it’s a real trend or just because I’m getting more and more interested in the subject, but I believe there are more and more papers about perceptive illusions: knowing the limits of our perception and taking advantage of them to overcome the current limitations of VR systems.

    Check out the PIVE workshop website and its proceedings.

These illusions mostly use our visual system to distract the other senses. I have already talked about Redirected Walking, which doesn’t exactly match the user’s real position and orientation to the virtual ones, so that they can walk virtually in a greater area than the real one.

Now two papers use change blindness, one by Evan A. Suma (“Exploiting Change Blindness to Expand Walkable Space in a Virtual Environment”) and one by Frank Steinicke (“Change Blindness Phenomena for Stereoscopic Projection Systems”) :

    Steinicke : (…) modifications to certain objects can literally go unnoticed when the visual attention is not focused on them. Change blindness denotes the inability of the human eye to detect modifications of the scene that are rather obvious–once they have been identified. These scene changes can be of various types and magnitudes, for example, prominent objects could appear and disappear, change color, or shift position by a few degrees. (…) Such change blindness effects have great potential for virtual reality (VR) environments, since they allow abrupt changes of the visual scene which are unnoticeable for users. Current research on human perception in virtual environments (VEs) focuses on identifying just-noticeable differences and detection thresholds that allow the gradual introduction of imperceptible changes to a visual scene. Both of these approaches–abrupt changes and gradual changes–exploit limitations of the visual system in order to introduce significant changes to a virtual scene. (…) These change blindness studies have led to the conclusion that the internal representation of the visual field is much more sparse than the subjective experience of “seeing” suggests and essentially contains only information about objects that are of interest to the observer.

    Suma : Since usually the architecture of an environment does not suddenly change in the real world, these assumptions may carry over into the virtual world. Thus, subtle changes to the scene that occur outside of the user’s field of view may go unnoticed, and can be exploited to redirect the user’s walking path. Figure 1 shows an example modification where the doorway to exit a room is rotated, causing users to walk down the virtual hallway in a different direction than when they first entered the room. These “doorway switches” can be used to allow the user to explore an environment much larger than the physical workspace.
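A minimal sketch of the idea behind these doorway switches (all names and the field-of-view value below are my own assumptions, not from the paper): the system only applies a change while the doorway lies outside the user’s field of view, so the change-blindness effect hides it.

```python
import math

def can_switch(user_pos, view_dir, doorway_pos, fov_deg=110.0):
    """Return True when a doorway switch would go unnoticed, i.e. when
    the doorway is outside the user's horizontal field of view.

    user_pos and doorway_pos are (x, y) floor positions; view_dir is the
    user's (x, y) gaze direction.
    """
    dx, dy = doorway_pos[0] - user_pos[0], doorway_pos[1] - user_pos[1]
    to_door = math.atan2(dy, dx)
    heading = math.atan2(view_dir[1], view_dir[0])
    # smallest absolute angle between the gaze and the doorway direction
    diff = abs((to_door - heading + math.pi) % (2 * math.pi) - math.pi)
    return diff > math.radians(fov_deg) / 2.0

# Looking straight at the doorway: no switch; doorway behind you: safe.
facing = can_switch((0.0, 0.0), (1.0, 0.0), (10.0, 0.0))
behind = can_switch((0.0, 0.0), (1.0, 0.0), (-10.0, 0.0))
```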


    Another paper, by Tabitha Peck (“Improved Redirection with Distractors: A Large-Scale-Real-Walking Locomotion Interface and its Effect on Navigation in Virtual Environments”), lets you walk naturally :

    We designed and built Improved Redirection with Distractors (IRD), a locomotion interface that enables users to really walk in larger-than-tracked-space VEs. Our locomotion interface imperceptibly rotates the VE around the user, as in Redirected Walking [15], while eliminating the need for predefined waypoints by using distractors (…)

    The scene is always slowly rotating, so the user is always more or less walking towards the center of the real room. If the user gets too close to a real wall, the system creates a distracting object, like a butterfly, for the user to look at. While the user follows the distractor with their head, the scene is also rotated, so that when they look back at the original position, they are facing away from the wall.

    We rotate the scene based on head turn rate because user perception of rotation is most inaccurate during head turns.
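As a rough illustration of that principle (with toy parameter values of my own, not Peck’s), the rotation injected into the scene each frame can be scaled with the head’s yaw rate and clamped to some maximum:

```python
import math

def redirect_rotation(head_yaw_rate, dt,
                      baseline_rate=math.radians(1.0),  # slow constant rotation
                      head_gain=0.1,                     # extra rotation per rad/s of head turn
                      max_rate=math.radians(15.0)):
    """Return the scene rotation (radians) to inject this frame.

    Rotation is injected fastest while the head is turning, because
    that is when users perceive rotation least accurately.
    """
    rate = baseline_rate + head_gain * abs(head_yaw_rate)
    rate = min(rate, max_rate)
    return rate * dt

# Example: one 16 ms frame while the user turns their head at 90 deg/s
injected = redirect_rotation(math.radians(90.0), 0.016)
```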

    This paper by Maud Marchal (“Walking Up and Down in Immersive Virtual Worlds: Novel Interactive Techniques Based on Visual Feedback”) also modifies the user’s view while walking, to give them the sensation of going up or down :

    We introduce novel interactive techniques to simulate the sensation of walking up and down in immersive virtual worlds based on visual feedback. Our method consists in modifying the motion of the virtual subjective camera while the user is really walking in an immersive virtual environment. The modification of the virtual viewpoint is a function of the variations in the height of the virtual ground.
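In spirit (this is my own simplification, not the authors’ actual camera model), the subjective camera height can simply track the virtual terrain under the user, so walking on a flat real floor feels like going uphill:

```python
def terrain_height(walk_distance):
    """Hypothetical virtual terrain: a gentle ramp up to a plateau."""
    return min(0.1 * walk_distance, 1.0)

def virtual_camera_y(real_head_y, walk_distance, gain=1.0):
    """Offset the subjective camera by the (scaled) height of the virtual
    ground under the user's current walking position."""
    return real_head_y + gain * terrain_height(walk_distance)

# At the start the camera matches the real head; five meters along the
# ramp, the viewpoint has risen even though the real floor is flat.
start = virtual_camera_y(1.7, 0.0)
uphill = virtual_camera_y(1.7, 5.0)
```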

    This is very similar to the pseudo-haptics applications by Anatole Lecuyer. Maybe it’s because they work in the same team at Inria ;)

    Speaking of pseudo-haptics, the best poster went to Luv Kohli for “Redirected Touching: Warping Space to Remap Passive Haptics” :

    This poster explores the possibility of mapping many differently shaped virtual objects onto one physical object by warping virtual space and exploiting the dominance of the visual system. (…) This technique causes the user’s avatar hand to move in virtual directions different from its real-world motion, such that the real and avatar hands reach the real and virtual objects simultaneously.


    This means that with one real object, you can simulate the touch of multiple virtual objects. Your visual system will override your sense of touch.
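A naive sketch of such space warping (my own, not Kohli’s actual method): blend an offset into the avatar hand in proportion to how far along the reach the real hand is, so that the real and avatar hands arrive at their objects simultaneously.

```python
import math

def dist(a, b):
    """Euclidean distance between two (x, y, z) points."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def warp_hand(real_hand, real_obj, virtual_obj, start):
    """Return the avatar hand position for a given real hand position.

    `start` is where the reach began. At the start the avatar hand matches
    the real hand; when the real hand touches the real object, the avatar
    hand touches the virtual object.
    """
    total = dist(start, real_obj)
    progress = 1.0 - dist(real_hand, real_obj) / total if total else 1.0
    progress = max(0.0, min(1.0, progress))
    offset = tuple(v - r for v, r in zip(virtual_obj, real_obj))
    return tuple(h + progress * o for h, o in zip(real_hand, offset))

# The real object is at (1,0,0) but its virtual counterpart at (1,1,0):
# halfway through the reach, the avatar hand has drifted halfway sideways.
midway = warp_hand((0.5, 0.0, 0.0), (1.0, 0.0, 0.0), (1.0, 1.0, 0.0), (0.0, 0.0, 0.0))
```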

    You can do the same with olfaction, as described in a paper by Aiko Nambu (“Visual-Olfactory Display Using Olfactory Sensory Map”). Creating a smell requires a lot of different basic smells. By displaying a corresponding object, you can trick your nose: the scent of a lemon can pass for an orange if you display an orange, and the scent of peach can actually be used for strawberries.

    It’s really incredible what you learn thanks to VR. We think our perception is perfect and flawless, but in fact we perceive our environment quite poorly; it’s just good enough to survive.

    Thu 25 Mar 2010

    3DUI Grand Prize 2010 – The winners !

    Published at 2:53   Category Virtual Reality  

    The first 3DUI Grand Prize is finally over !

    After a whole year of preparation, 23 teams from all over the world joined the contest. Only 12 of them sent us a video of their solution to participate in the video contest. You can see them all on the Youtube channel.

    Of these 12 teams, 4 managed to make it to the conference, bringing all their equipment to participate in the live contest !

    A quick reminder of what this contest is about : we wanted to allow anyone to express their creativity and show new ideas for 3D interaction. We gave the teams a 3D model and a task : in a supermarket, you start in front of a table where three objects are sitting. You have to find the same objects in the supermarket, bring them back to the table and position them in the same orientation as the original objects.

    This task combines the three basic 3D interaction tasks, namely travel, selection and manipulation.

    The teams could use whatever software and hardware they wanted.

    The video contest winner is the Three Girls & Three Guys team from the University of Hasselt in Belgium (Lode Vanacken, Sofie Notelaers, Johanna Octavia, Anastasiia Beznosyk, Tom De Weyer, Steven Maesen) :

    YouTube Preview Image

    This fun video also demonstrates a solid mastery of 3D interaction. The navigation works really well and is innovative, the selection is very efficient, and the manipulation is simple but effective.

    They win a Novint Falcon, a Space Navigator, and a license of 3DVIA Virtools + VR Library.

    The live contest winner is the Fighting Gobblers team from Virginia Tech, USA (Felipe Bacim, Regis Kopper, Anamary Leal, Tao Ni, Doug Bowman) :

    YouTube Preview Image

    The navigation is innovative and fun but may require some learning to be really fast, the selection is very fast, and the manipulation is a bit awkward.

    They win two Novint Falcons, one Space Pilot, and one license of 3DVIA Virtools + VR Library.


    - Felipe Bacim showing the Fighting Gobblers solution to Yoshifumi Kitamura  -

    The second place of the video contest goes to the VRM team from Ukraine (Maxim Lysak, Viktor Kuropyatnik) :

    YouTube Preview Image

    VRM is the only private company that participated in the contest and they showed that by simply using a gamepad you can achieve the task quite efficiently.

    They win one license of 3DVIA Virtools + VR Library.

    And last but definitely not least, the second place of the live contest goes to the im.ve ChairIO team from University of Hamburg, Germany (Steffi Beckhaus, Kristopher J. Blom, Matthias Haringer):

    YouTube Preview Image

    Their self-made chair is a very interesting travel device, and they managed to create a whole VR system for the conference!

    They win one Novint Falcon, one Space Pilot, and one license of 3DVIA Virtools + VR Library.


    - Steffi Beckhaus laughing at me on the ChairIO ! -

    The jury was composed of : Pablo Figueroa, Yoshifumi Kitamura, Chad Wingrave, Ernst Kruijff, Anatole Lecuyer and me.

    We had a hard time deciding between the Fighting Gobblers and the im.ve team. In the end, it’s the public vote that made the difference.


    - Andy Wu (behind the table) and Derek Reilly of the team GVU Twinspace -


    - Dat Nguyen and Timofey Grechkin of the HANK Lab showing their solution to Pablo Figueroa -

    The contest attracted lots of people, and we’re very happy that at a 3D interaction conference you could actually get your hands on 3D interactions ! We also saw very interesting and very diverse solutions.

    So we think this first edition, imperfect as it was, was in the end a big success !

    We would really like to thank all the teams that participated in this contest. They put in a lot of effort, even more so the four teams that came to the conference. Everyone who showed up won a license of 3DVIA Virtools + VR Library.

    And thanks to the sponsors (Novint, Immersion SA, 3DVIA, AFRV, SIG 3DUI) for the great gifts !

    You can find more pictures here and here.

    YouTube Preview Image YouTube Preview Image YouTube Preview Image YouTube Preview Image
    Sat 13 Mar 2010

    Heavy Rain and Plausibility Illusion

    Published at 13:04   Category Game, VR Applications  

    Heavy Rain is not your usual game. It’s dark, it’s emotional, and to me it feels very real.

    YouTube Preview Image

    Heavy Rain is the story of a father whose kid has been kidnapped and his endeavour to get him back.


    The game uses several tricks to create immersion.

    The first and simplest one is that you are almost always playing. There are numerous cut-scenes that look like video, but suddenly you have to perform an action. If you don’t have the gamepad in hand at that moment, you’ll fail the action. This forces you to always be alert and ready.

    Then if you want to perform an action (you have the choice not to), you’ll have to make precise movements with your joystick, and at the correct speed; for example, if you want to reach out to an object on your right, you simply push the joystick to the right. If you want to open a door, you’ll have to trace a half circle, mimicking the rotation of the door. Remember, gesture gives you more immersion. I can’t help but think how much more natural this would be in VR !!

    Then the game happens in realtime which means you sometimes have to think and act fast : do I have to shoot this guy before he kills my partner (that I don’t like) ? But he might be useful to my investigation ! But I need my partner even more ? Will I be able to reason with him ?  Damn he might pull the trigger any second now !! *BAM* … Damn I shot him .. did I make the correct choice? Could I save him ? As in real life, you’ll never know.

    That’s one beauty of the game: each of your actions has consequences on the story. David Cage, creator of the game and head of the French game studio Quantic Dream, has written more than 2000 pages for this game, which has 23 different endings. If you don’t have your kid do his homework, he will be angry at you the next day because his teacher didn’t like that. If you don’t kill the guy you’ve been ordered to kill, you won’t have clues to find your son. What if you die ? The game goes on with the three other characters.

    It is also very realistic because you have to use your brain realistically. No puzzles or crazy wayfinding. You’re in a rush and have to phone a room in a motel. Damn, can you remember the room number that you’ve seen several times ? Or you’re at a crime scene (but you don’t know it yet) before the cops arrive, and you’ve touched several objects. Will you remember which ones, so you can clean them all and erase your traces ? As in real life, you’re left on your own with your aging memory. The same goes for human interactions: will you show empathy ? Will you be cold ? Use your heart intelligence.

    If you remember Mel Slater’s latest paper about presence, it talks about a concept called Plausibility Illusion :

    Plausibility Illusion is the illusion that what is apparently happening is really happening. This results from a sense that your actions have effects on the VE, that other events of the VE affect your sensations, and that these events are credible.

    This is exactly what is happening in Heavy Rain; through all the points mentioned above, they have managed to achieve Plausibility Illusion, more or less cognitive immersion.

    Imagine if on top of that you had Place Illusion, more or less perceptive immersion. This would be Presence with a capital ‘P’, the grail of immersive VR (iVR).

    In a previous article, I also talked about how the game Mirror’s Edge feels quite real to me : when I play it, I can feel the wind on my face and the void beneath my feet as I jump from one building to another.

    Mirror’s Edge is the first game to hack your proprioception. (…) When you feel like you’re truly inside your character, speed suddenly means something. The opposite is also true. Without a sense of physicality, speed feels lifeless.

    So who will be creating Mirror’s Rain ? Or Heavy Edge ? Mixing this proprioception hacking with plausibility illusion.

    A first person VR game like that would feel very intense.


    How could VR Geeks not like this game ? At some points in Heavy Rain you use a VR desktop :

    YouTube Preview Image

    I like the idea of extending the real desk.

    Ok, there are some negative sides to the game. Although everything looks very real, you’re often right in the uncanny valley; the faces of the characters are great, but their movements are a bit stiff. And I’m not at ease with the phases that require quick actions on the gamepad buttons.

    But apart from that, it’s really a milestone for games, and for me a major step towards VR games.

    Sat 13 Mar 2010

    Cave for sale

    Published at 10:46   Category Virtual Reality  

    It seems that because of the crisis, some VR geeks need money. Thus they’re selling their Cave !!! I didn’t imagine this was possible, but hey, why not. I have some space in my apartment; maybe if we get together we can buy it !

    Immersive 4 sided Cube for sale

    Following the crisis in the Emirates, one of our clients must quickly sell its 4-sided immersive cube, installed in November 2008 :

    • Active projectors Christie Digital Mirage S+3K (SXGA+)
    • 4 faces : 3m wide x 2.3m high
    • 5.1 sound
    • ART Tracking

    This CAVE, installed in November 2008, is worth 565’000€. It is sold at 350’000€, not including shipping costs. Filters and lamps are new.

    If you’re interested let me know, I’ll pass it on, but only if you let me play with it!

    (The following picture *is not* for sale, it’s the one from the CRVM in Marseille)

    Thu 11 Mar 2010

    Playstation Move – 3D Tracker

    Published at 17:29   Category Augmented Reality, Game development, VR Devices  

    At the end of 2006, Sony was already talking about it. Now they’ve finally demonstrated their upcoming 3D tracker, named Playstation Move, at GDC :

    YouTube Preview Image

    It has gyroscopes, accelerometers and a magnetometer for high-rate updates, and the Playstation Eye webcam watches the glowing sphere to recalibrate the position at each frame. It should also come in two distinct elements, à la Wiimote & Nunchuk.

    Develop-Online has more specs :

    “The latency for the Playstation Move is under one frame” – Scott Rohde, vice president of product development, SCEA.

    PlayStation®Move motion controller
    Three-axis gyroscope
    Three-axis accelerometer
    Terrestrial magnetic field sensor
    Colour-changing sphere for Playstation Eye tracking
    Bluetooth® technology
    Vibration feedback

    PlayStation®Move sub-controller
    Built-in lithium-ion rechargeable battery
    Bluetooth® technology
    2 DUALSHOCK® or SIXAXIS® Wireless Controller replacement capability.

    PlayStation® Eye
    Built-in four-capsule microphone array
    Echo cancellation
    Background noise suppression

    “Under $100” (£47)

    I’ve been able to test a pre-release version and must admit I was really impressed with the responsiveness and precision of the device. As you can see, it can also be used for nice AR applications :

    YouTube Preview Image YouTube Preview Image

    Yes, it’s much better than a Wiimote, since it’s a real, absolute 6DOF tracker (as long as the camera sees the sphere). Granted, with the WiiMotion Plus the Wiimote is getting better.

    Contrary to the Wiimote, which has an embedded camera that sees infrared dots, the camera is now on your TV, looking at the device.

    They both have the same occlusion problem: if the camera doesn’t see the marker, the inertial data will rapidly drift and become useless. This can happen if someone stands in front of you or if the device is behind you.
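The usual remedy, sketched here with made-up names and a toy blend factor (not Sony’s actual algorithm), is to dead-reckon with the inertial sensors between frames and pull the estimate toward the camera’s absolute fix whenever the sphere is visible, so drift can never accumulate for long:

```python
def fuse(predicted, camera_fix, blend=0.2):
    """One tracking update step.

    `predicted` is the (x, y, z) position dead-reckoned from the inertial
    sensors; `camera_fix` is the absolute position from the webcam, or
    None when the sphere is occluded. While occluded, drift grows; once
    the sphere reappears, the estimate is pulled back toward truth.
    """
    if camera_fix is None:
        return predicted
    return tuple(p + blend * (c - p) for p, c in zip(predicted, camera_fix))

# Occluded: the (possibly drifted) inertial estimate is all we have.
occluded = fuse((1.0, 2.0, 3.0), None)
# Visible with full trust in the camera: drift is cancelled outright.
snapped = fuse((1.0, 0.0, 0.0), (0.0, 0.0, 0.0), blend=1.0)
```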

    It’s a good start for VR games, especially if you can stick one to your head !

    And as the PS3 is supposed to support stereoscopic displays soon, this will get exciting :)

    ( Especially if, as the rumour says, Killzone 3 uses the Move and is S3D compliant !!!)

    Mon 8 Mar 2010

    Augmented Reality pool game

    Published at 13:14   Category Augmented Reality, Game  

    In his blog, Xavier Gouchet talks about a really nice application example that will be shown at Laval Virtual in April : an AR pool game !

    Queen’s University
    Authors: Samuel Jordan, Michael Greenspan
    Abstract: ARPool is an augmented reality system designed to assist shot planning and execution in a game of billiards. Using a projector-camera combination, ARPool is capable of detecting the ID and location of each ball on the table, as well as tracking the position and orientation of the pool cue in real-time. This information is fed through a custom pool physics simulator to obtain a complete table-state timeline of the shot. The shot data is dynamically rendered in real-time directly on the surface of the table using 2D graphics.

    YouTube Preview Image

    A useful, practical, usable (sellable?) tool if you ask me !

    Fri 5 Mar 2010

    Looking for a bed in NY, 15th to 19th March

    Published at 13:39   Category Perso  


    I’ll be in NY from 15th to 19th March before going to 3DUI near Boston, and I’d love to meet some VR geeks there !

    Also if you have a bed for me, even for one or two nights, I’d happily exchange that with extensive VR discussions ;)

    What interesting VR centers are there in the big apple ?