  • Wed 10 Dec 2008

    [Call For Demo] Laval Virtual ReVolution 2009

    Published at 12:52   Category Virtual Reality  

    From Akihiko SHIRAI’s blog:

    Laval Virtual ReVolution 2009
    -World Performance of VR Applications-

    Laval Virtual ReVolution is an annual honor awarded by Laval Virtual to the
    world’s finest VR projects. It is a hall-of-fame award that selects the best
    Virtual Reality demonstrations and applications from all over the world.
    Virtual Reality is not only a technology but also a never-ending story between
    computers and human history. We cannot settle this question immediately; we
    still need to find and walk one of the many possible paths that will lead us
    to a future navigated by brilliant stars, and we should keep gazing at those
    stars for the voyagers who follow.
    Of course, a great number of academic papers and commercial products can pave
    the way for Virtual Reality. However, we propose a new relationship between
    development projects and the general public through on-site demonstrations.
    If a performed project has impact, technology and persuasiveness, it will move
    the general public and change the common sense. That is a revolution in the
    history of Virtual Reality.
    Please join today’s stardom with your exciting project,
    and share the activity with the whole world!

    Session Organizer: Akihiko SHIRAI, Ph.D ( ENSAM P&I Lab )

    We hope to receive your brilliant projects, which can overcome the current
    common sense of Virtual Reality and change today’s human-computer interfaces
    and Virtual Reality history.

    -Technology Demonstration
    -Interactive Arts
    -Entertainment VR
    -New Media Designs
    -New Game Systems
    -New Human Interfaces and Displays
    -Realtime Images
    ……and any other “non-genre” VR projects

    Submission deadline: 1st Feb 2009
    -Description document (pdf/word)
    -Floor installation plan (pdf/ppt/jpg/gif)
    -Three Images for web
    -Video

    See also:
    Submission details
    http://www.laval-virtual.org/revolution/index.php?option=com_content&task=view&id=4&Itemid=12
    How to get accepted on ReVolution:
    http://www.laval-virtual.org/revolution/index.php?option=com_content&task=view&id=3&Itemid=11

    ReVolution accepts projects in two classes. If your project is accepted as
    “Invited”, Laval Virtual provides demonstration space, accommodation,
    flights and VIP tickets for the Gala dinner.

    Winners 2008
    http://www.laval-virtual.org/revolution/index.php?option=com_content&task=view&id=6&Itemid=22

    Laval Virtual holds an awards session to find the best VR projects of the
    year, and many ReVolution projects may win at the ceremony. In a past
    ReVolution, the project “Phantasm” (Mr. Takahiro Matsuo) was accepted in the
    “Welcome” class, then won the “ACM SIGGRAPH Award” at Laval Virtual 2008
    along with the chance to be shown at SIGGRAPH 2008 in Los Angeles. If yours
    is a student project, we also recommend registering for the student
    competition “Le Village de la Creation”. “The Dreaming Pillow” (ATI – Paris 8
    Univ) won at both ReVolution and the Competition at Laval Virtual 2008, and
    afterwards got the chance to show in the USA and Japan.
    http://www.laval-virtual.org/index.php?option=com_content&task=view&id=35&Itemid=49

    Laval Virtual Award 2008
    http://www.laval-virtual.org/index.php?option=com_content&task=view&id=127&Itemid=219&lang=en

    We are looking forward to seeing your great projects!
    Contact:
    http://www.laval-virtual.org/revolution/index.php?option=com_contact&Itemid=21

    Tue 2 Dec 2008

    Redirected Walking – Playing with your perception’s limits

    Published at 10:38   Category VR Applications  

    “Redirected walking allows users to walk through large-scale immersive Virtual Environments (IVEs) while physically remaining in a reasonable small workspace.”

    I have already talked a bit about this technique, which I discovered at 3DUI 2008 but which was introduced by UNC Chapel Hill. The principle is quite simple: suppose you have a potentially infinite virtual world, and you want to physically walk through this world (with an HMD, for example). If you directly apply your real movements to your virtual self, you’ll soon run into the walls of your small room.

    What’s nice about VR is that it allows you to fool your senses. You’re already fooling your visual sense with incredible graphics (tell me you’re not using a 3DFX anymore!), so why not cheat your sense of movement, which relies a lot on visual cues?

    So instead of having a 1:1 mapping from real to virtual, we could modify the translation and rotation speeds applied to the avatar. For example, a 90° rotation by the user could result in a 100° or 80° rotation in the VE. The same goes for translation: 1 m in reality could become 1.2 m or 0.8 m in the VE.
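    As a rough sketch, the per-frame mapping could look like the following (the gain values here are illustrative, not measured thresholds):

```python
def apply_gains(real_dx, real_dy, real_dyaw_deg, trans_gain=1.2, rot_gain=0.9):
    """Map one frame of tracked real-world movement to virtual movement.

    real_dx, real_dy: translation (metres) since the last frame
    real_dyaw_deg: rotation (degrees) since the last frame
    trans_gain, rot_gain: scaling factors between real and virtual motion
    """
    virt_dx = real_dx * trans_gain        # virtual translation is scaled...
    virt_dy = real_dy * trans_gain
    virt_dyaw_deg = real_dyaw_deg * rot_gain  # ...and so is virtual rotation
    return virt_dx, virt_dy, virt_dyaw_deg
```

    With a rotation gain of 0.9, for instance, a real 90° turn becomes an 81° virtual turn, so the user must physically turn a bit further to complete a virtual rotation.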


    What’s more, the opposite is also possible: suppose you walk in a straight line in the VE; we could have you walk along a curve in reality!

    At VRST 2008, Frank Steinicke from the University of Muenster and David Engel from the Max Planck Institute in Tuebingen each presented a paper on this topic: “Analyses of Human Sensitivity to Redirected Walking” and “A Psychophysically Calibrated Controller for Navigating through Large Environments in a Limited Free-Walking Space”.


    Dr. Steinicke’s presentation focused on evaluating the limits of redirected walking, whereas Mr. Engel’s work is about dynamically changing the translation and rotation gains to fit the user’s path in a known environment.

    I’ve asked them both a few questions about their work:

    > What are your background, interests and goals with this work?

    Frank Steinicke: My research interests include human-computer interaction, with special consideration of VR, perception and cognition in computer-generated environments and visualizations. The goal of this project is to provide intuitive interfaces for exploring VEs. As a matter of fact, the most natural way of locomotion in the real world is walking. So, from a computer graphics perspective, I am very interested in the question of how real walking through CG environments can be realized when only a limited laboratory environment is available. I’m fascinated by the psychophysical phenomena which make it possible to trick users, for example in such a way that they unknowingly move on a path in the real world that differs from the path they perceive in the VE. Of course, we want to know how much we can trick users without them noticing discrepancies.

    David Engel: I studied computer science in Tübingen, and one of my main interests has always been virtual environments. I first came into contact with redirected walking during a presentation at APGV 2007. I found the idea of being able to explore infinite virtual worlds, like the Holodeck from Star Trek, very compelling.

    > Where does this idea come from?

    Frank: Redirected walking (RDW) techniques have been used for a few years now. The main idea of redirected walking was introduced by Sharif Razzaque et al. from UNC Chapel Hill. RDW is based on the fact that the visual sense dominates the proprioceptive and vestibular senses. This was shown much earlier, for example by Alain Berthoz (who gave the keynote talk at VRST).

    David: The idea of determining the redirection factors dynamically emerged from the problems I encountered during my early implementations of redirected walking. As soon as the paths through the virtual environment got longer, the errors in the predicted positions accumulated to the point where users kept running into the walls.


    > What are, for you, the most interesting conclusions of your paper?

    Frank: We have identified detection thresholds up to which humans can be redirected without perceiving any discrepancies. For example, we know that we can guide them along a circle with a radius of approximately 23 meters while they believe they are walking straight in the VE. This guidance can be realized by injecting small rotations to one side, which cause users to unknowingly walk on a circular arc in the opposite direction.
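    To make that number concrete: keeping someone on a circle of radius r only requires injecting a yaw of (step length)/r radians each frame. A hypothetical helper (the function name and frame-rate figures are mine, not from the paper):

```python
import math

def injected_yaw_deg(step_length_m, radius_m=23.0):
    """Yaw (degrees) to inject this frame so that a user walking
    'straight' in the VE actually follows a real-world circle of the
    given radius; 23 m is the threshold radius quoted above."""
    return math.degrees(step_length_m / radius_m)
```

    Walking at 1 m/s with 60 tracker updates per second, this comes out to roughly 0.04° injected per frame, far too small to notice consciously.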

    Until now, we have only considered basic walking techniques such as forward movements and rotations. I think these concepts can be adapted to any kind of motion, such as walking on slopes and strafing motions. Furthermore, we have not addressed adaptation, which raises the question of how far humans might adapt to redirected walking. Such adaptation has been considered before, for example with left–right reversed vision, but not in the context of VR-based environments.

    David: With a dynamic optimization approach based on the minimization of a cost function, we gain a large amount of robustness against deviations from the predefined path and can introduce much flexibility into the redirected walking approach. By adjusting the terms of the cost function, we can adapt to new boundary conditions such as the current user’s sensitivity profile to redirection factors, route choices and multiple users.

    On the psychophysical side, our next step will be to evaluate user performance in navigation tasks when redirection techniques are applied. On the technical side, we plan to support a more natural exploration of the virtual environment by allowing more complex and branching paths.
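    As a very loose sketch of the cost-function idea (the terms, weights and candidate gains here are invented for illustration and are not the paper’s actual controller):

```python
def choose_gains(dist_to_wall_m,
                 candidates=((0.8, 0.8), (1.0, 1.0), (1.2, 1.2)),
                 w_space=1.0, w_percept=0.5):
    """Toy stand-in for cost-based gain selection.

    The real controller predicts the user's future path; here the 'space'
    term simply prefers gains that need less real-world distance when a
    wall is close (real metres per virtual metre = 1 / translation gain),
    and the 'percept' term penalizes gains far from the natural 1:1 mapping.
    """
    def cost(tg, rg):
        space = (1.0 / tg) / max(dist_to_wall_m, 0.1)
        percept = abs(tg - 1.0) + abs(rg - 1.0)
        return w_space * space + w_percept * percept
    # Pick the (translation_gain, rotation_gain) pair with minimal cost.
    return min(candidates, key=lambda g: cost(*g))
```

    With these made-up weights, the controller drifts toward stronger redirection as the wall gets closer and relaxes back to a 1:1 mapping when there is plenty of room.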

    > To what extent do you think we will be able to fool the body by taking advantage of its perception limitations? (including other senses than proprioception)

    Frank: The same concepts can be applied, for instance, to haptic feedback. Anatole Lecuyer from INRIA has shown that haptic feedback can be induced by visual stimuli, and Luv Kohli has examined how much haptic feedback may vary from the visual stimuli. Another topic might be the question of how far time can be compressed or stretched in the virtual world, and how the perception of time can be changed.

    David: The sensitivity of the perceptual system to sensory conflicts seems to depend largely on the amount of congruency between the modalities (e.g. whether auditory cues are collocated with the real or the virtual world) and the attention given to the different modalities. The extent to which perception can be fooled might therefore be very task-dependent. In the short term, the use of congruent multi-modal input and distractor tasks should allow for a much wider range in which the user can be fooled. In the long term, redirected walking techniques could become more sophisticated by taking into account the user’s sensitivities to cue conflicts at different points during the gait cycle. Combining such ideas, we should be able to overcome the spatial limitations of the available facilities.

    Thank you both for your time and interesting work!