    Thu 24 Apr 2008

    Sivic – VR Crime scene investigation

    Published at 12:44   Category VR Applications  

    Here’s a video of the Sivic application made by ESCIN students and presented at Laval Virtual 2008. It was created in collaboration with the French police to train officers in crime scene investigation.

    [Embedded YouTube video]
    Wed 16 Apr 2008

    Laval Virtual 2008 – How Virtual is VR to your brain?

    Published at 9:24   Category VR Applications  

    Lutz Jancke, from the Neuropsychology Lab of the University of Zurich, gave an amazing presentation on how real VR is to our brain. The short answer: to the brain, VR is just another reality. The brain only interprets inputs from its senses and experience, and VR can provide inputs realistic enough to fool it. Moreover, VR experiences can shape your brain!

    The brain evolves throughout your life

    The human brain is highly constructive: it builds reality from the inputs it gets from the different senses. Our perception of the world is a matter of the brain interpreting these different inputs.

    Studies have been conducted on twins raised in separate families to determine the influence of genes on intelligence. It turns out that at most 50% of intelligence comes from genes, which means that at least 50% comes from experience!

    It also appears that much of the grey matter that makes us human (visual sense, language, etc.) is not determined by genes. In fact the brain is largely built to learn: it is a giant learning machine, able to learn throughout a human lifetime. All your life, your brain is restructured based on what you do and what you train at, for example music or juggling. But your capacities also decrease when you stop practicing.

    Grey matter density can increase in elderly people too. Aging doesn’t prevent learning; the cognitive aspects of learning in the elderly are comparable to those in young people.

    Emotions

    There’s a zone in your brain that is a strong indicator of presence in a VE: the right dorsolateral prefrontal cortex (DLPFC). The more strongly this zone is activated, the less present you feel. This zone in fact controls your limbic system, which is responsible for your emotions. So if you feel emotions, the DLPFC will try to control them. And if you feel emotions, it means you feel present in the world.

    But the human DLPFC matures very late, so children are not able to control their emotions. Kids show strong emotions, can’t control their pleasure, and are at risk of getting addicted to anything, games or VR for example!

    That’s why parents have to play the role of the DLPFC by training kids to restrain themselves; we have to replace the DLPFC with authority.

    It also seems that the prefrontal cortex, responsible for the control of behaviour, self-discipline and motivation, only matures at around 20 years old, which would explain the behavioural problems of teenagers. (Wow, I should tell that to my mum!)

    It’s also one of the first brain areas to degenerate with age if it’s not properly activated with specific tasks.

    Presence

    Studies have confirmed what we intuitively already knew: even on a simple screen, a first-person view is more immersive than a third-person one.

    Another experiment was conducted with a driving simulator. During the ride, a deer or a child would jump into the street in front of the car. It turns out that people never got used to the child jumping into the street, showing that the brain reacts as if the situation were real!

    VR training and rehabilitation

    VR has huge potential for training and rehabilitation, even if you only consider the motivational side: because VR is more interesting and fun, people actually want to attend the training or rehabilitation sessions, whereas they get bored of traditional sessions and stop showing up.

    Motivation is the most important factor in learning, and VR is very motivational!

    Conclusion

    The brain constructs reality. It is remarkably plastic and matures late, and for it, VR can be real.

    The brain can also be shaped by VR experience.

    Mr. Jancke states that The Matrix is completely possible: reality is already a virtual world. We interpret reality through the lens of our experience, and, if properly created, a virtual environment can seem very real to the brain, with all the positive and negative possibilities this offers its creator.

    So let’s use that great possibility for the better!

    “I know kung-fu!”

    Tue 15 Apr 2008

    Laval Virtual 2008 – Tradeshow

    Published at 13:40   Category Virtual Reality  

    This year Laval Virtual was celebrating its 10th anniversary.

    The tradeshow is getting bigger, and there was not enough room for all the exhibitors who wanted a booth!

    There were not many technical innovations, rather evolutions of existing principles. This tends to support the point I was trying to make last year: we are not making the most of the current hardware.

    There were far fewer HMDs than last year, as if non-intrusive displays (e.g. autostereoscopic screens or projectors) were now preferred.

    I’ll probably make another post concerning the new hardware I’ve seen here and at IEEE VR later.

    Read more…

    Sat 12 Apr 2008

    Virtual Reality for condom use study

    Published at 10:22   Category VR Applications  

    Back from Laval! Lots of things to tell and show you, but that will require some time.

    In the meantime, this article talks about a study whose “goal is to see if risky sexual decisions are based on environmental cues or personality to determine the best education approach.”

    By WJBC’s Colleen Reynolds

    An Illinois Wesleyan University assistant professor has landed the college’s largest-ever grant. It’ll be used to create a virtual reality program to research what influences a person’s decision to use a condom.
    The $1.2 million dollar grant from the National Institutes of Health will be used to place people in virtual social situations and study their reactions to different variables, such as a party’s atmosphere, a potential mate’s appearance, or geographic availability, to see whether they influence safer sex-related decisions. Assistant Psychology Professor Natalie Smoak says the research could have practical applications, especially in this world of Internet dating.

    Not much info about the technical side; I just hope the condoms are real ;)

    Mon 7 Apr 2008

    Going to Laval Virtual 2008

    Published at 23:07   Category Virtual Reality  

     See you there ;)

    Tue 1 Apr 2008

    IEEE VR 2008, AR/MR

    Published at 18:01   Category Augmented Reality  

    The IEEE VR conference proper ran from Monday 10th to Wednesday 12th March 2008. See the complete program.

    It was harder to take notes because there were no power plugs in this room and the Wi-Fi was struggling, so sorry if this report is less complete. Moreover, we had to man the Virtools booth during the breaks, which was pretty exhausting and caused us to miss some interesting presentations.

    The first session was about Augmented/Mixed Reality, and even during the other sessions there was a lot of talk about AR! I wonder if this is because AR is trendy right now, but I’m pretty sure there is still a lot to research in pure VR! Not that AR isn’t interesting, and of course there’s a lot of common ground, but it’s not my main field of interest. So maybe there should be an IEEE AR, or a full AR session where all the AR-specific topics are discussed? Like marker/camera tracking, AR displays, AR applications, etc.

    An impressive tracking system based on visual/inertial fusion was presented by Gabriele Bleser and Didier Stricker (“Advanced tracking through efficient image processing and visual-inertial sensor fusion”). It is very robust and didn’t seem to have any visible latency, but it requires a textured CAD model of the environment in order to use an illumination model. Everyone in the room was stunned, and I believe it’s the only time the audience spontaneously applauded in the middle of a presentation!
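    To give an intuition of what “visual-inertial fusion” means, here is a toy complementary filter in Python. This is only my own minimal sketch of the general principle, not Bleser and Stricker’s algorithm (which additionally relies on a textured CAD model and an illumination model); all the numbers and names below are made up.

```python
# Toy complementary filter illustrating visual-inertial fusion:
# the gyroscope gives smooth, low-latency but drifting orientation,
# while the vision-based estimate is drift-free but noisier/slower.
def fuse_orientation(prev_angle, gyro_rate, visual_angle, dt, alpha=0.98):
    """Fuse one gyro integration step with a visual orientation measurement (degrees)."""
    predicted = prev_angle + gyro_rate * dt                 # dead reckoning from the gyro
    return alpha * predicted + (1 - alpha) * visual_angle   # visual estimate corrects the drift

# Hypothetical stream of (gyro rate [deg/s], visual angle [deg]) samples at 100 Hz.
angle = 0.0
for gyro_rate, visual_angle in [(10.0, 0.2), (10.0, 0.3), (9.5, 0.5)]:
    angle = fuse_orientation(angle, gyro_rate, visual_angle, dt=0.01)
print(angle)  # the fused estimate follows the gyro short-term and the camera long-term
```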

    Here are some summaries of papers.

    Massively Multiplayer Online World as Platform for AR experiences

    by Lang, MacIntyre, Jamard.

    We saw a prototype of an augmented reality interface to Second Life: you can see SL avatars in the real world, which is really nice! Even better, an avatar inside SL can record its performance in augmented reality and then watch that video inside SL! This mixing of the real and virtual worlds makes me dizzy! That’s a really nice application.

    See http://arsecondlife.gvu.gatech.edu

    [Embedded YouTube video]

    Providing a wide field of view for AR

    by Seokhee Jeon, Gerard J. Kim.

    This paper was about improving the field of view for desktop AR in order to improve usability. By putting the camera on the user’s head and mosaicing (stitching) the views, the system can provide a proprioceptive match between the real and the augmented world.
    Displaying the whole interaction space gives better performance and usability, and it also reduces search time and cognitive load.
    But it seems that placing the camera in a fixed location close to the user’s head still gives better results. Stitching errors, mainly due to motion blur, were the users’ main concern, but the mosaicing system can still be a good alternative when the camera cannot be fixed in the proper place, or once the stitching is improved.
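    For readers wondering what “mosaicing” looks like in practice, here is a minimal, generic sketch using OpenCV’s built-in stitcher. This is not the authors’ implementation, just an illustration of merging several overlapping camera views into one wider image; the file names are hypothetical.

```python
import cv2

# Hypothetical overlapping frames grabbed from a head-worn camera.
frames = [cv2.imread(name) for name in ("view_left.jpg", "view_center.jpg", "view_right.jpg")]

stitcher = cv2.Stitcher_create()          # OpenCV's built-in panorama stitcher
status, mosaic = stitcher.stitch(frames)  # estimates homographies and blends the views

if status == 0:  # Stitcher::OK
    cv2.imwrite("mosaic.jpg", mosaic)     # wider field of view than any single frame
else:
    print("Stitching failed (e.g. not enough overlap or too much motion blur):", status)
```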

    Capturing images with sparse informational pixels using projected 3d tags

    by Li Zhang, Neesha Subramaniam, Robert Lin, Ramesh Raskar, Shree Nayar.

    The goal of this team is to have tags in the real world that can be read by phone cameras.
    Some challenges in barcode recognition are tag distance and inclination, and environment illumination.
    Moreover, it requires attaching physical tags to the surface.
    What is proposed here is to project optical tags with a projector. This makes it possible to use a temporal pattern, or a spatio-temporal pattern, instead of a purely spatial one. Moreover, the tags are projected in the infrared spectrum, so they are not visible to the human eye or to a regular camera.
    These tags make it possible to get information about real objects on cheap phone cameras with very limited computation power; a toy sketch of reading such a temporal pattern follows below.

    See http://www1.cs.columbia.edu/CAVE/projects/photo_tags/
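    To make the “temporal pattern” idea concrete, here is a toy Python sketch that reads a blinking tag as a sequence of bits from per-frame brightness samples. This is only my own illustration of the principle, not the encoding actually used by the authors; the sample values are invented.

```python
import numpy as np

def decode_temporal_tag(intensities, threshold=None):
    """Decode the brightness of one projected spot over consecutive frames into a tag ID."""
    samples = np.asarray(intensities, dtype=float)
    if threshold is None:
        threshold = samples.mean()            # crude adaptive threshold
    bits = (samples > threshold).astype(int)  # bright frame -> 1, dark frame -> 0
    return int("".join(map(str, bits)), 2)    # pack the bit sequence into an integer ID

# Example: brightness of the tagged spot in 8 successive frames (made-up values).
frames = [12, 200, 190, 15, 210, 10, 14, 205]
print(decode_temporal_tag(frames))  # -> 0b01101001 = 105
```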

    Hear-through AR: using bone conduction to deliver spatial audio

    by Robert W. Lindeman, Haruo Noma, Paulo Gonçalves de Barros.

    See http://www.cs.wpi.edu/~gogo/hive

    The goal is to augment reality for the auditory sense, not only visually as in traditional AR. This requires occlusion and reflection between computer-generated and real-world sound. Why use a bone conduction headset? Because the real world is not occluded or modified in any way. But headphone accuracy is probably better. So a lot of work still has to be done to improve the system, both on the sound generation and on the hardware, but the results of the study are very interesting.

    Tue 1 Apr 2008

    IEEE VR 2008, Training and Virtual Humans Session

    Published at 18:01   Category VR Applications, Virtual Reality  

    The IEEE VR conference proper ran from Monday 10th to Wednesday 12th March 2008. See the complete program.

    [Photo gallery: ieee_vr_2008 – Bigger Size Gallery]

    Training

    Again, for me one of the most interesting applications of AR/VR is training. See also Virtual Humans below.

    In the paper “Mixed Reality Merges Abstract and Concrete Knowledge” by John Quarles, Samsun Lampotang, Ira Fischler, Paul Fishwick and Benjamin Lok, from the University of Florida, we saw that Augmented Reality helps… merge the abstract and the concrete. That means it’s perfect for training on real, complex machines, by adding abstract information that helps you understand the underlying mechanics. When the user operates the physical machine, the virtual model is also updated, and you can see the updated augmented information to understand the results of your actions.

    Virtual Humans

    I really enjoyed the Virtual Humans session, and particularly the “Virtual Human + Tangible Interface = Mixed Reality Human” paper by Aaron Kotranza and Benjamin Lok from the University of Florida. Here’s the abstract of the paper:

    Virtual human (VH) experiences are receiving increased attention for training real-world interpersonal scenarios. Communication in interpersonal scenarios consists of not only speech and gestures, but also relies heavily on haptic interaction – interpersonal touch. By adding haptic interaction to VH experiences, the bandwidth of human-VH communication can be increased to approach that of human-human communication.
    To afford haptic interaction, a new species of embodied agent is proposed – mixed reality humans (MRHs). A MRH is a virtual human embodied by a tangible interface that shares the same registered space. The tangible interface affords the haptic interaction that is critical to effective simulation of interpersonal scenarios. We applied MRHs to simulate a virtual patient requiring a breast cancer screening (medical interview and physical exam). The design of the MRH patient is presented. This paper also presents the results of a pilot study in which eight (n = 8) physician-assistant students performed a clinical breast exam on the MRH patient. Results show that when afforded haptic interaction with a MRH patient, users demonstrated interpersonal touch and social engagement similarly to interacting with a human patient.

    What this all means is that interaction with virtual humans helps change real-life behaviour in the real world. Self-perception is changed. Mixed reality humans, by adding the dimension of touch, give the trainee more empathy and social engagement through interpersonal touch. Touch drives the interaction. Users interacted with the MRH as if it were a real human, so what they learn in this training is directly transferable to the real world.

    Honors

    As for the honors, Bernd Fröhlich got the VR Technical Achievement Award, and Bowen Loftin got the VR Career Award.