Lutz Jancke, from the Neuropsychology lab of the University of Zurich, gave an amazing presentation about how real VR is to our brain. The short answer: for the brain, VR is just another reality. The brain only interprets inputs from its senses and its experience, and VR provides inputs that can be realistic enough to fool it. Moreover, VR experiences can shape your brain!
The brain evolves throughout your life
The human brain is highly constructive: it builds reality from the input it gets from the different senses. Our perception of the world is a matter of the brain's interpretation of these inputs.
Studies have been conducted on twins who were fostered separately to determine the influence of genes on intelligence. It turns out that at most 50% of your intelligence comes from your genes; the other 50% comes from experience!
It also appears that much of the grey matter that makes us human (the visual sense, language, etc.) is not determined by genes. In fact, the brain is largely prepared to learn: it is a giant learning machine, able to learn during a human's whole lifetime. Throughout your life, your brain is restructured based on what you do and what you train at, for example music or juggling. But your capacities also decrease when you stop practicing.
Grey matter density can increase in elderly people too. Aging doesn't prevent learning; the cognitive aspects of learning in the elderly are comparable to those of young people.
There's a zone in your brain that is a strong indicator of presence in a virtual environment: the right dorsolateral prefrontal cortex (DLPFC). The more strongly this zone is activated, the less present you feel. This zone in fact controls your limbic system, which is responsible for your emotions. So if you feel emotions, the DLPFC will try to control them; and if you feel emotions, it means you feel present in the virtual world.
But the DLPFC of a human matures very late, so children are not able to control their emotions. Kids show strong emotions, can't control their pleasure, and are in danger of getting addicted to anything, games or VR for example!
That's why parents have to play the role of the DLPFC by training kids to restrain themselves; we have to replace the DLPFC with authority.
It also seems that the prefrontal cortex, responsible for the control of behavior, self-discipline and motivation, only fully matures around age 20, which would explain the behavioral problems of teenagers. (Wow, I should tell that to my mum!)
It's also one of the first brain areas to degenerate with age if it isn't properly activated with specific tasks.
Studies have proved what we already knew intuitively: even on a simple screen, a first-person view is more immersive than a third-person one.
Another experiment was conducted with a driving simulator. During the ride, a deer or a child would jump into the street in front of the car. It turns out that people never got used to the child jumping into the street, showing that the brain treats the situation as if it were real!
VR training and rehabilitation
VR has a huge potential for training and rehabilitation, even if you only consider the motivational aspect: because VR is more interesting and fun, people want to attend training or rehabilitation sessions, whereas they get bored at traditional sessions and stop showing up.
Motivation is the most important factor in learning, and VR is very motivating!
The brain constructs reality. It is remarkably plastic and matures late, and to it, VR can be real.
The brain can also be shaped by VR experiences.
Mr. Jancke states that The Matrix is completely possible: reality is already a virtual world. We interpret reality through the lens of our experience, and a properly created virtual environment can seem very real to the brain, with all the positive and negative possibilities this offers to its creator.
So let’s use that great possibility for the better!
“I know kung-fu!”
This year Laval Virtual was celebrating its 10th anniversary.
The tradeshow is getting bigger and there was not enough room for all the exhibitors that wanted a booth!
There were not many technical innovations, rather evolutions of existing principles. This tends to confirm the point I was trying to make last year: we are not making the best of the current hardware.
There were far fewer HMDs than last year, as if non-intrusive displays (e.g. autostereoscopic screens or projectors) were now preferred.
I’ll probably make another post concerning the new hardware I’ve seen here and at IEEE VR later.
Read on for more …
Back from Laval! Lots of things to tell and show you, but that will require some time.
By WJBC’s Colleen Reynolds
An Illinois Wesleyan University assistant professor has landed the college’s largest-ever grant. It’ll be used to create a virtual reality program to research what influences a person’s decision to use a condom.
The $1.2 million grant from the National Institutes of Health will be used to place people in virtual social situations and study their reactions to different variables, such as a party's atmosphere, a potential mate's appearance, or geographic availability, to see whether they influence safer sex-related decisions. Assistant Psychology Professor Natalie Smoak says the research could have practical applications, especially in this world of Internet dating.
Not much info about the technical side; I just hope the condoms are real!
The main IEEE VR conference took place from Monday, March 10th to Wednesday, March 12th, 2008. See the complete program.
It was harder to take notes because there was no power plug in this room, and the Wi-Fi was struggling, so sorry if this report is less complete. Moreover, we had to man the Virtools booth during the breaks, which was pretty exhausting and caused us to miss some interesting presentations.
The first session was about Augmented/Mixed Reality, and even during the other sessions there was a lot of talk about AR! I wonder if this is because AR is trendy right now, but I'm pretty sure there is still a lot to research in pure VR! Not that AR isn't interesting, and of course there's a lot of common ground, but it's not my main field of interest. So maybe there should be an IEEE AR, or a full AR session where all the AR-specific topics are discussed: marker/camera tracking, AR displays, AR applications, etc.
An impressive tracking system based on visual/inertial fusion was presented by Gabriele Bleser and Didier Stricker (“Advanced tracking through efficient image processing and visual-inertial sensor fusion”). It is very robust and didn't seem to have any visible latency, but it requires a textured CAD model of the environment in order to use an illumination model. Everyone in the room was stunned, and I believe it's the only time the audience spontaneously applauded in the middle of a presentation!
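The paper's actual pipeline (image processing, illumination model, textured CAD model) is far more sophisticated, but the core idea of visual-inertial fusion can be illustrated with a minimal complementary filter: the gyroscope provides smooth, low-latency motion, while the slower but drift-free vision estimate keeps cancelling the accumulated drift. The function name and the `alpha` gain below are my own illustrative choices, not from the paper.

```python
def fuse_orientation(gyro_rate, vision_angle, prev_angle, dt, alpha=0.98):
    """One complementary-filter step for a single orientation angle (rad).

    gyro_rate    -- angular velocity from the inertial sensor (rad/s)
    vision_angle -- absolute, drift-free angle from the vision tracker (rad)
    prev_angle   -- previous fused estimate (rad)
    dt           -- time step (s)
    """
    predicted = prev_angle + gyro_rate * dt          # integrate the gyro
    # Blend: trust the gyro short-term, let vision slowly cancel the drift.
    return alpha * predicted + (1.0 - alpha) * vision_angle
```

Run at the gyro rate (e.g. 100 Hz): even if the vision estimate updates more slowly, repeatedly blending it in pulls the fused angle back toward the drift-free value.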
Here are some summaries of papers.
Massively Multiplayer Online World as Platform for AR experiences
by Lang, MacIntyre, Jamard.
We saw a prototype of an augmented reality interface to Second Life: you could see SL avatars in the real world, which is really nice! Even better, an avatar inside SL can record its performance in augmented reality and watch that video inside SL! This mixing of the real and virtual worlds makes me dizzy!! It's a really nice application.
Providing a wide field of view for AR
The first paper was about improving the field of view used for desktop AR to improve usability. By putting the camera on the user’s head and mosaicing (stitching) the views, the system can provide a proprioceptive match between the real and augmented world.
Displaying the whole interaction space shows better performance and usability, and it also reduces searching time and cognitive load.
But it seems that placing the camera at a fixed location close to the user's head works better. The stitching errors, mainly due to motion blur, were the users' main concern, but the mosaicing system can still be a good alternative when the camera cannot be fixed at the proper place, or once the stitching is improved.
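Mosaicing camera views like this usually boils down to estimating planar homographies between overlapping frames and warping them into a common view. The paper's implementation isn't detailed here, so this is just a generic sketch of the standard Direct Linear Transform (DLT), assuming point correspondences between two frames are already known:

```python
import numpy as np

def estimate_homography(src, dst):
    """Estimate the 3x3 homography H with dst ~ H * src (homogeneous),
    from N >= 4 point correspondences (arrays of shape (N, 2)), via DLT."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        # Each correspondence contributes two linear equations in H's entries.
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # H (flattened) is the null vector of the system, i.e. the right
    # singular vector associated with the smallest singular value.
    _, _, Vt = np.linalg.svd(np.asarray(rows, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]          # fix the arbitrary scale
```

In a real mosaicing system the correspondences come from feature matching, and motion blur is exactly what makes those matches unreliable, hence the stitching errors mentioned above.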
Capturing images with sparse informational pixels using projected 3d tags
The goal of this team is to place tags in the real world that can be read by phone cameras.
Some challenges in barcode recognition are tag distance, inclination, and environmental illumination.
Moreover, it requires attaching physical tags to the surface.
What is proposed here is to project optical tags with a projector. This makes it possible to use a temporal pattern, or a spatio-temporal pattern, instead of a purely spatial one. Moreover, the tags are projected in the infrared spectrum, so they are not visible to the human eye or to a regular camera.
These tags make it possible to get information about real objects with cheap phone cameras that have very limited computing power.
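The paper's exact coding scheme isn't given above, but the idea of a temporal tag is simple: instead of packing all bits into one spatial image, each projector frame carries one bit, and the camera recovers the ID by sampling the same pixel over successive frames. Below is a toy sketch with a made-up 3-bit sync preamble (not the authors' encoding):

```python
START = [1, 0, 1]  # hypothetical sync preamble marking the start of a tag

def encode_tag(tag_id, n_bits=8):
    """Temporal pattern: one bit per projected frame, MSB first."""
    bits = [(tag_id >> i) & 1 for i in reversed(range(n_bits))]
    return START + bits

def decode_tag(samples, n_bits=8, threshold=0.5):
    """Recover a tag id from per-frame intensity samples at one pixel."""
    bits = [1 if s > threshold else 0 for s in samples]
    for i in range(len(bits) - len(START) - n_bits + 1):
        if bits[i:i + len(START)] == START:            # synchronize
            payload = bits[i + len(START):i + len(START) + n_bits]
            return sum(b << (n_bits - 1 - j) for j, b in enumerate(payload))
    return None                                        # no tag found
```

A real scheme would need a preamble that cannot occur inside the payload (and some error correction), but even this sketch shows why the phone-side computation stays cheap: per frame, it is just a threshold and a table lookup.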
Hear-through AR: using bone conduction to deliver spatial audio
The goal is to augment reality for the auditory sense, not only visually as in traditional AR. This requires handling occlusion and reflection between computer-generated and real-world sounds. Why use a bone-conduction headset? Because the real-world sound is not occluded or modified in any way. Headphone accuracy is probably better, though. So a lot of work still has to be done to improve the system, both on the sound generation and on the hardware, but the results of the study are very interesting.
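The paper's rendering pipeline isn't described above, but the basic cues any spatial audio system must produce are the interaural time and level differences. Here is a toy sketch (my own simplification, using the classic Woodworth ITD approximation, not the authors' method or anything bone-conduction specific):

```python
import numpy as np

def spatialize(mono, azimuth_deg, fs=44100, head_radius=0.0875, c=343.0):
    """Pan a mono signal to stereo using an interaural time difference
    (Woodworth approximation) plus a crude fixed level difference."""
    az = abs(np.radians(azimuth_deg))
    itd = head_radius / c * (az + np.sin(az))          # seconds
    delay = int(round(itd * fs))                       # in samples
    near = mono                                        # ear facing the source
    far = np.concatenate([np.zeros(delay), mono])[:len(mono)] * 0.7
    if azimuth_deg >= 0:           # source on the right: right ear is near
        left, right = far, near
    else:
        left, right = near, far
    return np.stack([left, right], axis=1)             # (N, 2) stereo
```

Real systems use full head-related transfer functions rather than a single delay and gain, but the delayed, attenuated far-ear channel is already enough to make a sound appear to come from one side.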
Again, for me one of the most interesting applications of AR/VR is training. See also Virtual Humans below.
In the paper “Mixed Reality Merges Abstract and Concrete Knowledge” by John Quarles, Samsun Lampotang, Ira Fischler, Paul Fishwick and Benjamin Lok, from the University of Florida, we saw that Augmented Reality helps... merge abstract and concrete knowledge. This means it's perfect for training on real, complex machines: abstract information is overlaid that helps understand the underlying mechanics. When the user operates the physical machine, the virtual model is also updated, and you can see the updated augmented information to understand the results of your actions.
I really enjoyed the Virtual Humans sessions, and particularly the “Virtual Human + Tangible Interface = Mixed Reality Human” paper by Aaron Kotranza and Benjamin Lok from University of Florida. Here’s the abstract of the paper :
Virtual human (VH) experiences are receiving increased attention for training real-world interpersonal scenarios. Communication in interpersonal scenarios consists of not only speech and gestures, but also relies heavily on haptic interaction – interpersonal touch. By adding haptic interaction to VH experiences, the bandwidth of human-VH communication can be increased to approach that of human-human communication.
To afford haptic interaction, a new species of embodied agent is proposed – mixed reality humans (MRHs). A MRH is a virtual human embodied by a tangible interface that shares the same registered space. The tangible interface affords the haptic interaction that is critical to effective simulation of interpersonal scenarios. We applied MRHs to simulate a virtual patient requiring a breast cancer screening (medical interview and physical exam). The design of the MRH patient is presented. This paper also presents the results of a pilot study in which eight (n = 8) physician-assistant students performed a clinical breast exam on the MRH patient. Results show that when afforded haptic interaction with a MRH patient, users demonstrated interpersonal touch and social engagement similarly to interacting with a human patient.
What this all means is that interaction with virtual humans helps change behavior in the real world: self-perception is changed. Mixed reality humans, by adding the dimension of touch, give the trainee more empathy and social engagement through interpersonal touch. Touch drives the interaction. Users interacted with the MRH as if it were a real human, so what they learn in this training is directly transferable to the real world.