Ever want to control something with your mind? Rearrange your bedroom just by thinking about it? The University of Michigan 3D Lab is combining EEG technology with full-body gestures to bring a new level of immersion to interactive applications. Using the Emotiv EPOC headset, one can read real-time EEG signals from 14 sensor locations on the scalp. This information lets special software tell whether you are smiling, stressed out, or thinking about grapes. The UM3D Lab then uses these detections to trigger events such as movement through a virtual world or interactions with 3D objects in its fully immersive virtual reality D.E.N. (http://www.youtube.com/watch?v=G3tgc4-pibc)
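At its core, the setup boils down to mapping a detected state (a smile, a trained "push" thought) to an event in the virtual world. A minimal sketch of that mapping, with hypothetical state names and event names (a real integration would receive the detections and confidence scores from Emotiv's own software):

```python
# Hedged sketch: route detected mental/facial states to virtual-world
# events. State and event names here are illustrative assumptions, not
# the Emotiv SDK's actual vocabulary.

EVENT_MAP = {
    "smile": "open_door",         # facial expression detection
    "stress": "pause_simulation", # affective state detection
    "push": "move_forward",       # trained cognitive action
}

def handle_detection(state, confidence, threshold=0.7):
    """Return the mapped event if the detection is confident enough."""
    if confidence >= threshold and state in EVENT_MAP:
        return EVENT_MAP[state]
    return None  # unknown state, or too noisy to act on
```

The confidence threshold matters in practice: raw EEG classification is noisy, so gating events on a minimum confidence keeps the avatar from twitching on every stray signal.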
In addition to reading your mind, the Microsoft Kinect is used to capture gestures and body movements without reflective markers or skin-tight lycra suits. Traditionally, video cameras captured only 2D information; the camera had no idea which objects were closer than others. Using a depth-sensing camera, the Kinect can do simple motion capture and figure out where the user's hands, arms, etc. are in 3D space. This information can then be used to track gestures, trigger events, and let the user interact with a virtual world much more naturally. (http://www.youtube.com/watch?v=XMl4Q2smpPk) Much like the Emotiv headset, these high-level natural user interfaces (NUIs) allow easier interaction with complex virtual worlds and even open doors to those with limited physical ability.
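Once joints are located in 3D space, simple gestures fall out of geometry. A toy sketch of a "hand raised" check on skeleton data of the kind a depth camera produces (the joint names and y-up, metres coordinate convention are assumptions for illustration, not the Kinect SDK's actual API):

```python
# Hedged sketch: detect a raised-hand gesture from 3D joint positions.
# A depth camera like the Kinect yields per-joint (x, y, z) coordinates;
# here we just compare hand height to head height.

from typing import Dict, Tuple

Joint = Tuple[float, float, float]  # (x, y, z) in metres, y pointing up

def hand_raised(skeleton: Dict[str, Joint], margin: float = 0.10) -> bool:
    """True if either hand is at least `margin` metres above the head."""
    head_y = skeleton["head"][1]
    return any(skeleton[hand][1] > head_y + margin
               for hand in ("left_hand", "right_hand"))
```

A gesture like this could then feed the same event-triggering layer as the EEG detections, so a raised hand and a trained thought are interchangeable ways to act on the virtual world.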