The basic idea of augmented reality is to superimpose graphics, audio and other sensory enhancements over a real-world environment in real time. Augmented reality blurs the line between what’s real and what’s computer generated by enhancing what we see, hear, feel and smell. Unlike virtual reality, which replaces the natural world, augmented reality adds to it, enhancing one’s perception of reality. While video games and cell phones have driven much of augmented reality’s development, other industries have recently begun exploring the technology as well. Augmented reality is changing the way we view the world, and devices like SixthSense are making it a reality for many.
Some of the most exciting augmented-reality work is taking place in research labs at universities around the world. At the TED conference, Pattie Maes and Pranav Mistry presented their augmented-reality system, which they call SixthSense. The SixthSense device turns any surface into an interface that you can physically interact with. You can watch a video, surf the internet, or make a phone call on virtually any surface, wherever and whenever you want. The prototype even lets you take a photograph by simply holding your hands in the air and making a “framing” gesture. In short, SixthSense is a wearable gestural interface that augments the physical world around us with digital information and lets us use natural hand gestures to interact with that information.
The SixthSense prototype consists of a pocket projector, a mirror and a camera, coupled in a pendant-like wearable mobile device. The projector and the camera are connected to a mobile computing device in the user’s pocket. The projector projects visual information, enabling surfaces, walls and physical objects to be used as interfaces, while the camera recognizes and tracks the user’s hand gestures and physical objects using computer-vision-based techniques. The software processes the video stream captured by the camera and tracks the locations of colored markers (visual tracking fiducials) on the tips of the user’s fingers using simple computer-vision techniques. The movements and arrangements of these fiducials are interpreted as gestures that act as interaction instructions for the projected application interfaces. The maximum number of tracked fingers is constrained only by the number of unique fiducials, so SixthSense also supports multi-touch and multi-user interaction.
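The fiducial-tracking step described above can be sketched in a few lines. The following is a minimal illustration, not the actual SixthSense code: it assumes each marker has a known, distinctive color and locates it in an RGB frame by simple color thresholding and centroid computation. The function name `track_marker` and the tolerance value are hypothetical.

```python
import numpy as np

def track_marker(frame, target_rgb, tol=30):
    """Return the (row, col) centroid of pixels whose color is within
    `tol` (sum of absolute RGB differences) of target_rgb, or None if
    the marker is not visible in the frame."""
    diff = np.abs(frame.astype(int) - np.array(target_rgb)).sum(axis=2)
    mask = diff < tol
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)
    return (int(rows.mean()), int(cols.mean()))

# Synthetic 100x100 frame with a red marker patch at rows 40-44, cols 60-64.
frame = np.zeros((100, 100, 3), dtype=np.uint8)
frame[40:45, 60:65] = (255, 0, 0)
print(track_marker(frame, (255, 0, 0)))  # -> (42, 62)
```

Running this per frame for each marker color yields one fingertip position per fiducial; the sequence of positions over time is what gets interpreted as a gesture.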
The SixthSense prototype implements several applications that demonstrate the usefulness, viability and flexibility of both the system and augmented reality technology. The map application lets the user navigate a map displayed on a nearby surface using hand gestures, similar to those supported by multi-touch systems, zooming in, zooming out or panning with intuitive hand movements. The drawing application lets the user draw on any surface by tracking the movements of the index fingertip. SixthSense also recognizes freehand gestures: when it detects the “framing” gesture, it captures an image of what the user is looking at, and the user can later stop at any surface or wall and flick through the photos he or she has taken. Drawing a circle on the user’s wrist projects an analog watch. SixthSense also lets the user draw icons or symbols in the air with the index finger and recognizes those symbols as interaction instructions. For example, drawing a magnifying glass takes the user to the map, and drawing an @ symbol lets the user check email.
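The symbol-recognition idea above can be illustrated with a toy classifier. This is a hedged sketch, not SixthSense’s actual recognizer: it assumes the tracked fingertip path arrives as a list of (x, y) points and distinguishes a closed stroke (like the circle that summons the watch) from a straight stroke using two simple geometric cues. The function `classify_stroke` and its thresholds are invented for illustration.

```python
import math

def classify_stroke(points):
    """Classify a fingertip stroke as 'circle', 'line', or 'unknown'.

    A circle ends near where it started relative to the total distance
    traveled; a line's endpoints account for nearly the whole path length.
    """
    path_len = sum(math.dist(points[i], points[i + 1])
                   for i in range(len(points) - 1))
    closure = math.dist(points[0], points[-1])  # start-to-end gap
    if closure < 0.2 * path_len:
        return "circle"
    if closure > 0.9 * path_len:
        return "line"
    return "unknown"

# A sampled circle of radius 50 and a diagonal line segment.
circle = [(math.cos(t) * 50, math.sin(t) * 50)
          for t in (i * 2 * math.pi / 32 for i in range(33))]
line = [(i, i) for i in range(20)]
print(classify_stroke(circle), classify_stroke(line))  # -> circle line
```

A real system would use a richer matcher over a library of symbol templates, but the principle is the same: reduce the fingertip trajectory to shape features and map the recognized shape to a command.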
Augmented reality is truly a fascinating technology and I believe devices like SixthSense are on their way to becoming the computers of the future. Although SixthSense is currently just a prototype, over time, I believe it will become the new “must-have” device.