We've evolved over millions of years to sense the world around us. When we encounter something, someone or some place, we use our five natural senses to perceive information about it; that information helps us make decisions and choose the right actions to take. But arguably the most useful information for making the right decision is not naturally perceivable with our five senses: the data, information and knowledge that mankind has accumulated about everything, which is increasingly available online. Although the miniaturization of computing devices allows us to carry computers in our pockets, keeping us continually connected to the digital world, there is no link between our digital devices and our interactions with the physical world. Information remains confined to paper or, digitally, to a screen.
SixthSense, an interface developed by the Indian computer scientist and inventor Pranav Mistry (33, Vice President of Research at Samsung), bridges this gap, bringing intangible digital information out into the tangible world and allowing us to interact with this information via natural hand gestures. SixthSense frees information from its confines by seamlessly integrating it with reality, thus making the entire world your computer.
Above and below: SixthSense is a wearable gestural interface that augments the physical world around us with digital information and lets us use natural hand gestures to interact with that information.
The SixthSense prototype comprises a pocket projector, a mirror and a camera, coupled in a pendant-like wearable device. Both the projector and the camera are connected to the mobile computing device in the user's pocket. The projector displays visual information, enabling surfaces, walls and physical objects around us to be used as interfaces, while the camera recognizes and tracks the user's hand gestures and physical objects using computer-vision techniques.
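Descriptions of the prototype mention coloured marker caps worn on the fingertips, which the camera tracks by colour to recover hand movements. The snippet below is a minimal sketch of how such marker tracking could look in Python with OpenCV; the colour thresholds, camera index and overall structure are illustrative assumptions, not the actual SixthSense code.

```python
# Minimal sketch: track one coloured fingertip marker with OpenCV (4.x signatures).
# HSV thresholds and the camera index are assumptions for illustration.
import cv2
import numpy as np

LOWER_RED = np.array([0, 120, 80])     # assumed HSV range for a red marker cap
UPPER_RED = np.array([10, 255, 255])

def track_marker(frame):
    """Return the (x, y) centre of the largest red blob, or None."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER_RED, UPPER_RED)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    m = cv2.moments(max(contours, key=cv2.contourArea))
    if m["m00"] == 0:
        return None
    return int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])

cap = cv2.VideoCapture(0)              # stands in for the pendant camera
while True:
    ok, frame = cap.read()
    if not ok:
        break
    tip = track_marker(frame)
    if tip is not None:
        cv2.circle(frame, tip, 8, (0, 255, 0), 2)   # visualise the tracked tip
    cv2.imshow("fingertip", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```

A full system would track several markers of different colours at once and map their camera coordinates into the projector's coordinate space so that projected content lines up with the hands.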
Some practical uses of the SixthSense interface:
- The SixthSense prototype implements several applications that demonstrate the usefulness, viability and flexibility of the system.
- The map application lets the user navigate a map displayed on a nearby surface using hand gestures similar to those supported by multi-touch systems, zooming in, zooming out or panning with intuitive hand movements (see the sketch after this list).
- The drawing application lets the user draw on any surface by tracking the movements of the user's index fingertip. SixthSense also recognizes the user's freehand gestures (postures): for example, it implements a gestural camera that takes a photo of the scene the user is looking at when it detects the 'framing' gesture. The user can then stop at any surface or wall and flick through the photos he or she has taken.
- Through the telephony application, SixthSense projects a telephone keypad onto any surface (including the palm of the user's hand), allowing the user to dial numbers and make calls; in this case the user is provided with a headset and microphone. A sketch of dialling on such a projected keypad also follows this list.
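To make the map interaction concrete, here is a minimal sketch of how two tracked fingertip positions could be turned into zoom and pan values in the spirit of multi-touch gestures; the class name and scaling rule are illustrative assumptions rather than the actual SixthSense implementation.

```python
# Minimal sketch: derive zoom and pan for the map from two tracked fingertips.
import math

class MapGesture:
    def __init__(self):
        self.prev_distance = None   # distance between the two tips last frame
        self.prev_centre = None     # midpoint of the two tips last frame
        self.zoom = 1.0
        self.offset = [0.0, 0.0]

    def update(self, tip_a, tip_b):
        """tip_a, tip_b: (x, y) positions of the two tracked fingertips."""
        distance = math.dist(tip_a, tip_b)
        centre = ((tip_a[0] + tip_b[0]) / 2, (tip_a[1] + tip_b[1]) / 2)
        if self.prev_distance:
            self.zoom *= distance / self.prev_distance          # spreading = zoom in
            self.offset[0] += centre[0] - self.prev_centre[0]   # moving both tips = pan
            self.offset[1] += centre[1] - self.prev_centre[1]
        self.prev_distance, self.prev_centre = distance, centre
        return self.zoom, tuple(self.offset)

gesture = MapGesture()
print(gesture.update((100, 100), (200, 100)))  # first frame, no change yet
print(gesture.update((80, 100), (220, 100)))   # fingers spread apart -> zoom in
```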
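Dialling on a projected keypad can likewise be reduced to hit-testing the tracked fingertip against the projected key grid. The sketch below assumes a hypothetical 3x4 layout with fixed coordinates purely for illustration.

```python
# Minimal sketch: map a fingertip position to a key on a projected 3x4 keypad.
KEYS = ["1", "2", "3",
        "4", "5", "6",
        "7", "8", "9",
        "*", "0", "#"]
KEY_W, KEY_H = 60, 60          # assumed projected key size in camera pixels
ORIGIN_X, ORIGIN_Y = 200, 120  # assumed top-left corner of the projected keypad

def key_at(tip):
    """Return the key under a fingertip position, or None if outside the pad."""
    col = (tip[0] - ORIGIN_X) // KEY_W
    row = (tip[1] - ORIGIN_Y) // KEY_H
    if 0 <= col < 3 and 0 <= row < 4:
        return KEYS[row * 3 + col]
    return None

dialled = []
for tip in [(215, 135), (275, 135), (335, 320)]:  # fingertip taps from the tracker
    key = key_at(tip)
    if key is not None:
        dialled.append(key)
print("".join(dialled))   # -> "12#"
```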
Above: the inventor Pranav Mistry shows how SixthSense can turn a hand into a phone. Below: reading a newspaper can be supplemented with videos about the story you are reading at that moment!
Shopping with SixthSense:
- All the advanced ways of interacting with the shopper, such as QR codes or coupons downloadable via mobile phone, websites or touch screens, already seem outdated when we look at what SixthSense can do!
- All you have to do is pick an item from the shelf and 'frame' it. SixthSense projects additional information onto the item itself, such as the sustainability of its production process, opinions expressed by other users, recommendations for use and much more, all simply by taking the object in your hand as you normally would, without having to fiddle with other devices.
The current prototype costs approximately $350 to build. See the video below about the use of SixthSense in everyday life!