I’m wearing a camera (just a simple webcam), a portable, battery-powered projection system, and a little mirror. These components communicate with my cell phone in my pocket, which acts as the communication and computation device. And in the video here we see my student Pranav Mistry, who’s really the genius who’s been implementing and designing this whole system. And we see how this system lets him walk up to any surface and start using his hands to interact with the information that is projected in front of him. The system tracks the four significant fingers. In this case, he’s wearing simple marker caps that you may recognize. But if you want a more stylish version, you could also paint your nails in different colors.

And the camera basically tracks these four fingers and recognizes any gestures that he’s making, so he can just go to, for example, a map of Long Beach, zoom in and out, and so on. The system also recognizes iconic gestures, such as the “take a picture” gesture, and then takes a picture of whatever is in front of you. And when he then walks back to the Media Lab, he can just go up to any wall and project all the pictures he’s taken, sort through them, organize them, and resize them, again using all natural gestures.
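The idea of tracking four colored fingertip markers can be sketched in a few lines. This is not the actual SixthSense implementation (which is not shown here); it is a minimal illustration, assuming an RGB camera frame as a numpy array and hypothetical marker colors: for each color, find the pixels close to that color and take their centroid as the fingertip position. A real system would use a vision library such as OpenCV and calibrate the colors to the lighting.

```python
import numpy as np

# Hypothetical RGB values for the four marker caps; a real system
# would calibrate these to the actual caps and lighting conditions.
MARKER_COLORS = {
    "red":    (255, 0, 0),
    "green":  (0, 255, 0),
    "blue":   (0, 0, 255),
    "yellow": (255, 255, 0),
}

def track_markers(frame_rgb, tol=60):
    """Return {color: (row, col)} centroids of pixels within `tol`
    of each marker color, for the markers that are visible."""
    positions = {}
    for name, target in MARKER_COLORS.items():
        # Euclidean distance of every pixel from the target color.
        dist = np.linalg.norm(frame_rgb.astype(int) - np.array(target),
                              axis=-1)
        ys, xs = np.nonzero(dist < tol)
        if len(ys):
            positions[name] = (ys.mean(), xs.mean())
    return positions
```

A gesture such as pinch-to-zoom could then be derived from how the distance between two tracked fingertips changes over successive frames, and an iconic gesture like the “take a picture” frame could be matched against the relative positions of all four markers.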