Tuesday 20 May 2014

SixthSense vs Google Glass

In 2009, Pranav Mistry demonstrated technology he called SixthSense: a wearable, gesture-controlled computer whose visual output was a surface-mapped projector, letting you turn any surface into a computer and interact with the world as naturally as before, but with seamless access to networked information everywhere. It was incredible. It was also a little awkward, worn around the neck on a lanyard. Though the software was eventually released, the project has stalled, and people can't get it to build.

In 2012, Google demonstrated Google Glass, their wearable computer. It is limited to a tiny display in one eye, though, and has to be controlled through voice and swipe gestures. However, its front-facing camera could easily pick up SixthSense-style gestures. Swap the inward-facing eye display for an outward-facing projector and you would have a platform capable of running SixthSense, assuming the software issues could be overcome.
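For concreteness, the original SixthSense prototype tracked coloured marker caps on the user's fingertips through a webcam, and that part is easy to sketch today. Here is a minimal Python/OpenCV sketch of the idea, assuming a head-mounted camera feed and a red fingertip marker; the HSV thresholds are placeholders to tune for your own camera and lighting, not values from Mistry's code.

    # Sketch of SixthSense-style fingertip tracking: find the largest
    # marker-coloured blob in each frame and report its centre.
    import cv2
    import numpy as np

    # Rough HSV range for a red marker cap (placeholder values, tune per setup).
    LOWER = np.array([0, 120, 100])
    UPPER = np.array([10, 255, 255])

    def track_marker(frame):
        """Return the (x, y) centroid of the largest marker-coloured blob, or None."""
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, LOWER, UPPER)
        # Remove speckle noise before looking for blobs.
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        if not contours:
            return None
        blob = max(contours, key=cv2.contourArea)
        m = cv2.moments(blob)
        if m["m00"] == 0:
            return None
        return int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])

    if __name__ == "__main__":
        cap = cv2.VideoCapture(0)  # stand-in for a head-mounted camera
        try:
            while True:
                ok, frame = cap.read()
                if not ok:
                    break
                pos = track_marker(frame)
                if pos:
                    # A gesture recogniser would consume this stream of positions.
                    print("marker at", pos)
        except KeyboardInterrupt:
            pass
        finally:
            cap.release()

The hard parts that stalled the project sit on top of this: turning those position streams into reliable gestures and mapping the projected image onto arbitrary surfaces.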

So we're still a few steps away from that world, but I think we're going to get there. In a way, it makes me mad that we don't have this yet, because Mistry was so close. He had the devices and he had the software. Now the software is released, but the devices aren't available, the software doesn't build any more, and Mistry doesn't have time to work on it because he's busy with other projects.

Mokalus of Borg

PS - It doesn't have to be SixthSense. A similar project would do.
PPS - It's just that a lot of the work is already done for SixthSense itself. It would be a shame to waste it.
