The Friday Fillip 2

As a counterpart to Simon’s post below on applying modern technology to 16th century information, I thought I would offer Pranav Mistry’s SixthSense Technology, described as “a wearable gestural interface that augments the physical world around us with digital information and lets us use natural hand gestures to interact with that information.”

Doesn’t sound like much?

Watch the video of his explanation here, from the TED Conferences page. I mean watch it now (a colleague just made me aware of it). It is one of the most breathtaking things I have seen, and I was completely shocked that I was not already aware of it (nor does it appear to have been discussed on SLAW previously). It seems so unbelievable as to be a hoax (I assume it is not).

Mistry is a Research Assistant at MIT’s Fluid Interfaces Lab.

Rather than flipping through digitized pages of Shakespeare’s original folios on your computer (which is cool enough), imagine doing so walking down the street using a scrap sheet of paper as your screen. Or imagine my typing this blog post using the wall of my local coffee shop as my keyboard. Watch the video and you will see what I mean.

Comments

  1. Very exciting! Someone mentioned the SixthSense device at Internet Librarian last year. Very interesting to hear from the designer how he came up with this and other ideas. Inspiring.

  2. This is fabulous, Ted. Thanks for telling us about it. SixthSense seems a couple of steps beyond the wearable computing developed by Steve Mann at U of Toronto. This stuff really demonstrates the old Arthur C. Clarke adage that any sufficiently advanced technology is indistinguishable from magic.

  3. The current ultimate in portable touchscreens. Do we need a new term for touch screens one doesn’t need to touch?

    I like this:

    The software program processes the video stream data captured by the camera and tracks the locations of the colored markers (visual tracking fiducials) at the tip of the user’s fingers using simple computer-vision techniques. The movements and arrangements of these fiducials are interpreted into gestures that act as interaction instructions for the projected application interfaces. The maximum number of tracked fingers is only constrained by the number of unique fiducials, thus SixthSense also supports multi-touch and multi-user interaction.

    That means the software should, eventually, be able to provide, for example, a playable virtual piano keyboard and, in conjunction with other software, translate the finger and key movements into sounds. I assume that, in time, the software and hardware components will be able to track the speed and abruptness of one’s finger movements, translating that into the keystroke one would make if there were a real key under one’s finger.

  4. And he is going to open source the software!

  5. You should really read more Popular Science – this is 6 months old already ;)

  6. Well, it is new to most in this audience. And the TED video with Mistry that Ted mentions is new as well. ;)
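For readers curious about the fiducial-tracking description quoted in comment 3, here is a minimal sketch of the idea in plain Python. This is purely illustrative and not Mistry’s actual code (which, per comment 4, he has said he will open source): the marker colors, tolerance, and “zoom” gesture rule are all assumptions, and a real system would process camera video with a computer-vision library rather than a list-of-lists frame.

```python
# Illustrative sketch of SixthSense-style fiducial tracking.
# A "frame" is a 2-D grid (list of rows) of (R, G, B) tuples; each
# fingertip wears a cap of a known color (the visual tracking fiducial).

FIDUCIAL_COLORS = {            # assumed marker palette, one per finger
    "index": (255, 0, 0),      # red cap (assumption)
    "thumb": (0, 0, 255),      # blue cap (assumption)
}

def color_match(pixel, target, tol=60):
    """True if every channel of `pixel` is within `tol` of `target`."""
    return all(abs(p - t) <= tol for p, t in zip(pixel, target))

def track_fiducials(frame):
    """Return the centroid (row, col) of each marker color seen in the frame."""
    positions = {}
    for name, color in FIDUCIAL_COLORS.items():
        hits = [(r, c)
                for r, row in enumerate(frame)
                for c, pixel in enumerate(row)
                if color_match(pixel, color)]
        if hits:
            positions[name] = (sum(r for r, _ in hits) / len(hits),
                               sum(c for _, c in hits) / len(hits))
    return positions

def interpret_gesture(prev, curr):
    """Toy gesture rule: a widening thumb-index gap reads as 'zoom in'."""
    def dist(p):
        (r1, c1), (r2, c2) = p["index"], p["thumb"]
        return ((r1 - r2) ** 2 + (c1 - c2) ** 2) ** 0.5
    if all(k in p for p in (prev, curr) for k in ("index", "thumb")):
        return "zoom in" if dist(curr) > dist(prev) else "zoom out"
    return None
```

Feeding two successive frames through `track_fiducials` and then `interpret_gesture` turns raw marker positions into an interaction instruction, which is the pipeline the quoted passage describes; supporting more fingers or users is just a matter of adding more unique colors to the palette.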