

iPod Don't Touch
April 27, 2010 6:31 PM   Subscribe

A touch screen you don't touch. From Ishikawa Komuro Laboratory at the University of Tokyo, a gesture-controlled handheld device that responds thereminically to the motion of a finger held above the screen. Watch to the end for the remarkable 3-d painting app. From the people who brought you the pitching, batting, and dribbling robots. Previous Ishikawa awesomeness on Metafilter.
posted by escabeche (22 comments total) 8 users marked this as a favorite

 
I wish to say that I adore the term "thereminically", and pledge to use it in future conversation.
posted by Bora Horza Gobuchul at 6:52 PM on April 27, 2010 [7 favorites]


thereminically

Just wanted to bring attention to that word.
posted by Michael Pemulis at 6:53 PM on April 27, 2010


Damn you Bora Horza Gobuchul! You're faster on the draw.
posted by Michael Pemulis at 6:54 PM on April 27, 2010


There was actually an NES accessory that worked that way, the U-Force.

I was just thinking the other day how cool it would be to have a cellphone screen that worked that way, but I didn't think just having a theremin-style sensor would be precise enough. (My idea was to use radar :P)

And on top of that, people now have true glasses-free 3D displays coming to handheld devices like the Nintendo 3DS, or this tablet panel from Sharp (that anyone can add to their devices). The technology is a parallax barrier display (I just read about this stuff for the first time last night).

It's also possible that the Motorola Ming might use the same 3D panel.

Imagine how awesome it would be to have a holographic display with a 3d distance sensing touch/hover screen.
posted by delmoi at 6:58 PM on April 27, 2010
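[The parallax-barrier panels delmoi mentions come down to similar-triangle geometry: a slitted mask in front of the pixels routes alternate pixel columns to alternate eyes. A rough sketch, with illustrative numbers rather than any real panel's specifications:]

```python
# Two-view parallax barrier geometry, by similar triangles.
# e: interocular distance, d: viewing distance, p: sub-pixel pitch (all mm).
# Numbers here are illustrative defaults, not a real panel's specs.
def barrier_design(e=65.0, d=400.0, p=0.1):
    g = d * p / e                 # barrier-to-pixel gap so each eye sees alternate columns
    pitch = 2 * p * d / (d + g)   # slit pitch: slightly less than two sub-pixels
    return g, pitch

g, pitch = barrier_design()
print(f"gap = {g:.3f} mm, slit pitch = {pitch:.4f} mm")
```

The slit pitch comes out just under twice the sub-pixel pitch, which is why the barrier must be matched to one particular panel and viewing distance.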


I used the "how would David Foster Wallace describe this?" algorithm, of which I presume Michael Pemulis approves.
posted by escabeche at 6:58 PM on April 27, 2010 [3 favorites]


Neato!

The first thing that occurred to my nerdly brain was the episode of Babylon 5 where we're introduced to the Technomages. When one Technomage greets another in a dark corridor, he makes an arcane gesture in the air with his hand and glowing sigils appear, persisting for a moment. For some reason, this 3D input device made me think of that and how it will be possible to have complex hand and finger gestures mapped into segments of meaning and that these gestures can then be used to trigger technology, somehow.

The next thing I thought was that this might be useful for sign language, somehow, to allow a signed word or phrase to be input into a computer device.

Pretty cool!
posted by darkstar at 7:00 PM on April 27, 2010 [1 favorite]


During NAB, when everybody was talking about 3D, I predicted this would be the next frontier. I deserve some credit even though I didn't invent it.
posted by sswiller at 7:11 PM on April 27, 2010


It's a clever idea, but I wasn't all that impressed with it until they got to the 3D drawing program. That is cool -- way beyond anything you can do with any conventional input device.

I'm still not convinced that in-air typing will ever replace a keyboard, though. Having to type a) without tactile feedback and b) with just one finger makes it much too slow.
posted by vorfeed at 7:42 PM on April 27, 2010


oh my gosh darkstar! that sign language thing is seriously an awesome idea.

i went to elementary school with a large number of deaf kids so we had a sign language club where us hearing folk could learn the neat way the deaf kids talked. (i still remember how to sign most of the Sesame Street theme song.)

but your idea kind of ties together the arcane symbols i remember Raistlin made in the Dragonlance series with technology to sort of make the magic real.

i know some people in the deaf community have strong feelings about their culture that i'm not entirely well-versed in, but i think your idea would be an incredibly awesome way for deaf and hearing people to communicate, maybe even across cultures where the signs are different.

of course, i'm just completely distracted by thoughts of that B5 thing becoming a reality and that i could draw in the air and have it mean something, even if just a glowing bit for a while. how cool that would be :)
posted by sio42 at 7:43 PM on April 27, 2010 [2 favorites]


Clearly, this will advance cat-baffling technology by leaps and bounds.
posted by mightygodking at 11:37 PM on April 27, 2010 [2 favorites]


It is neat - but using a finger as an input device seems to be something we do best when we are pressing it up against something: writing on a steamed-up window or using an iPhone, for example. We are not used to writing in mid-air, and we would get tired arms if we tried for too long.

Our intuitive ways of using a single finger in free space are to say things like "one", "up", "me" and "fuck you" - not as a substitute for using a writing surface and not as joystick control. What would interest me more would be if they were able to track all of our fingers at once - that would open up the way for more intuitive two handed control gestures such as "rotate it round this way", "stretch it", "bend it", etc. Also for sign language of all kinds.
posted by rongorongo at 12:06 AM on April 28, 2010
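[The "stretch it" and "rotate it round this way" gestures rongorongo describes reduce to tracking two points between frames, the same way pinch-zoom is conventionally computed. A minimal sketch (function name and frame convention invented for illustration):]

```python
import math

# Recover stretch (scale) and rotation from two tracked finger points.
# p0, p1: the two finger positions in the previous frame;
# q0, q1: the same fingers in the current frame. Points are (x, y) tuples.
def two_finger_gesture(p0, p1, q0, q1):
    vx, vy = p1[0] - p0[0], p1[1] - p0[1]   # inter-finger vector, last frame
    wx, wy = q1[0] - q0[0], q1[1] - q0[1]   # inter-finger vector, this frame
    scale = math.hypot(wx, wy) / math.hypot(vx, vy)
    # signed angle between the two vectors, in radians
    angle = math.atan2(wy, wx) - math.atan2(vy, vx)
    return scale, angle
```

For example, fingers that start a unit apart horizontally and end two units apart vertically yield a scale of 2.0 and a quarter-turn rotation.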


Those fingers looked a bit tense. Not sure how gentle on the wrist either.

But yay! One step closer towards Minority Report technology.
posted by polymodus at 12:13 AM on April 28, 2010


How about, instead of a camera capturing a finger near the screen, it's on the opposite side, and captures a hand interacting with graphics?
posted by unmake at 1:31 AM on April 28, 2010


the sign-language idea is awesome darkstar. (so is knowing to sign the lyrics to sesame street sio42)
posted by dabitch at 1:36 AM on April 28, 2010


Cool.

(In five years, Steve Jobs will always have invented this)
posted by rodgerd at 2:16 AM on April 28, 2010 [1 favorite]


So it's Minority Report, only smaller. I guess that solves the "tired arms" problem at the expense of creating a "tired fingers" problem.
posted by DU at 5:02 AM on April 28, 2010


During NAB, when everybody was talking about 3D, I predicted this would be the next frontier. I deserve some credit even though I didn't invent it.

Credit will go to Apple when they invent it, or to SJ himself, as rodgerd similarly points out.
posted by juiceCake at 5:27 AM on April 28, 2010


I think this is pretty cool, but (without knowing anything about the technology) the coolest part to me is that this feels like a step towards amazing things. It doesn't seem that useful in itself -- as people have pointed out, it doesn't really seem like a good way to type, and it would definitely take some adjusting to get used to pressing with no tactile feedback -- but it begins to open up a lot of possibilities, like darkstar's totally amazing sign language idea. Just think about all the stuff to which this could lead! Computers where people paralyzed from the neck down can input information just by moving their head; more prosaically, presentations in schools or meetings where you can interact more fully by motioning in front of a board or screen, or even collaborate that way; all sorts of options for games and education -- there is so much to which this could lead, and it is overwhelming but amazing.

Also, I think it's pretty cool to see something that feels so much like an important step towards remarkable things; it's so easy to miss the moment where something begins and although this is just one part it feels so full of potential that it's sort of breathtaking.
posted by Mrs. Pterodactyl at 6:12 AM on April 28, 2010


I fear the powerful Screen Protector Lobby and Screen Cleaner Mafia will kill off this innovation.
posted by chavenet at 7:03 AM on April 28, 2010


Automated understanding of sign language by a computer is a difficult problem, one that I feel safe in saying is harder than speech recognition has been. Most approaches to date have used computer vision. While this new device would likely help, it is important to know that signing is not just hand movements. For example, as I understand it, the difference between a question and a statement in ASL is expressed primarily in facial gestures.

All that aside, I think this touchless touchscreen idea is a good one, not because it lets us use lots of complicated 3D gestures all the time, but because it opens up a few more very simple, intuitive ones. I've got one in particular in mind: picking and placing. Right now touch screens are a bit clumsy when you have a collection of elements that you want people to be able to select (tap) and move (drag). The iPhone springboard (app icons) is a good example---to drag, you have to hold your finger down on an icon for a while. Wouldn't it be faster and easier if you could just "pick up" the icon and move it? What about selecting and moving around blocks of text?
posted by tss at 8:09 AM on April 28, 2010
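[tss's pick-and-place idea maps naturally onto a tiny state machine driven by finger height above the screen. A hypothetical sketch -- thresholds, class name, and event names are all invented for illustration, not anything from the Ishikawa device:]

```python
# Hover pick-and-place: finger height z (mm above the screen) drives state.
# z <= TOUCH means pressed against the glass; TOUCH < z <= HOVER means
# the finger is "carrying" a picked-up icon; above HOVER the gesture cancels.
TOUCH, HOVER = 2.0, 30.0

class PickPlace:
    def __init__(self):
        self.state = "idle"      # idle -> carrying -> idle
        self.carried = None

    def update(self, z, icon_under_finger, xy):
        events = []
        if self.state == "idle" and z <= TOUCH and icon_under_finger:
            self.state, self.carried = "carrying", icon_under_finger
            events.append(("pick", self.carried))
        elif self.state == "carrying":
            if z <= TOUCH:                      # pressed back down: drop it here
                events.append(("place", self.carried, xy))
                self.state, self.carried = "idle", None
            elif z <= HOVER:                    # hovering: keep dragging
                events.append(("drag", self.carried, xy))
            else:                               # lifted out of range: cancel
                events.append(("cancel", self.carried))
                self.state, self.carried = "idle", None
        return events
```

The appeal over today's long-press drag is that "pick up" and "put down" become distinct physical motions: touch to grab, hover to carry, touch again to place.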


ASL is a pre-existing and well-developed gesture-based method of expressing a language, and is an ideal candidate for motion-sensor input, at least for English-language users and probably for other languages that use more-or-less the same vowel and consonant set. Does anyone know how well ASL works for, say, Chinese or Arabic? Tonal languages such as Thai?
posted by aeschenkarnos at 2:47 PM on April 28, 2010


ASL is a pre-existing and well-developed gesture-based method of expressing a language, and is an ideal candidate for motion-sensor input, at least for English-language users and probably for other languages that use more-or-less the same vowel and consonant set. Does anyone know how well ASL works for, say, Chinese or Arabic? Tonal languages such as Thai?

The "A" in ASL stands for "American", and the signs have nothing to do with the pronunciation of the words. It doesn't matter what the spoken language sounds like.
posted by delmoi at 4:53 PM on April 28, 2010



