Real-world touchscreen interface for interactive documents and books
April 17, 2013 5:58 PM   Subscribe

Touchscreen interface for seamless data transfer between the real and virtual worlds - Fujitsu Laboratories has developed a next-generation user interface which can accurately detect the user's finger and what it is touching, creating an interactive, touchscreen-like system using objects in the real world.
posted by KokuRyu (18 comments total) 17 users marked this as a favorite
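
The announcement doesn't detail how the finger tracking actually works, but the core ingredient is camera-based fingertip detection over an ordinary tabletop. Below is a minimal sketch of one common approach (skin-colour segmentation plus contour analysis with OpenCV); the HSV thresholds, the camera index, and the "topmost point of the largest blob is the fingertip" heuristic are illustrative assumptions, not Fujitsu's actual method.

# Minimal fingertip-detection sketch (not Fujitsu's method): segment
# skin-coloured pixels in HSV space, take the largest contour, and treat
# its topmost point as the fingertip. Thresholds and camera index 0 are
# assumptions for illustration only.
import cv2
import numpy as np

LOWER_SKIN = np.array([0, 40, 60], dtype=np.uint8)     # assumed HSV lower bound
UPPER_SKIN = np.array([25, 255, 255], dtype=np.uint8)  # assumed HSV upper bound

def find_fingertip(frame):
    """Return (x, y) of the likely fingertip in a BGR frame, or None."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER_SKIN, UPPER_SKIN)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    hand = max(contours, key=cv2.contourArea)             # assume largest blob is the hand
    x, y = min(hand.reshape(-1, 2), key=lambda p: p[1])   # topmost point as fingertip
    return int(x), int(y)

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    tip = find_fingertip(frame)
    if tip:
        cv2.circle(frame, tip, 8, (0, 255, 0), 2)         # mark the detected fingertip
    cv2.imshow("fingertip", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()

A real system would also need to register the camera view against the projected image and against the document on the table, which is where most of the engineering effort presumably goes.
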
 


in the words of a great thinker of our age: SHUT UP AND TAKE MY MONEY.
posted by cendawanita at 6:04 PM on April 17, 2013 [3 favorites]


That is some version 1.0 Tony Stark shit right there.
posted by middleclasstool at 6:14 PM on April 17, 2013


Wow. Want.
posted by spitbull at 6:20 PM on April 17, 2013


Between this and, unfortunately, this week's bombing, it feels like the world of Caprica may not be that far away after all.
posted by limeonaire at 6:25 PM on April 17, 2013


Between this and Leap Motion, things are going to get really cool.
posted by xedrik at 6:27 PM on April 17, 2013


Google Glass no wait this thing and CAN WE JUST SLOW DOWN A MINUTE HERE PEOPLE???*

Can We Slow Down A Minute Here(tm) is not meant to contraindicate this user's near-feral need for this kind of future-saturated tech. Just bring it and shovel it down the funnel into my gaping mouth.
posted by Lipstick Thespian at 6:35 PM on April 17, 2013 [1 favorite]


"Penis. It's a penis. You're touching your penis."
posted by evidenceofabsence at 6:41 PM on April 17, 2013 [1 favorite]


The HCI researcher in me feels compelled to point out that ideas like this, a hybrid paper-digital interactive system, were actually pioneered in 1991 by Pierre Wellner at Rank Xerox EuroPARC. Wellner's famous Digital Desk paper and video (you can view the Digital Desk video on YouTube) were highly inspirational to a generation of computer science researchers. It's worth noting, though, that a lot of Wellner's ideas were mockups rather than actual implementations, since digital technologies weren't really ready at the time.

On the plus side, the Fujitsu Lab system looks fairly robust, and they've clearly put a lot of thought into having smooth user interactions. It seems like some really impressive engineering.
posted by jasonhong at 6:45 PM on April 17, 2013 [7 favorites]


what jasonhong said. One recent example is Natan Linder's LuminAR project, from 2010.
posted by honest knave at 7:19 PM on April 17, 2013


Lately I have been wondering why iOS doesn't have baked-in OCR, via the Camera Roll on the phone/iPod and via a live camera window in a corner on the iPad. I would love to point the camera at a sign or page of text and have it sent to Notes or Google. Or point it at a map or painting or product and open a search on it on the same screen.

I know that there are kludgey third-party apps for my phone that do this, but they are a multistep hassle and I can't get any use out of them in a timely manner (take photo, select photo, open in OCR app, wait for email, edit email, C&P email text to relevant app. Ugh). I would love to see this implemented system-wide, perhaps even as part of the keyboard (like the mic key).
posted by sourwookie at 7:34 PM on April 17, 2013 [1 favorite]
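
For what it's worth, the one-step flow sourwookie describes is easy to approximate on a desktop, even if it isn't an iOS system feature. A minimal sketch, assuming the pytesseract, Pillow, and pyperclip packages are installed; the image file name is a placeholder.

# Rough sketch of a one-step "photo to pasteable text" flow: run Tesseract
# OCR on a saved snapshot and put the result on the clipboard, ready to
# paste into Notes, a search box, etc. "sign_photo.jpg" is a placeholder.
from PIL import Image
import pytesseract
import pyperclip

def photo_to_clipboard(path):
    """OCR an image file and copy the recognized text to the clipboard."""
    text = pytesseract.image_to_string(Image.open(path))
    pyperclip.copy(text)
    return text

if __name__ == "__main__":
    print(photo_to_clipboard("sign_photo.jpg"))
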


Microsoft Research has been doing similar projects with projectors and computer vision/augmented reality for quite some time; the PlayAnywhere projection system has a lot of the same elements, especially when considered alongside things like DocuDesk (where the sidelinked annotations look very similar).

More recently they've been working on things like Beamatron and others; if you're interested in the area in general, Andy Wilson has been involved in a lot of them and his papers are a good starting point.
posted by AaronRaphael at 7:54 PM on April 17, 2013


Wouldn't this be even easier with a smartphone camera? With a touchscreen it would know exactly what you were pointing at: point it at your laptop and it Bluetooths over your desktop, and so on. With this, wouldn't you need the whole camera/table/projector rig?
posted by sexyrobot at 8:24 PM on April 17, 2013


On the plus side, the Fujitsu Lab system looks fairly robust...

As long as you are willing to carry an overhead projector everywhere you go.
posted by DU at 5:16 AM on April 18, 2013


To follow up on what jasonhong said: together with two others, I wrote a master's thesis evaluating (using activity theory) the in-situ use of the prototype implementation (née "Ariel") that came out of Xerox's EuroPARC Cambridge lab back in '95-'96, as part of their involvement with the EuroCODE project.
The system featured a large digitizer and a projector, and ran on a Sun workstation.

It was developed for the use of inspection engineers working on the Danish Great Belt Bridge, which was under construction at the time. The concept was that the engineers could bring paper drawings out into the field, annotate them, take pictures, etc., and then later place the drawing on the digitizer, scan the barcode, and continue annotating the document, integrating their rich media and conversing with other off-site engineers on similar workstations. It was essentially a great-grandparent of what we see in this demo.

While technically impressive for its time (just as this demo is impressive today), it ultimately failed to achieve what it had set out to do, as it turned out the engineers did not use paper drawings in the way the researchers anticipated. Paper was not a primary artefact for them, and anchoring interaction upon it was not what they needed. They liked the ability to annotate drawings just fine, but would have preferred it integrated into their CAD system.

I feel that this is really impressive, but that it is ultimately futile - paper is a poor substitute for an active surface like a (touch) screen, especially when you consider the bulk involved in the system. (Though it is a hell of a lot smaller than what we worked with back then).
posted by bouvin at 6:11 AM on April 18, 2013 [1 favorite]


...paper is a poor substitute for an active surface like a (touch) screen...

I agree, but reversed. I can interact with paper in a million ways that I can't with a touch screen [1], plus it's cheaper, renewable, more portable and extremely accessible to everyone.

[1] To list just a few off the top of my head: folding, poking holes, marking with any instrument I happen to have handy, wetting, tossing casually, attaching to a surface and many others.
posted by DU at 6:21 AM on April 18, 2013


DU: Oh, absolutely, I'm not disputing that. Just that using it as a primary instrument to interact with a computer may not be the optimal solution, when you could just use the computer.
posted by bouvin at 6:39 AM on April 18, 2013 [1 favorite]


Could this be used to add extra functionality to an existing analog interface, by mapping digital triggers onto the same controls?
posted by ZeusHumms at 10:23 AM on April 18, 2013
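
One plausible way to do what ZeusHumms suggests is a simple hit-test layer on top of the finger tracking: take the touch coordinates the camera reports over a physical panel, look up which control region they fall in, and fire a digital action alongside the analog one. A hypothetical sketch follows; the region names, coordinates, and callbacks are invented for illustration and are not part of the Fujitsu system.

# Hypothetical mapping from physical control regions to digital triggers.
# handle_touch() would be fed coordinates from the camera-based tracker.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Region:
    name: str
    x0: int
    y0: int
    x1: int
    y1: int
    action: Callable[[], None]

    def contains(self, x, y):
        return self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1

REGIONS = [
    Region("volume_knob", 100, 50, 160, 110, lambda: print("open mixer overlay")),
    Region("power_switch", 200, 50, 240, 90, lambda: print("log power toggle")),
]

def handle_touch(x, y):
    """Fire the digital trigger for whichever physical control was touched."""
    for region in REGIONS:
        if region.contains(x, y):
            region.action()
            return region.name
    return None

handle_touch(120, 70)   # e.g. a fingertip detected over the volume knob
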



