The future of gaming
February 5, 2005 11:26 PM

The future of gaming (video stream). Check out the demonstration by Total Immersion (French). This is the future, baby!
posted by knutmo (26 comments total)
 
First, let's get the snark out of the way...

It's an embedded Windows Media file and is a bit hard to get going on some Firefox installations. If you can get it up, it's really worth the effort.

It's obviously a work in progress; they're having issues with depth-sorting real-world objects against the generated ones.

That being said, this is really, really cool. I see so many engineering, public policy, gaming, and design possibilities here, my mind boggles.

If they can get a wearable head-mounted display that would allow one to see the virtual items inline with one's point of view, it gets even crazier.

Doctors could examine a patient and have 3D MRI and X-rays composited right in with the patient's chest.

Aircraft mechanics could overlay wiring diagrams.

I could just go on and on. This technology is evolutionary, not revolutionary. But I think that computing power, display sophistication, and real-time rendering have reached the tipping point for stuff like this. I expect this to be ubiquitous relatively quickly.
posted by PissOnYourParade at 11:45 PM on February 5, 2005


If we could get the form factor on the wearable displays down to something around Clark Kent glasses, plus, say, a Palm-sized belt pack connected with something like Bluetooth, the applications are endless...

And that's even forgetting the gaming applications completely, because, as per POYP's remarks, I do see the medical scanner (almost) out of the Star Trek movies.

Or imagine the applications combined with GPS. Never get lost again, as you have a compass and directional indicator in your field of view all the time.
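For the curious, the math behind that directional indicator is already trivial on current hardware: take the GPS fix, the destination, and a compass heading, and compute the angle for the arrow. A rough Python sketch, with the coordinates and heading made up purely for illustration:

```python
import math

def initial_bearing(lat1, lon1, lat2, lon2):
    """Bearing from point 1 to point 2, in degrees clockwise from true north."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return (math.degrees(math.atan2(x, y)) + 360.0) % 360.0

# Hypothetical GPS fix, destination, and compass heading
here = (47.6062, -122.3321)
destination = (47.6205, -122.3493)
heading = 250.0  # degrees, from the wearable's compass

# Angle the heads-up arrow should point, relative to where the user is facing
arrow = (initial_bearing(here[0], here[1], destination[0], destination[1]) - heading) % 360.0
print("turn arrow to %.0f degrees" % arrow)
```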

And I do agree that with the advanced computing power we have today, combined with technologies like MEMS, this could be a chance to forge your own visual reality...
posted by Samizdata at 12:10 AM on February 6, 2005


Holy shit.
posted by Yelling At Nothing at 12:15 AM on February 6, 2005


You know what I see in all this? Advertising and marketing.

"See Pokemon stand right up beside you!"

But perhaps it will have some real use as well.
posted by TwelveTwo at 12:18 AM on February 6, 2005


More proof of the coming Singularity.
posted by iamck at 12:18 AM on February 6, 2005


I thought of Vernor Vinge immediately too, and I haven't even read his stuff. Anyway, it would be nice to have a wearable display that could block advertising.
posted by bobo123 at 12:29 AM on February 6, 2005


This reminds me heavily of a series of short stories by Bruce Sterling from the mid-to-late nineties ("Deep Eddy", "Bicycle Repairman", and "Taklamakan", to be specific) where technology has evolved to the point where everybody wears "spex" -- basically eye-wearable computing devices that can also perform augmented-reality functions (e.g., they can easily be hooked up to an IR scanner, or a remote camera, or even do things like check someone's face against a database and remind you of their name).

As a CG nerd, I can definitely see the limitations of this particular implementation (especially stuff like: the objects the man was holding in his hand were being obscured by his other hand, and didn't cast shadows). The technology itself is nothing new, as PissOnYourParade mentioned, but rendering tech has gotten so good and cheap lately that this sort of thing really begins to become feasible on a budget.

Actually, about the shadow thing: the CG objects on the table cast shadows, but the objects-in-hand didn't. Assuming that the table was predefined in the system as a "solid" object, and the objects-on-table were simply casting on that, what's to stop the object-in-hand from doing the same thing? Was the lighting just set up differently in the CG world than in the real one? </nerd>
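For what it's worth, one cheap way to get those table shadows - and, in principle, shadows for the objects-in-hand too - is the old planar projected-shadow trick: if the light position and the table plane are known (and the table clearly is predefined here), a single matrix squashes any virtual geometry down onto the plane. A minimal numpy sketch of that matrix; the plane, light, and vertex values are invented, and this is only a guess at the sort of approach a system like this might use:

```python
import numpy as np

def planar_shadow_matrix(plane, light):
    """Matrix projecting points onto the plane ax + by + cz + d = 0, away from a point light.

    plane: (a, b, c, d); light: homogeneous position (x, y, z, 1).
    """
    plane = np.asarray(plane, dtype=float)
    light = np.asarray(light, dtype=float)
    return np.dot(plane, light) * np.eye(4) - np.outer(light, plane)

# Hypothetical table plane (y = 0) and an overhead point light
M = planar_shadow_matrix((0.0, 1.0, 0.0, 0.0), (2.0, 10.0, 1.0, 1.0))

vertex = np.array([0.5, 1.2, 0.3, 1.0])  # a vertex of the object held in the hand
shadow = M @ vertex
shadow /= shadow[3]                      # back to Cartesian; the result lies on y = 0
print(shadow[:3])
```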
posted by neckro23 at 1:08 AM on February 6, 2005


Direct link.
posted by Civil_Disobedient at 1:45 AM on February 6, 2005


Sorry. Direct link.
posted by Civil_Disobedient at 1:52 AM on February 6, 2005


Vinge. And later Gibson's Virtual Light glasses as well, however stolen or borrowed the idea is.

They already make off-the-shelf eyeglass-integrated head-up displays that are little more than a thin, efficient prism in the center or off to the side of your glasses that reflects the primary display source into your field of view. It looks like a chunky bifocal or something.

I was so intent on owning such a pair of magic glasses that I rather earnestly - if haphazardly and furtively - attempted to just gather the right people together and interest them enough in it to make it all happen.

It shouldn't be surprising that I still don't own a pair of magic happy information glasses.

As part of this mania, 5-6 years ago I asked a firmware and chip design guru/wizard I knew whether modern DSP techniques had evolved to the point where they could be programmed to recognize - out of a given, finite set - one species of tree from another in a forest, and he just winked and said, "Sure, I just need to kick my aircraft silhouette recognition algorithms up a notch or two. No sweat." (He did a fair amount of custom and prototype military, radio, and space telecom hardware work.) He went on to detail how he'd tackle such a tree-detection algorithm: how to count bifurcations and branching from various angles and fields of view, how to make shortcuts in the matching process like comparing silhouettes to libraries of silhouette profiles, how to deal with variable lighting.

He also agreed that if you could do trees, and accomplish an augmented-reality task like attaching mouse-over or focus tags to targets that would display meta-information like the species, or even the height/width of the tree from interpolation of stereo images, or outlining/defining the object, it wouldn't be a far stretch to be able to sense and identify animals. People. Plants. Locations. After such fuzzy targets, man-made objects and indoor or urban locations would be much easier for the most part.
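At its crudest, the silhouette-library matching he described is just "normalize the blob, compare against templates, keep the best score." A toy numpy sketch of that matching step - the library masks and species labels are made up, and a real system would obviously use far better descriptors:

```python
import numpy as np

def resample(mask, size=32):
    """Nearest-neighbour resample of a binary silhouette onto a fixed grid."""
    rows = np.linspace(0, mask.shape[0] - 1, size).astype(int)
    cols = np.linspace(0, mask.shape[1] - 1, size).astype(int)
    return mask[np.ix_(rows, cols)]

def best_match(silhouette, library):
    """Return the label of the library template that best overlaps the silhouette (IoU)."""
    s = resample(silhouette)
    scores = {}
    for label, template in library.items():
        t = resample(template)
        union = np.logical_or(s, t).sum()
        scores[label] = np.logical_and(s, t).sum() / union if union else 0.0
    return max(scores, key=scores.get)

# Hypothetical library: one pre-segmented silhouette per species and viewing angle
oak = np.zeros((240, 320), dtype=bool); oak[40:220, 120:200] = True   # made-up template
fir = np.zeros((240, 320), dtype=bool); fir[100:220, 60:260] = True   # made-up template
library = {"oak": oak, "fir": fir}

target = np.zeros((240, 320), dtype=bool); target[50:210, 110:210] = True  # blob from the camera
print(best_match(target, library))  # -> "oak"
```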

(Keep in mind this was the sort of guy that didn't write software and throw a bunch of off-the-shelf CPU cycles at it. Almost everything he did was specific-task(s) hardware DSP circuits and other signal-stream type circuits. It's a whole different world compared to the PC world, with very narrow and constrained tasks usually.)

The technology is really, really close to becoming deployable. There have been wearable computers and displays for a decade or more, and the applications are often quite similar to augmented reality or Virtual Light, even if it's only software-based service manuals or wireless data terminal applications with no real-world interaction.

As always, the more instructions per second, storage space, and bandwidth the better, but it seems like this is certainly something that needs a lot more software and research.

The first applications are going to be very narrow and specific, and probably in controlled environments. But as more and more modules and/or topics are covered, it's simply a matter of space or bandwidth: load as many modules as possible, put a user interface on it, give it some modality to switch between various tasks and modes, and we're well on our way to at least partially omniscient data glasses.

The thing that seems weird to me about all of this is that it should or could have already happened; it seems the right sci-fi stories just haven't been read by the right engineers, product designers, and marketers. It could easily surpass and/or replace mobile phones as the next big thing.
posted by loquacious at 2:07 AM on February 6, 2005


Here's a detailed Scientific American article on the topic of augmented reality. This idea has been around for a long time, but I feel it's only an intermediate step in the direction of complete digital subsumption of the human experience. What do I mean by that? Well, it's hard not to sound like some transhumanist dreamer, but what I mean is the rehosting of our minds on a digital substrate, at which time all sensory experience, both internal and external, will be amenable to real-time manipulation.
posted by Meridian at 2:34 AM on February 6, 2005


I'm sure I've mentioned similar technology before, from Human Interface Technology New Zealand, that uses handheld "glasses" to see and interact with. I met one of their developers a few years ago; apparently the machines they were using back then were of a similar spec to the Xbox (they had even approached Microsoft about the possibility of creating an add-on for the Xbox, but were turned down). From the video it doesn't seem to be that much further along than the demo I saw two years ago, which is a shame since it's so amazing to play with.
posted by X-00 at 2:51 AM on February 6, 2005


I don't understand... surely this is fake? What the hell is it being projected onto? You can't just project onto air...
posted by Orange Goblin at 4:03 AM on February 6, 2005


Orange Goblin: do you see that screen in the background behind the presenters? The video being projected on that is what we're watching.
posted by nmiell at 4:28 AM on February 6, 2005


Orange Goblin: the video is being output by the computer. These aren't holographic displays!

Anyway, weren't Virtual Reality helmets supposed to be the future of gaming in the early 90s?
posted by crayfish at 4:35 AM on February 6, 2005


crayfish:

You mean those huge, funky 8000 pound things?
posted by Samizdata at 6:18 AM on February 6, 2005


For a version of augmented reality you're more likely to use in the next ten years, look here. I have a lot more fun imagining possibilities for this one, honestly. What if you could hold a window in your hand that showed you arrows pointing to various parts of machines in lieu of instructions? What if you could hold an updating minimap, or a device that projected names over the heads of people you'd already met at a conference? What if you could play a strategy game on school grounds using the actual terrain, with little army men only you and your opponent could see using your PDAs?

A friend of mine said she couldn't think of any possible application for handheld AR, and I wanted to punch her.
posted by dougunderscorenelso at 8:11 AM on February 6, 2005


dougunderscorenelso: I think you have some anger issues to work out :P
posted by delmoi at 8:51 AM on February 6, 2005


I'LL PUNCH YOU TOO
posted by dougunderscorenelso at 10:00 AM on February 6, 2005


My friend at Georgia Tech works in AR, augmented reality.

He's helped build DART (Designers Augmented Reality Toolkit). It works inside of Director 8 / Director MX. It's made for designers, not developers, so it tries to handle the messy stuff for you, including figuring out how to rotate an object as the camera moves around it, and the like.

You can download it here.
posted by zpousman at 11:05 AM on February 6, 2005


We can make virtual helicopters fly over the audience, but we can't improve Windows Media?

No really, I look forward to staging WWII battles on my living room floor. Also, you could bring one of those mini-RC games to life in your own house. I'm digging it. Thanks.
posted by fungible at 1:22 PM on February 6, 2005


Has the technology been licensed to any of the Sony/Nintendo/Microsofts yet? This is EyeToy to the power of EyeToy!
posted by channey at 1:27 PM on February 6, 2005


That is cute and all, with the tanks going over the table's real-world topography, but you'd think they could have come up with some better demonstrations.

E.g., the car: why couldn't they have brought in some sort of model car and overlaid diagrams of the inside of the car on that? Picked up the car, moved it around, with the diagrams always remaining synchronized -- that would have really impressed me.

But instead they showed some woman in a helicopter.

The flower was the best thing. They should have just stuck with that and the tanks.
posted by blacklite at 7:18 PM on February 6, 2005


See how the camera never moves? And how nothing that their 3D "projections" interact with ever moves? (Well, other than his hand, which is holding a sensor of some sort connected back to the system by wire.) Their system is not recognizing anything, I believe. It's been told where everything is relative to the camera beforehand. As far as I can tell, they are not doing any machine vision / perception.

So blacklite, they couldn't move the model car around without putting a position sensor (wired like the one in his hand) in it.

What they have (again, AFAICT) is a system that can draw 3D objects onto a video stream based on the orientation of the camera in real-time. It has 3D models of the table and buildings already, and it knows where they are relative to the camera.
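In other words, if the camera pose and the scene models are all known in advance, the per-frame work is basically just a standard pinhole projection followed by a 2D composite over the video. A rough Python sketch of that idea; the intrinsics, pose, and vertices are invented, and this is only a guess at how such a system might be structured, not Total Immersion's actual pipeline:

```python
import numpy as np

# Hypothetical camera intrinsics: focal lengths and principal point, in pixels
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])

# Known, fixed camera pose (world -> camera), calibrated once since the camera never moves
R = np.eye(3)
t = np.array([0.0, 0.0, 2.0])

def project(points_world):
    """Project Nx3 world-space points to pixel coordinates with a pinhole model."""
    cam = points_world @ R.T + t      # world space -> camera space
    pix = cam @ K.T                   # camera space -> homogeneous image space
    return pix[:, :2] / pix[:, 2:3]   # perspective divide

# Vertices of a pre-modelled virtual object sitting on the (also pre-modelled) table
tank = np.array([[0.10, 0.00, 0.00],
                 [0.10, 0.05, 0.00]])
print(project(tank))  # pixel positions at which to draw into this video frame
```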

I'm not saying it isn't cool, but just that it's a far cry from magical glasses. As impressive as it is, I think that what they've done is the easy part. They skipped the hard part by telling the computer what it was looking at. Next up is to have the computer figure it out on its own.
posted by whatnotever at 11:02 PM on February 6, 2005


Blah, I saw this kind of technology when I worked at NCSA back in 2000, except theirs had augmented reality glasses with a camera for each eye.

It would take a glyph, say the letter "H", which you could hold as a notecard in your hand, and then superimpose a 3D object over it, accounting for the size and rotation of the glyph relative to the camera's eye.
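That glyph approach is classic marker-based AR (the ARToolKit style of thing). Once the card's four corners are found in the image and its physical size is known, one common way to recover its pose is a plain PnP solve and then re-projecting the model. A hedged OpenCV sketch - the corner pixels, camera matrix, and marker size are all invented for the example, and this isn't necessarily what the NCSA system did internally:

```python
import numpy as np
import cv2

# The card's corners in its own coordinate frame (metres), lying in the z = 0 plane
card = np.array([[-0.05, -0.05, 0.0],
                 [ 0.05, -0.05, 0.0],
                 [ 0.05,  0.05, 0.0],
                 [-0.05,  0.05, 0.0]], dtype=np.float32)

# Where a detector found those corners in the video frame (hypothetical pixel coordinates)
corners = np.array([[210.0, 180.0],
                    [330.0, 175.0],
                    [340.0, 290.0],
                    [205.0, 295.0]], dtype=np.float32)

K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
dist = np.zeros(5)  # assume negligible lens distortion for the sketch

# Recover the card's rotation and translation relative to the camera
ok, rvec, tvec = cv2.solvePnP(card, corners, K, dist)

# Re-project some 3D model geometry (here, the card corners offset along the card's normal)
model = card + np.float32([0.0, 0.0, -0.05])
pts2d, _ = cv2.projectPoints(model, rvec, tvec, K, dist)
# pts2d gives the pixels at which to draw the virtual object over the live frame
```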

And whatnotever, what you have here is like what you see when you are moving about in reality. The way objects are oriented to your eyes is based on the orientation of your eyes (or camera).

If you are close up to an apple with your head turned upside-down, the apple is upside-down, as well as the table you're sitting on.
posted by monaco at 12:21 AM on February 7, 2005


I didn't think they were doing any vision/perception stuff, ... but I guess I didn't really think too hard about it all.

Considering how easy it is to automate things (camera orientation, object movement, light placement, etc.) in 3D rendering programs these days, I'm surprised this is all they have. I guess that's why I naturally assumed they could move a model car around without dealing with wires.

I wonder if you can get grants for doing half-assed stuff like this. I should look into it.
posted by blacklite at 12:54 AM on February 7, 2005




This thread has been archived and is closed to new comments