"MMOs will be seen as being as cutting edge as 2D platform games are in 2009."
May 18, 2009 1:24 PM   Subscribe

SF author and mefite Charlie Stross speaks about video games in 20 years.

Virtual reality pioneer Jaron Lanier has some similar ideas.
posted by nushustu (80 comments total) 11 users marked this as a favorite
 


I'm just glad people have stopped blathering on about Second Life.
posted by Artw at 1:33 PM on May 18, 2009 [13 favorites]


I rooted for the gray goo, FYI.
posted by Mister_A at 1:33 PM on May 18, 2009 [4 favorites]


I'm just glad people have stopped blathering on about Second Life.

The thing is, now they blather about Twitter instead. In a few years that will be done too, but at least while we had to hear about Second Life all the time there were also dongcopters and other amusing trolling.
posted by TheOnlyCoolTim at 1:36 PM on May 18, 2009 [1 favorite]


TheOnlyCoolTim: "there were also dongcopters"

Previously
posted by Joe Beese at 1:39 PM on May 18, 2009


Nah, people blathering on about Twitter is more like people blathering on about MySpace or Facebook, in that they are hyping up a phenomenon that is unexpectedly there. Second Life hype was different in that it was all about journos hyping up something utterly expected (teh metaverse! William Gibson was right! etc) but not actually there at all (because it was crap and hardly anyone used it).
posted by Artw at 1:39 PM on May 18, 2009 [4 favorites]


Not to say that it isn't necessarily hype and blather, just a different kind of hype and blather.
posted by Artw at 1:40 PM on May 18, 2009


Man I could have given this speech in six words or less: "You better get to work building me a motherfucking holodeck motherfuckers!"
posted by ND¢ at 1:41 PM on May 18, 2009 [8 favorites]


Maybe it's just me, but I honestly didn't get a good sense from this article as to what is being predicted.
posted by Blazecock Pileon at 1:44 PM on May 18, 2009 [2 favorites]


"You better get to work building me a motherfucking holodeck motherfuckers!"

Who told you my mantra?
posted by Faint of Butt at 1:48 PM on May 18, 2009


Maybe it's just me, but I honestly didn't get a good sense from this article as to what is being predicted.

iPhones! With VR!
posted by Artw at 1:52 PM on May 18, 2009 [2 favorites]


Yeah, apparently we will all wear little internet glasses hooked to our iPhones and the internet will be projected everywhere and stuff like that.
posted by ND¢ at 1:55 PM on May 18, 2009


Better be a fucking good speech after we've waited twenty years for it.
posted by yoink at 1:55 PM on May 18, 2009 [1 favorite]


For comparison: video games 20 years ago

I was playing Nethack tonight. That's gotta be 20 years old by now?
posted by PeterMcDermott at 2:00 PM on May 18, 2009


/twiddles with Rogue on iPhone.
posted by Artw at 2:01 PM on May 18, 2009


Wow, cstross really knows his stuff. This is much more impressively technical opining than I'd expect from a science fiction author. It's especially refreshing to see someone who's written about the "Singularity" describe processor technology growth as being bounded by a sigmoid curve, not an exponential shooting to infinity.
posted by heathkit at 2:02 PM on May 18, 2009 [3 favorites]


Also, I know we're all supposed to be appearance-tolerant and all, but Jaron Lanier's crusty-trustafarian look seems really silly on a man of, what, 48?

That man makes Richard Stallman look downright respectable.
posted by PeterMcDermott at 2:05 PM on May 18, 2009


iPhones! With VR!

My opinion is that this is not a very imaginative projection for 20 years, given where cell phones were 20 years ago.
posted by Blazecock Pileon at 2:06 PM on May 18, 2009


You should see Bruce Sterling in his old lady hat.
posted by Artw at 2:07 PM on May 18, 2009 [1 favorite]


I think there will probably be some, as he calls them, "wildcard technologies" that arise to help on the CPU side of things. If he'd given this speech in 1979, he wouldn't have predicted RISC, for example.
posted by blenderfish at 2:08 PM on May 18, 2009


Wait, someone still uses RISC? Other than X-Boxes* and weird embedded stuff?

* Man, I bet MS didn't predict that one.
posted by Artw at 2:15 PM on May 18, 2009


Wait, someone still uses RISC? Other than X-Boxes* and weird embedded stuff?

You mean besides x86, x64 (or i64 or whatever the hell they're calling it nowadays), ARM, and PowerPC?
posted by blenderfish at 2:17 PM on May 18, 2009


nushustu: SF author and mefite Charlie Stross speaks about video games in 20 years.

Well, shit, that's a hell of a long time for me to wait just to hear a guy talk, even if he is on the same blog as me.
posted by koeselitz at 2:19 PM on May 18, 2009 [3 favorites]


More and different tech detail, and a slightly different focus here, from his 2007 article on GamerDNA.

I like the "spirit world" analogy a lot.
posted by aeschenkarnos at 2:20 PM on May 18, 2009


x86 is CISC.
posted by Artw at 2:21 PM on May 18, 2009


x86 was CISC. Now it is dynamically translated into RISC for fully pipelined execution.
posted by blenderfish at 2:22 PM on May 18, 2009


Why do I think there will still be jumping puzzles? Sigh.
posted by maxwelton at 2:22 PM on May 18, 2009 [4 favorites]


So RISC has won by stealth?
posted by Artw at 2:26 PM on May 18, 2009


If Wikipedia's cites are to be believed, the X86 architecture makes up ~80% of server sales. I think all the other architectures are RISC or EPIC/VLIW. Even Intel tried to get away from CISC with Itanium, but it seems that AMD forced them to stick with x86 (unfortunately).
posted by BrotherCaine at 2:29 PM on May 18, 2009


Yep. Not a lot of PDPs, VAXes, or 68ks around nowadays. Evolve or die, I guess. :)
posted by blenderfish at 2:29 PM on May 18, 2009


Regardless of the history of the architecture, internally, modern x86 CPUs are RISC. A part of the motivation for the Itanium was the perceived advantage of being able to get rid of some of this translation hardware, but as time went on, that hardware was dwarfed by other, even newer bells and whistles, like superscalar (especially out-of-order) scheduling, and branch prediction.
posted by blenderfish at 2:33 PM on May 18, 2009


I read this the other day and thought it was really great. I kind of wish Charlie would spend more time on this kind of thing, as I find it much more engaging and believable than his fiction.
posted by adamdschneider at 2:39 PM on May 18, 2009 [1 favorite]


I Am Not An Assembly Programmer, but surely if you program against CISC, even if the internals of the thing are RISC, it is still more properly described as CISC?
posted by Artw at 2:42 PM on May 18, 2009


I Am Not An Assembly Programmer, but surely if you program against CISC, even if the internals of the thing are RISC, it is still more properly described as CISC?

It's a semantic question, so that's as fair a definition as any.

I'm just talking about what happens inside the chip. The innovation of RISC was "hey, let's make instructions simple enough that we can easily fully pipeline execution." The obvious way to do this (MIPS, ARM, PowerPC, SPARC, etc.) is to come up with a new, simple, "RISC" instruction set. Intel, on the other hand, decided to stick with their old "CISC" instruction set, but declare that certain operations would still work, but would run slower than certain other ones, because they would be translated into many RISC instructions, or wouldn't run superscalar, or whatever.
So the right thing to do, as an assembly programmer or compiler writer, is to program against the non-nerfed RISC subset of the CISC instructions.
posted by blenderfish at 2:47 PM on May 18, 2009 [2 favorites]
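(An editorial aside: the translation blenderfish describes can be sketched as a toy decoder. A CISC-style instruction with a memory operand expands into simple load/ALU/store micro-ops, while a register-register instruction passes through unchanged. The instruction and micro-op names here are invented for illustration, not any real CPU's encoding.)

```python
# Toy sketch (not any real CPU's decoder): a CISC-style instruction
# with a memory operand expands into simple, pipelinable "micro-ops".

def decode(instruction):
    """Expand one CISC-ish instruction into RISC-like micro-ops."""
    op, dst, src = instruction
    micro_ops = []
    if dst.startswith("["):          # memory destination, e.g. "[mem]"
        addr = dst.strip("[]")
        micro_ops.append(("load", "tmp", addr))    # read memory into a temp
        micro_ops.append((op, "tmp", src))         # ALU work on registers only
        micro_ops.append(("store", addr, "tmp"))   # write the result back
    else:                            # register-register: already RISC-like
        micro_ops.append((op, dst, src))
    return micro_ops

# "add [mem], eax" becomes three simple micro-ops...
print(decode(("add", "[mem]", "eax")))
# ...while "add ebx, eax" passes through as one.
print(decode(("add", "ebx", "eax")))
```

The register-register form is the "non-nerfed" path: one instruction, one micro-op, no hidden memory traffic.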


And, more to the point, that innovation (pipeline above all else), as obvious as it may seem in retrospect, has enabled a tremendous increase in processor power over the years, and might not have been foreseen by a futurist writing before it happened.
posted by blenderfish at 2:49 PM on May 18, 2009


Fairy nuff.
posted by Artw at 2:50 PM on May 18, 2009


I'd also like to point out that, for the most part, RISC v CISC v whatever is irrelevant to the vast majority of people writing code these days; the specifics of the instruction set are as relevant to (for example) people who write Ruby as the details of engine manufacture (not engine use, but the plants that manufacture engines) are to the average driver.
posted by Fraxas at 2:59 PM on May 18, 2009


That was amazing.
posted by AdamCSnider at 2:59 PM on May 18, 2009


I wonder if all the virtualization that's starting to go on will change things significantly as well.
posted by Artw at 3:00 PM on May 18, 2009


Gamer
posted by P.o.B. at 3:07 PM on May 18, 2009


I wonder if all the virtualization that's starting to go on will change things significantly as well

Well, there is always the perennial prediction (it arises every 5 years or so) that applications will move to a central server. Virtualization provides a set of tools which may help this come to pass (or not). Handing off virtualized machines between devices (including moving them to a powerful remote machine) would be pretty damn cool.
posted by blenderfish at 3:09 PM on May 18, 2009


the specifics of the instruction set are as relevant to (for example) people who write Ruby as the details of engine manufacture (not engine use, but the plants that manufacture engines) are to the average driver.

Yep, but if you're trying to establish what cars will look like in 20 years, those details become a lot more important.
posted by blenderfish at 3:10 PM on May 18, 2009


I look forward to holodecks. I do not look forward to the hysterical calls for holodeck censorship that will inevitably follow. They'll make Jack Thompson's ravings seem positively restrained.
posted by PsychoKick at 3:23 PM on May 18, 2009 [3 favorites]


I wish Bruce Sterling would post on MeFi.
posted by fraxil at 3:54 PM on May 18, 2009


He, um, posts some good links on Twitter.
posted by Artw at 3:57 PM on May 18, 2009


(I can't bear the way he formats his blogs though. Which is a shame, because I love his stuff)
posted by Artw at 4:02 PM on May 18, 2009


I thought that was a really worthwhile read. From his blog he does take time to go out of his way to visit places like MIT Media Lab so I am not surprised his knowledge of the technology is ahead of most other sci-fi authors.

I think maybe the missing element is some analysis of where games have evolved from over the last several hundred years as a way of predicting where they might head. For most of their history games and sports have been played out in the solid world of wooden bats, bits of rope, sweating opponents who stare us out and improvised pitches. When computer games came along their hardware limited them into being largely solitary pursuits played while sat in a chair within a short wire's length of a limited capability display. In the last decade those limitations have started to fall away and they will continue to do so.

So what exactly are the traditional games which may make a mutated revival in this untethered world?
posted by rongorongo at 4:30 PM on May 18, 2009


Um, hate to be a killjoy, but he could have just linked to this: it's a sphere!
posted by I Foody at 4:33 PM on May 18, 2009


I wonder if all the virtualization that's starting to go on will change things significantly as well.

We'll have the existing emulator scene to a higher degree. Like in (some Gibson novel) where they describe bands as having this existence out of sync with physical time - the early years of the band remain popular forever, completely disconnected from the band as they are today. NES games will exist forever, playable anytime, anywhere. Kids will eventually think of Pac Man or Super Mario like email or contact management on business PDAs - ubiquitous and annoying if they're not there.
posted by GuyZero at 5:11 PM on May 18, 2009


I don't feel like his predictions are very ambitious, but VR hasn't arrived like previously expected, so you gotta give him some credit. So I predict: sex-suits mean porn & porn games go mainstream, and thus get plot.
posted by jeffburdges at 5:27 PM on May 18, 2009


Games seem to be about fulfilling basic human needs-- learning skillsets, hunting & gathering, doing battle, exploring, building things, building relationships, being part of a group, etc. Early games (i.e., pong) seemed to fall exclusively into the "learning a skillset" category, but it seems like games are, in addition to this (and sometimes instead of this) increasingly fulfilling more complex human needs. WoW is a hunting & gathering (i.e., pre-agriculture) life simulator, for example.
posted by blenderfish at 5:42 PM on May 18, 2009


Has there been some quantification of the throughput required for full VR? The calculations I've seen vary wildly, but Warren design vision claims ~4 Terabits/second for visuals alone; I suspect tactile, hot-cold, and motion are even more demanding. I have no idea where smell is, and sound seems to be the only sense that computers can "max out" on in terms of supplying sufficient information today.

Assuming that maximum throughput of various computer technologies increases predictably, and there are no disruptive technologies or discontinuities, shouldn't it be pretty easy to guess within a year or two when VR will be seamless (i.e. observer cannot tell when they are in the machine)?
posted by BrotherCaine at 5:52 PM on May 18, 2009
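(BrotherCaine's "shouldn't it be pretty easy to guess" can be put into a back-of-envelope formula: count the doublings between today's throughput and the target. Both the 2009 starting figure used here — roughly 10 Gbit/s for a high-end display link — and the 18-month doubling period are illustrative assumptions, not measurements.)

```python
from math import log2

# Back-of-envelope: if usable throughput doubles every ~18 months,
# when do we cross the claimed ~4 Tbit/s needed for VR visuals?
def crossing_year(start_year, start_bps, target_bps, doubling_months=18):
    doublings = log2(target_bps / start_bps)   # how many doublings are needed
    return start_year + doublings * doubling_months / 12.0

# Assumed 2009 baseline: ~10 Gbit/s (a high-end display link).
year = crossing_year(2009, 10e9, 4e12)
print(round(year, 1))  # roughly 2022, under these (very rough) assumptions
```

The answer moves by years if the starting estimate is off by a factor of a few, which is exactly why the calculations BrotherCaine has seen "vary wildly."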


I really liked how he presented his ideas and especially the caveats. However, the super-iPhone + glasses thing is straight out of William Gibson's Virtual Light.

If things go in the direction that CStross is predicting, it suggests that games like Assassin get really popular and I fully expect to be seriously annoyed by these gamers in my old(er) age.

As for the codgergamer market, a big chunk of gaming is wish/fantasy fulfillment (granted, competition is another big aspect, but in digital gaming, the hook is wish fulfillment and you keep playing to achieve that fulfillment and/or for the competitive aspect) - I have no idea what 60-year-olds will want to fantasize about. Being able to 'get it on' in the sack (like jeffburdges's sex suits)? Raising offspring again (a la the Sims)? Being back at work?

If I had to guess, I'm thinking that games where you manage virtual sports leagues could be a profitable venture. Play as a general manager, coach (in game situations, running practices in real-time), what-have-you for a real-life or fantasy sports league with virtual players where all the other teams in the league are managed and coached by other people.

As for the biotechnology-in-your-kitchen thing, media types have this strange huge hardon for DIY biotech. I don't see it. Sure, maybe stuff like chemistry sets in the '50s and '60s. Has anything really been achieved by DIY chemists? The only things that come to mind are recreational drugs and that kid who built his own breeder reactor. I doubt that people, in 20 years, are going to routinely roll their own miniature pink elephants (with wings!) in their kitchen. Besides, given the trend in property prices, who'd be able to afford a kitchen big enough to house the elephant that's going to gestate the thing?
posted by porpoise at 7:15 PM on May 18, 2009 [1 favorite]


I have no idea what 60-year-olds will want to fantasize about.

Chasing virtual kids off their virtual lawns.
posted by AsYouKnow Bob at 7:35 PM on May 18, 2009 [2 favorites]


VR glasses that allow overlays on top of what you're seeing have been my favorite futurepony for the last few years. There's just so much that you could do with that kind of display and a bit of GPS/compass/image recognition. I imagine something like channels of content one could subscribe to--one that adds interesting graffiti to walls, one that overlays people's latest twitter updates or their latest songs from their iPhone playlist above their heads (hopefully in an opt-in way), one that identifies what ARGs they play in (so that when you and the other person recognize each other's gamebadges you can immediately talk in-character), one that lets you gesture at unfamiliar words or images on a printed page (if we still have printed pages) or sign and see their wikipedia article. And so on. Like the iPhone app store for reality. It could be so awesome.

On the other hand, I think that people have a real need for games that are relaxing and comforting and played from their sofas/computer chairs. I don't think that kind of game is going to go away, if for no other reason than everyone's inherent laziness, you know? And I think that "games are going to be tied more to physical reality" and "games are going to be for old people" are kind of at odds (unless what we now think of as regular games are going to become "old people games").
posted by rivenwanderer at 7:38 PM on May 18, 2009


I thought it was all about the Playstation 5?
posted by nudar at 9:11 PM on May 18, 2009


The best thing VR overlay glasses would have would be Ad Block.
Imagine replacing all the billboards in your city with artwork of your choice.
posted by Iax at 10:21 PM on May 18, 2009 [3 favorites]


I liked the idea of equipping an iPhone with little mini-projectors to make use of available surfaces to increase screen space.
posted by KokuRyu at 11:48 PM on May 18, 2009


The Playstation 5 will be implemented as a mist of nanobots which you inhale, and which then rewire your nervous system.
posted by acb at 2:52 AM on May 19, 2009


Gamer

Wow, for a Running Man retread, that actually looks pretty good.
posted by Happy Dave at 4:28 AM on May 19, 2009


The most interesting thing is that he admits excluding energy decline, climate problems, etc. That's honesty you don't see a lot these days.
posted by symbollocks at 6:07 AM on May 19, 2009


VR glasses that drop overlays on top of what you're seeing have been my least favorite nightmare for the last few years. There's just so much extraneous horse hockey that you could dribble onto your visual cortex with that kind of display and a bit of GPS/compass/image recognition. I dread something like channels of content you would feel obliged to subscribe to--one that adds inane graffiti to walls, one that overlays people's latest twitter updates or the latest songs from their iPhone playlist above their heads (probably with no way to opt out), one that identifies what ARGs they play in (so things get really awkward at work), one that lets you gesture at unfamiliar words or images on a printed page (god, if we still have printed pages!) or sign and see their Wikipedia article. And so on. Like Facebook apps for reality. It would be so terrible.
posted by Your Disapproving Father at 7:39 AM on May 19, 2009 [2 favorites]


BrotherCaine wrote The calculations I've seen vary wildly, but Warren design vision claims ~4 Terabits/second for visuals alone

I don't think, even for full VR, you'd need anything like that. Yes, that's what the eye, theoretically, takes in, but it's not what you need for photorealistic VR. In the second place, much of that is multiple eye takes on a single static image. If you're sitting, looking at a blank sheet of paper, yes, you're probably taking in 4 terabits/second. But to perfectly represent that blank sheet of paper in VR goggles wouldn't take 4 terabits/second. We take in that much because a lot of what we get is redundant, because the brain is twitching the eye to make up for some serious design flaws [1], etc.

In other words, yes, that's what the eye needs to make stuff look real, but it's not what a computer would need to project stuff that looks real. And, of course, there's the "what is the subject looking at" issue. You'd want to dedicate enough processing power, bandwidth, etc. to the object the user is focusing on, but that's likely a relatively small fraction of their total field of vision. Define your virtual world, then only spend the processor to fully, perfectly render what the subject happens to be looking at that instant. Right now I'm seeing the letters forming after my cursor perfectly, the rest of the monitor is only vaguely in my awareness, and the rest of the room is a vague blur. No need to perfectly render the flower on the corner of my desk when I'm not paying attention to it, right?

Same goes for kinesthetics, touch, sound, etc. Yes, they're important, and in certain instances you'd really want to go full bore with them, but mostly it's something you don't notice unless something is wrong.

Not that full VR won't take a staggering amount of processing power, storage, and an insane interface, but it's not quite the problem that the raw numbers might make it appear.

And, of course, there's always the question of "how real is real enough". Not all applications want or need photorealism, or the VR equivalent thereof. Long before we have the ability to do true VR, we'll have reached the uncanny valley point for cartoon equivalent VR, and I'd imagine that the entertainment market for stuff just short of the VR uncanny valley would be huge.

[1] a "blind spot", what a brilliant idea! I've always said that the best argument against "Intelligent Design" is how *stupidly* designed much of the human body is.
posted by sotonohito at 7:45 AM on May 19, 2009 [1 favorite]
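(sotonohito's argument can be put in rough numbers. Every figure here — a ~2-degree fovea in a ~180-degree field, periphery rendered at 5% detail — is an illustrative assumption, not measured optics; the point is only that the savings are dramatic.)

```python
# Rough arithmetic for the "only render what the eye is looking at"
# argument: the fovea covers a tiny angular patch, so rendering the
# periphery at a fraction of full detail slashes the bandwidth.
def foveated_fraction(fovea_deg=2.0, field_deg=180.0, periphery_scale=0.05):
    """Fraction of full-field bandwidth needed with foveated rendering."""
    fovea_area = (fovea_deg / field_deg) ** 2   # angular-area fraction, roughly
    return fovea_area + (1 - fovea_area) * periphery_scale

full_bps = 4e12                                  # the quoted ~4 Tbit/s figure
needed = full_bps * foveated_fraction()
print(f"{needed / 1e9:.0f} Gbit/s")              # ~200 Gbit/s, not 4 Tbit/s
```

Even with these crude assumptions the requirement drops by over an order of magnitude, which is sotonohito's point: the raw retinal number overstates what a renderer must actually deliver.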


when we have contact lenses that can project past lectures or images of textbooks onto our eyes from a recording of our life that can be accessed anywhere we are, what will that do to the institution of education? or will there just be no need for it, so it won't matter?
posted by jermsplan at 9:15 AM on May 19, 2009


Will city streets be overrun by people playing games? Everyone logged in is marked in your vision, virtual weapon and cash pickups litter the city, getting killed forcibly locks you out of the game for a predetermined respawn time? You can only join the game if you're standing within a designated join/spawn point? City parks marked as control points, bases or flag locations? That sounds amazing, and frankly I just hope my body is able to keep up with the 20 somethings of 2020.

Also, it sounds like it would cause real world death, destruction and chaos for those not in the game. But given the technology he predicts, how could you prevent it?
posted by jermsplan at 9:24 AM on May 19, 2009


Discussions of visual cortex bandwidth completely neglect the processing power necessary to synthesize the images in question. For example, the amount of technology required to stream the 36 Mbit/s bandwidth (more if we're talking the HDMI stream, which is probably a better analog to the bandwidth you're talking about) from a Blu-Ray to watch Wall-E (that is, a PS3) is ridiculously less expensive than the amount of technology required to render it (a huge render farm at Pixar), let alone to render it in real time.
posted by blenderfish at 10:29 AM on May 19, 2009


Actually, ARM is about 95% of everything, consumer and embedded both — as long as you remember that the vast majority of consumer-owned computers are phones or embedded gizmos

Is that really true? Or does it depend on a very specific definition of "computer" that excludes anything microcontrollery?

(MIPS is still alive and well in the embedded sector, FWIW.)
posted by We had a deal, Kyle at 10:43 AM on May 19, 2009


Heh. Interesting that a discussion on the future of gaming has boiled down to a discussion of how good the graphics will be. Some things are just unchanging.
posted by Artw at 11:44 AM on May 19, 2009 [2 favorites]


Blenderfish, I considered rendering to be implicit in the concept of saturating visual bandwidth, but even so, I think the bottleneck may actually be display technology for proof of concept designs at least (rather than widespread adoption). The render farms will eventually be up to the task both because the individual processors/GPUs that work on it have been doubling in complexity every 18 months or so, and because it's partly an example of an embarrassingly parallel problem, where additional processors can be added until there are enough for real time rendering. Phenomenally expensive at first, but maybe it could be a rich man's game rental.

Sotonohito, you are right that we technically only need to render what the eye is looking at, but I'm not sure tracking pupil movement and rendering reactively will be a trivial problem before we have the full 4 Terabits/sec. I'm sure someone could make an informed guess as to which is more feasible, but I don't know enough about the technical complexity. Regardless, as you point out, we'll get to "good enough" for most people's purposes long before we reach something completely seamless.

ArtW, gaming has been and always will be composed of something like .1% brilliant, 9.9% good, and 90% crap. The only difference the future will bring is how shiny the turds are. My one hope for a future technology that enables more interesting games is the idea that between decent physics engines, and genetic algorithms + computing power, someone will start to create totally immersive and believable worlds from completely alien starting assumptions. Or even a completely open ended / infinite world. That still won't substitute for crap writing. I'm also interested in algorithmic composition, but advances there seem to be more an issue of how insightful the composer is than the technology involved.
posted by BrotherCaine at 12:35 PM on May 19, 2009
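(A minimal sketch of the "embarrassingly parallel" point above: each tile of a frame can be shaded independently, so adding processors adds throughput with no coordination between them. `shade()` is a made-up stand-in for real rendering work, not any actual renderer.)

```python
from concurrent.futures import ProcessPoolExecutor

# Each tile is independent: no shared state, no ordering constraints.
def shade(tile):
    x, y = tile
    return (x, y, (x * 31 + y * 17) % 256)   # fake per-tile pixel work

def render_frame(width_tiles, height_tiles):
    tiles = [(x, y) for y in range(height_tiles) for x in range(width_tiles)]
    # Adding workers scales throughput, because tiles never interact.
    with ProcessPoolExecutor() as pool:
        return list(pool.map(shade, tiles))

if __name__ == "__main__":
    frame = render_frame(8, 8)
    print(len(frame))  # 64 tiles, shaded independently
```

This is why the render-farm bottleneck blenderfish mentions erodes on a predictable schedule: unlike most software, you can buy your way to real time by adding machines.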


For a pretty readable fictionalisation of what Stross talks about here, check out his recent novel Halting State, blurbed as being, of all things, "a thriller set in software houses that write multiplayer games" . . .

It has some interesting expansion on the idea of computer games moving into the spectrum of LARPing by using the projection of the web onto the real world which he mentions in the OP.
posted by protorp at 12:51 PM on May 19, 2009


Twenty years is a very long time in biotechnology, with exponential progress accelerated further with each discovery and invention. An iPhone with VR goggles seems like a silly futurist vision, compared with what we had twenty years ago. I'd prefer to see eXistenZ happening sooner rather than later, with networked, neural implants projecting shared artificial worlds and augmented realities directly into our minds. Who needs to carry gadgets around in pockets? That kind of progression in gaming would change humanity in a pretty fundamental way, I think, culturally, artistically, economically, politically, as opposed to just buying the latest hot, new gadgets.
posted by Blazecock Pileon at 3:01 PM on May 19, 2009 [1 favorite]


Blenderfish, I considered rendering to be implicit in the concept of saturating visual bandwidth

I don't so much, because my life's work is making sure the correct (or at least most visually pleasing) 1's and 0's go over that HDMI cable you have coming out of your XBox 360. (A winning lottery number is what, 40 bits? You just need the correct 40 bits.)

But, yes, renderers will certainly keep getting better and better.
posted by blenderfish at 3:12 PM on May 19, 2009


And, yes, it has been pretty disappointing that nobody has made headway on practical visual interfaces beyond screens. (Though, screens have gotten bigger and cheaper.)
posted by blenderfish at 3:37 PM on May 19, 2009


And, yes, it has been pretty disappointing that nobody has made headway on practical visual interfaces beyond screens.

Visual data doesn't have to go through the eyes. There are other possible routes, such as the tongue. With that in mind, I can imagine several alternate approaches involving "sensitized" patches of skin and I/O contact pads.
posted by PsychoKick at 3:45 PM on May 19, 2009


Though, screens have gotten bigger and cheaper.

I keep hoping they'll get more contrast, pixel density, and convexity.

or at least most visually pleasing

I'm not sure how the future is going to bring out much in the way of visual talent. For every genius who gets access to a workstation and render farm, there'll probably still be 8 hacks. I hope that there'll be technology to render aesthetically pleasing natural terrain algorithmically, but level designers may still create absurd and or ugly hulking buildings to clutter things up (and some brilliant designs as well).
posted by BrotherCaine at 6:23 PM on May 19, 2009


I'm glad protorp mentioned Halting State, which was pretty good fun. Another similarly themed near-future read is Rainbows End by Vernor Vinge, which also has ubiquitous superimposed visuals. As does Accelerando, also by Stross, also good fun, and starting near future then hitting the singularity.
posted by bystander at 9:44 PM on May 19, 2009


Oh man, pico projectors. They're going to be sweet.

http://www.youtube.com/watch?v=7UfarRM0BoM
posted by RufusW at 10:21 PM on May 19, 2009 [1 favorite]


Also, it sounds like it would cause real world death, destruction and chaos for those not in the game. But given the technology he predicts, how could you prevent it?

The same way we prevent soccer riots and street racing deaths!
posted by mecran01 at 6:54 AM on May 20, 2009




My G1 Android phone already does Augmented Reality. I can hold it up (with the free Wikitude app) and see information about what I'm looking at overlaid on the screen image. I can hold it up and see the constellations overlaid over the night sky with Google Sky.

Exciting times. F an iPhone.
posted by Espoo2 at 10:53 PM on May 20, 2009




This thread has been archived and is closed to new comments