2015: The year that sci-fi becomes real
January 22, 2015 10:48 PM   Subscribe

"After locking away all my recording instruments and switching to the almost prehistoric pen and paper, I had a tantalizingly brief experience of Microsoft's HoloLens system, a headset that creates a fusion of virtual images and the real world. While production HoloLens systems will be self-contained and cord-free, the developer units we used had a large compute unit worn on a neck strap and an umbilical cord for power. Production hardware will automatically measure the interpupillary distance and calibrate itself accordingly; the dev kits need this to be measured manually and punched in. The dev kits were also heavy, unwieldy, fragile, and didn't really fit on or around my glasses, making them uncomfortable to boot. But even with this clumsy hardware, the experience was nothing short of magical." ...

Minecraft comes alive

"We ran through three interactive demos and watched a fourth. My favorite was Minecraft. I was in a room with a couple of tables and a picture on the wall. After putting on the headset and looking around the room so that the HoloLens could figure out where everything was, the world around me suddenly transformed. The table was no longer just a table. It had a big castle on it, with a river flowing beneath. The middle of the table was no longer there; I could peer through the hole to see the river below. As I moved around the room, I could examine the castle from all angles.

My attention was then turned to a second table which had something of a zombie infestation. Fortunately, the foolish zombies were clustered around a block of TNT. Detonating the TNT blasted through the table, revealing a lava pit below. The zombies toppled through the hole and fell to their deaths.

The picture frame on the wall now housed more TNT. Triggering it revealed a large cavern beyond—and out flew a bunch of blocky bats.

Through it all, the 3D effect was thoroughly convincing. The system felt very low-latency; as I moved my head and walked around, the objects retained their positioning in the real world, with the castle, for example, never becoming detached from or wobbling around on the table. While Minecraft of course falls some way short of having photorealistic graphics, the melding of real and virtual nonetheless felt convincing."

If Microsoft can get the price of HoloLens right, it could become the must-have Minecraft accessory at Christmastime. Microsoft's decision to buy Minecraft's developer all of a sudden makes sense."
posted by SpacemanStix (141 comments total) 28 users marked this as a favorite
 
If Microsoft can get the price of HoloLens right, it could become the must-have Minecraft accessory at Christmastime.
Me? I'm just looking forward to the Black Mirror episodes about this.
posted by fullerine at 11:04 PM on January 22, 2015 [24 favorites]


A guy i know works at MS. He has for long enough that he still has his "welcome kit" somewhere which has a bunch of windows 3 era art on the box.

They've made a ton of stuff like this over the years that was never publicly flogged. They've been messing around with the surface hub for probably the past 5-6 years or even longer.

From the amount of hype and attention this is getting, it's like everyone forgot the huge wankfest over kinect when those videos came out.

The technology behind kinect is cool. But the actual product is disappointing. It has the capabilities to do everything in those videos, but just kinda... well no one ever used it for anything like that.

This could very well be a super cool thing from a technical standpoint, but i'm kinda holding off on being excited until i see some actual software for it and see the price.
posted by emptythought at 11:09 PM on January 22, 2015 [18 favorites]


Like, this has me much more excited for some reason. Especially if/when it can reach some certified level of accuracy in a couple of years. I can think of a million situations in which that sort of tech could even save people's lives.

I don't know why i can't drum up excitement for this, but i have to just roll my eyes whenever microsoft goes "the future is NOWWW!"
posted by emptythought at 11:12 PM on January 22, 2015


This could very well be a super cool thing from a technical standpoint, but i'm kinda holding off on being excited until i see some actual software for it and see the price.

I don't know why i can't drum up excitement for this, but i have to just roll my eyes whenever microsoft goes "the future is NOWWW!"

Yeah, I'm trying not to get too hyped about something that isn't here yet, but it's probably the most excited I've gotten for a possible near-future technology that actually has a working prototype. For some reason, this tech, even hypothetically, hits all the right buttons for me. Kinect I didn't find that exciting. I think it might have something to do with the divide between the public and private, a world you can get lost in on your own versus tech that stays put in a public space.

I'm putting this in the please oh please oh please oh please happen category.
posted by SpacemanStix at 11:23 PM on January 22, 2015 [5 favorites]


A prediction.
posted by oneswellfoop at 11:23 PM on January 22, 2015 [24 favorites]


Kinect has been suffering from the fact that all video gear wants to live at a leisurely 30 Hz or 60 Hz. You really need to get up to 120-480 Hz for things to feel right, but it's an awful pipeline to clean up to get there.

We're finally getting there.
posted by effugas at 11:23 PM on January 22, 2015 [2 favorites]
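
(An illustrative aside on why those refresh rates matter, using an assumed head-turn speed of 120 degrees per second rather than any published spec: the worst-case lag of a room-anchored object between frames is just head speed times frame time.)

    # Back-of-the-envelope sketch: how far a room-anchored virtual object can
    # drift behind a turning head before the next frame corrects it.
    # The head speed is an illustrative assumption, not a vendor spec.
    def drift_degrees(refresh_hz, head_speed_deg_per_s=120.0):
        frame_time_s = 1.0 / refresh_hz
        return head_speed_deg_per_s * frame_time_s

    for hz in (30, 60, 120, 480):
        print(f"{hz:3d} Hz -> up to {drift_degrees(hz):.2f} degrees of lag per frame")
    # 30 Hz -> 4.00, 60 Hz -> 2.00, 120 Hz -> 1.00, 480 Hz -> 0.25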


Note to Microsoft: under promise and over deliver. See: Apple.
posted by mazola at 11:30 PM on January 22, 2015 [6 favorites]


This sounds like Dennou Coil.
posted by Chocolate Pickle at 11:42 PM on January 22, 2015 [5 favorites]


Absolutely no holography involved in this thing either as far as I can tell. Marketing wankspeak ahoy!
posted by GallonOfAlan at 12:08 AM on January 23, 2015 [3 favorites]


We're finally getting there.

Thanks in no small part to John Carmack and the Oculus. :-)
posted by smidgen at 12:10 AM on January 23, 2015 [2 favorites]


This couldn't turn into a nightmare.
posted by adept256 at 1:06 AM on January 23, 2015 [1 favorite]


Gestures are terrible as input
posted by sonic meat machine at 1:08 AM on January 23, 2015 [16 favorites]


In 2019, they'll switch over to just bolting them permanently into the sides of your head.
By 2025, the way we live now will seem like loin cloths and the quest for fire.
posted by Fupped Duck at 1:36 AM on January 23, 2015 [3 favorites]


Before you turd or meh all over this... look back at a post that is a little less than five years old. In January 2010 another company announced an innovative new product.

The tone there is the same as I see above. Let's learn from the past. Before you sit on your throne and type turds on technology that seems far out and useless... consider how something far less innovative has changed your life in ways that were intangible on this date exactly five years ago. Not only could you not type a post from the toilet back then; once the capability arrived, it awoke a possibility that no longer needed imagining - and hell yeah, people have paid billions to do it since then!

I like this analysis:

"Today, when people say they want a “tablet” what they really mean is an iPad, and when people say “Google” what they really mean is “internet search.”... HoloLens is Microsoft’s attempt to insert itself into this new field before others come to own it."

I'm really excited about HoloLens - it looks really really promising. I was also initially excited by Google Glass. That excitement wore off quick. But the leap in renderable detail and information articulation from GG to HL is amazing. Glass now feels like a slapdash hack that was perhaps whipped together in light of leaked information on a multi-year marathon happening in Redmond. This looks far more substantial and worthy of excitement. It raises the bar in a way that will drive the market with considerable force.

No matter who ends up owning the verb for the era beyond screens, SpacemanStix is right that this is strong confirmation that a big leap into a new world is about to happen. We are like creatures emerging from a Flatland sea onto the shoreline of a vast new immersive, collaborative, information-enriched world. Whenever information and freedom get together, there is a multiplicative positive effect for humanity.

There are perhaps three to ten companies gearing up in garages for a drive up Sand Hill Road right now that will be bigger than the last decade's crop of 'who could have thought x could be big' titans. This will boost national GDP in that direct industry growth sense and - more profoundly - in trillions of newly realized efficiencies that we would be hard pressed to imagine as we float spinning through a potentially holographic universe wrapped around a planet stuck to the bottom of our bottoms on a porcelain ring.

No fear either. History has shown that the future has been better than we tend to fear and the way things were probably isn't as nice as we like to remember it. Metcalfe's law will continue to ensure that more freedom means fewer shadows for lies, tyrants and ignorance. Our global self has no tolerance for local injustices and maybe we'll straighten the whole show out better once we're immersed even more fully in that.
posted by astrobiophysican at 1:53 AM on January 23, 2015 [14 favorites]


Does it let me goggle into the Metaverse?
posted by Annika Cicada at 2:03 AM on January 23, 2015 [3 favorites]


Lines of light, arranged in the nonspace of the mind...
posted by pompomtom at 2:33 AM on January 23, 2015


Before you turd or meh all over this... look back at a post that is a little less than five years old. In January 2010 another company announced an innovative new product.

The difference is that the ipad was a logical step from something that had existed before. I didn't shit on that, and i thought all the people making fun of it and crapping on it were hilarious. There were all the "ugh, it's just a big iphone" people... i was one of the "Oh SWEET! it's a big iphone!" ones.

Gestures are terrible as input

This is my biggest complaint. You can't tech your way out of gorilla arm. That is not a casual turdthrow.

I know it seems eager and easy to compare this to the ipad, and point out the fact that there were basically two decades of crappy, junky tablets behind it that were only good for very, very specific, mostly commercial uses (or as wacom tablets with displays behind them), but the ipad being thin and relatively light isn't what made it good.

Short of turning off gravity, i don't see how you can make waving your arms in the air not a crappy input method.

I'm not saying this is DOA or anything, and i think it's a stepping stone to something, but waving your arms in the air is kind of like pinch to zoom was to multitouch. It's a cute thing to show off the tech. Having that be the input is part of the reason this feels like dongwaving to me. It's like they wanted to go "look at all the cool stuff we have, it's the tech from kinect!".

Pretty much, when apple shows something off you feel like there's a point to features, the ways it works, and the tech it's showing off. Even gratuitous stuff like gonzo resolution screens immediately make sense once you try it out. None of it is ever something that looks cool but doesn't have that much of a functional purpose, or doesn't work that well. Microsoft has a recent track record of silly stuff that's basically muscle flexing with no real practical application. Remember the original surface, when it was a big table?

It's not so much that the killer app isn't here yet, it's that gorilla arming in the air in front of a head set sounds cool but seems to have completely left out consideration for the actual human putting the goggles on.


And to be totally clear, i'm not saying i won't be buying one. I might even try and get one of my blue badge friends to get one for me early if it goes on sale to employees first. I just think that input method is silly as hell, and will be uncomfortable or impractical. All the lampooning of the ipad was "i don't get the point", not "the point seems awesome but it seems doofy to use".

That, and the people who lamented typing on a big touch screen are still freaking right 5 years later. That shit sucks. No one developed a solution for that better than "get used to it" or "bluetooth a keyboard". That's still like, the gorilla arm of touchscreen tablets. So i don't know, maybe i'm overplaying this. Maybe everyone will buy these but just not use them that way. But it's still a totally fucking silly way to demonstrate them or imply they should be used. No one buys tablets to write an essay on though, unless they also buy a keyboard or are weird. Everyone i know just takes quick notes on them or draws.
posted by emptythought at 2:37 AM on January 23, 2015 [12 favorites]


I'll be more than skeptical until it comes with a magical cure for the widespread cultural aversion people have to dealing with other people who are wearing something that could record them or communicate things they can't see.

See glasshole.
posted by deadwax at 2:39 AM on January 23, 2015 [2 favorites]


Before you turd or meh all over this... look back at a post that is a little less than five years old. In January 2010 another company announced an innovative new product.

The iPad? I think mine's in a drawer around here somewhere.
posted by grahamparks at 2:40 AM on January 23, 2015 [1 favorite]


Between the staggering inequality and the cool eye tech, it is starting to look like Virtual Light is going to be the SF novel that predicts our actual present.
posted by selfnoise at 2:55 AM on January 23, 2015 [6 favorites]


Kinect was disappointing in the long run for gamers yes, but it sold a ton of xboxes.
posted by Another Fine Product From The Nonsense Factory at 3:01 AM on January 23, 2015


Another prediction.
posted by zardoz at 3:14 AM on January 23, 2015 [3 favorites]


Actually I see MS buying Oculus Rift for a bazillion dollars and working some alchemy to merge the two.
posted by zardoz at 3:15 AM on January 23, 2015


And to be totally clear, i'm not saying i won't be buying one. I might even try and get one of my blue badge friends to get one for me early if it goes on sale to employees first. I just think that input method is silly as hell, and will be uncomfortable or impractical. All the lampooning of the ipad was "i don't get the point", not "the point seems awesome but it seems doofy to use".

On the one hand yes, on the other... i can sort of see 'reaching out to touch the imaginary thing you can see in your visor' as a very natural haptic event.

Obviously they need a huge degree of fidelity of what you see to what you do to make that work, but if the tech is good enough I can imagine the immersion being very compelling, which might make it worthwhile.
posted by Sebmojo at 3:27 AM on January 23, 2015


This'll be awesome... in a decade or two.
posted by fairmettle at 3:28 AM on January 23, 2015 [1 favorite]


Is gorilla arm lessened if your upper body is in good shape?
posted by Pope Guilty at 3:29 AM on January 23, 2015


For e.g., if you were able to sit down at a table and your keyboard was projected on the table, and the screen in front of you, and it felt natural and effective.

That's pretty cool.
posted by Sebmojo at 3:30 AM on January 23, 2015


Having been burned by Kinect, I was initially skeptical about Hololens. However, the sheer number of incredibly positive reports from journos has convinced me that there are legs to this device.

I can totally believe that the demos they played were rigged in order to work as smoothly as possible, but the fact that the notoriously cynical Rock Paper Shotgun was super-excited by the Minecraft and Mars demos is a great sign.
posted by adrianhon at 3:38 AM on January 23, 2015 [3 favorites]


It's very similar to Jeri Ellsworth's Cast AR glasses, too, although that needs a tablemat for the projections to appear above.

There's a lot of different people working on augmented reality and virtual reality stuff at the moment and it's pretty exciting to think that maybe something very good will actually come of it this time, in comparison to all the other times.
posted by dng at 3:52 AM on January 23, 2015 [1 favorite]


For e.g., if you were able to sit down at a table and your keyboard was projected on the table, and the screen in front of you, and it felt natural and effective.

Without physical keys under your fingers this would be uncomfortable at first and probably painful in short order.
posted by Pope Guilty at 3:56 AM on January 23, 2015 [1 favorite]


I've typed 2000 word stories on my ipad and found it pretty comfortable; how is it different from that?

I mean it's gonna be inferior for sustained work, but the idea of having a virtual computer that only you can see is neat.
posted by Sebmojo at 4:00 AM on January 23, 2015


Employee training would be my killer app (yeah, okay the schools can use it, too -- take the kids inside a sailing ship and let them fire the virtual cannon). But actual training space set up with whatever fixtures and fittings will emulate the working space is hella expensive -- especially when the damn workspace changes completely every couple of years -- so employees in branch offices all have to be shipped to the nearest training center all the damn time. In northern Canada, this is both expensive and very season-dependent.
posted by Mogur at 4:26 AM on January 23, 2015 [2 favorites]


...look back at a post that is a little less than five years old. In January 2010 another company announced an innovative new product.

Had the iPad required the user to, for instance, wear a glove in order for the pad to register touch and gestures, it would have been DOA. Similarly, any tech that requires the user to don some sort of headgear isn't going to go anywhere with the general public. I can see great applications for this in science and industry, though (and, of course, gaming.)
posted by Thorzdad at 4:27 AM on January 23, 2015 [1 favorite]


The technicians turned the unit on. I was standing in front of a castle gate. The portcullis slowly opened. Out of the mist a figure appeared. He was wearing tan slacks, a light blue short-sleeved button-up shirt and a red and dark blue striped tie. It was Bill Gates. He walked up to me, looked me dead in the eyes and said, "When was the last time you actually paid for a copy of Windows?"
posted by double block and bleed at 4:27 AM on January 23, 2015 [2 favorites]


Actually I see MS buying Oculus Rift for a bazillion dollars and working some alchemy to merge the two.

They'd have to buy it from Facebook ...
posted by psolo at 4:28 AM on January 23, 2015 [3 favorites]


I still think tablets are pretty terrible. There are a few things they're great for, like choosing the video that you're watching on a streaming service, or looking at Facebook, but in general they are inferior to a computer, and overall they still give me the heebie-jeebies due to their segregation of computing power.

All of that said, I still think the use of gestures for input is terrible, and will always be terrible, because of their general lack of precision. For example, the "wow" here is the Minecraft clone, right? Fine. Question: how do you select a block to hold? How do you place it? How do you climb down into the mine that you just dug in the middle of your coffee table?

How can visually- or mobility-impaired people interact with HoloLens software? What about Asians, who get motion sick extremely easily?
posted by sonic meat machine at 4:31 AM on January 23, 2015 [5 favorites]


emptythought: Gorilla Arm was specifically talking about touch-based input with a seated user and a screen placed according to 'traditional' monitor ergonomics: at least two feet away, with the top of the screen level with the user's eyes.

Hold your arm at your side, then bend your elbow so your lower arm is parallel to the floor. No strain, right? Now slowly raise your whole arm from the shoulder. You'll notice a serious increase in strain at the shoulder joint when your upper arm goes past about 45 degrees relative to your body. Hold it there for ten seconds. That's Gorilla Arm.

Gesture-based input is a way bigger field of options. There's no reason a virtual surface needs to live two feet away from your body. I can see an interface like this offering an activity-level scale: want to wake yourself up or get a few more bonuses on your FitBit? Go to maximum scale where you're dragging things around at full extension. Feeling sore or tired? Work at minimum scale where you're just using your hands.

Kids are already used to games that involve physical input, whether it's Kinect or shaking / flipping an iPad. Why wouldn't the work they grow up to do offer them the same options?
posted by sixswitch at 4:33 AM on January 23, 2015


Cool! This will make online harassment feel much more immersive. GamerGaters will seem like they're right there.
posted by Cash4Lead at 4:49 AM on January 23, 2015 [2 favorites]


a magical cure for the widespread cultural aversion people have to dealing with other people who are wearing something that could record them or communicate things they can't see
Badge cameras? Soon police brutality stories will all be of the form "and he might have gotten away with it too if it weren't for that meddling camera" or the form "and his camera wasn't working, which on one level means there's no evidence against him but on another level is probably just evidence of premeditation". A decade or so of "evil people don't record their surroundings" subtext might have some cultural effect.

Growing awareness that internet-camera-wearing people can't be robbed without the robber's face being irrevocably recorded might be sufficient, too. We've already lost our cultural aversion to dealing with *places* where there's something that can record us at all times, for this reason. Do you avoid shopping at stores with cameras? How would you eat?
posted by roystgnr at 4:55 AM on January 23, 2015 [2 favorites]


oneswellfoop: "A prediction."

Wait till he hears that we are still singing and dancing and making comedy films!
posted by vanar sena at 4:55 AM on January 23, 2015 [2 favorites]


I tried an Oculus Rift last week, and while it was pretty cool, it was hard to get over the disconnect with the "real" world. If this fixes that feeling, I'm in!
posted by blue_beetle at 4:56 AM on January 23, 2015


Guys, the gesture-based input is less than half the story here, since we already have kinect. The stable, low-latency AR display is the awesome bit. I see myself using it even just with a regular mouse and keyboard. I don't know how good the resolution is, but I love the thought that I could project a bunch of huge screens into my room - while programming, for example - or look at graphs and charts rendered in actual 3D on my desk.

IMHO this is probably not as exciting for immersive gaming as the Rift etc will be, but for practical, boring things it's not hard to come up with clever use cases.
posted by vanar sena at 5:05 AM on January 23, 2015 [5 favorites]


Science fiction became real for me years ago, but now y'all are grey and there's turtles running antique shops and no one believes me.
posted by sonascope at 5:06 AM on January 23, 2015 [8 favorites]


If, in the future, gators want to join us in our own virtual spaces then they'll likely have to visually deal with whatever horror-landmines we leave for them. Trolling someone else is all fun and games until their space codes for a fairly sizable spider/silverfish/millipede crawling up from the unseen side of the intruder's desk. Which would be very, very light fare. Imagine a defense program that tricked the attacker into meandering through a partial Silent Hill P.T. run.
posted by Slackermagee at 5:09 AM on January 23, 2015


Also, that virtual Minecraft thing looks like playing Legos, but without the terrible burden of having to actually touch things. Of course, you can use object mapping to turn the tissue boxes on your feet into glamorous Italian shoes, so there's that.
posted by sonascope at 5:09 AM on January 23, 2015 [1 favorite]


Probably not going to become A Thing until they look like ordinary glasses. Wireless power will make that happen sooner rather than later.

Actually, no... probably not going to become A Thing until Apple releases their own interpretation of the concept that seems so intuitive and natural to use for everything that everyone forgets it was first envisioned by Microsoft Google Oculus Silicon Graphics William Gibson L. Frank Baum.
posted by Slap*Happy at 5:14 AM on January 23, 2015 [1 favorite]


I'll be more than skeptical until it comes with a magical cure for the widespread cultural aversion people have to dealing with other people who are wearing something that could record them or communicate things they can't see.

The pictures I've seen of the current prototype don't look like anything that someone would want to wear to a bar or restaurant. And given the "glasshole" backlash, that might not just be a function of Microsoft's often-dubious design skills.
posted by Slothrup at 5:17 AM on January 23, 2015 [2 favorites]


I'm ridiculously excited about this, and I think the make-or-break issue will be the price point; it feels like they're targeting a more commercial/game audience, which means something in the console range, maybe? That would be interesting.

deadwax: See glasshole.

I don't see that being a problem since this seems to be a mainly "indoors" AR, more Oculus Rift than Google Glass.

GallonOfAlan: Absolutely no holography involved in this thing either as far as I can tell. Marketing wankspeak ahoy!

Well yes, but it does sound a hell of a lot better than augmented reality goggles, doesn't it?

What would really make this for me would be whatever Microsoft's policy regarding openness ends up being. If they're going to release an SDK and allow people to develop for it, it would be the first time in a decade I'm willing [excited even] to be an early adopter. If it ends up being a closed platform with restrictions on content and usage then I can just wait another 5 years until a better iteration comes along.
posted by xqwzts at 5:22 AM on January 23, 2015


I've typed 2000 word stories on my ipad and found it pretty comfortable; how is it different from that?

It's not, and I find the iOS keypad almost completely unusable. Swype is the only reason I'm at all comfortable entering text on a touchscreen.


Well yes, but it does sound a hell of a lot better than augmented reality goggles, doesn't it?

Not to me, but I've been super-excited about AR for years, so...
posted by Pope Guilty at 5:27 AM on January 23, 2015


It's probably just a play on "holodeck".
posted by Steely-eyed Missile Man at 5:41 AM on January 23, 2015


I've typed 2000 word stories on my ipad and found it pretty comfortable; how is it different from that?

Maybe my hands are just too big or something, but I find ipad typing unworkable except for short comments here or answering texts. Anything longer than a paragraph means that I shift to my laptop. I like the ipad because it is light and silent, so it works well for browsing and reading, but it doesn't work for me at all in terms of producing anything.

I could see these kinds of immersive goggles being useful for people who work with CAD and GIS as well as the above-mentioned training purposes, but it's not like I go through life wishing I had a hologram-style overlay on the world around me, and even more so an augmented overlay with a Microsoft-designed user interface. I have to use Windows at work and while it overall works ok, it is jam-packed with small irritations (almost microaggressions, really) that it is entirely on you as the user to adapt and adjust to.

Eventually one of these is going to get it right, like Apple did with the ipad, making something that works for millions of people, and they will become at least semi-ubiquitous, but none of them so far has given me any excitement or interest.
posted by Dip Flash at 5:44 AM on January 23, 2015


The year that sci-fi becomes real

Came to post excitedly about FTL, but the real issue will be authoring systems. That is, unless it's released with five or six FPSes and a couple of next-gen dance-with-an-imaginary-partner apps, sells a million, and then they announce the next x-box-ultra-vr-smell unit for 2020...
posted by sammyo at 5:44 AM on January 23, 2015


Virtual/augmented reality goggles are, for me, the equivalent of those flying cars people promised my parents. I've been expecting them forever, and the fact that they're actually here feels incredible. This will almost certainly disappoint, but I can't really see that, because it feels like so many early 90s sci-fi movies are coming to life.
posted by Bulgaroktonos at 5:46 AM on January 23, 2015 [2 favorites]


If this takes off my long range investment plan is to buy shares in motion sickness pill companies.
posted by srboisvert at 5:49 AM on January 23, 2015 [1 favorite]


No wireless, less space than a nomad, lame.
posted by empath at 5:55 AM on January 23, 2015 [1 favorite]


Metafilter: history has shown that the future has been better than we tend to fear.
posted by Sing Or Swim at 6:06 AM on January 23, 2015 [1 favorite]


"Today, when people say they want a “tablet” what they really mean is an iPad..."

I have never meant that.

I have a Kindle Fire, and it is not world-changing. It's kind of convenient to read 800-page books without having to carry them around, or (rarely) web surf in bed, or (very rarely) watch movies when I'm too sick to even bother going downstairs. But my life would not be significantly different if tablets did not exist, and there is nothing about an iPad that makes it any more desirable to me than the Kindle.
posted by Foosnark at 6:13 AM on January 23, 2015


Foosnark, I'm surprised you say that about the Fire. The ePaper Kindle is so much better for me (in terms of reading) that it's kind of impossible to imagine reading a long book on a typical tablet.
posted by sonic meat machine at 6:14 AM on January 23, 2015 [3 favorites]


My Android tablet is a million times better than my laptop for reading PDFs and cbzs and the like. One. Million. Times.
posted by Pope Guilty at 6:19 AM on January 23, 2015 [4 favorites]


Because if there's anything that could make Google Glass less of a douche-tech catastrophe, it's having it come from Microsoft instead.
posted by Naberius at 6:20 AM on January 23, 2015 [4 favorites]


I would prefer good speech-to-text, and then a small microphone (on a Bluetooth earbud?) that picked it up subvocally. That would save me from gorilla arms or from the need to finally learn to touch-type. :7)

(Yes, I know that separating metadata from data -- e.g., commands from the stream of other spoken words -- is difficult both for the computer and the user. But "Hey, Siri" (a.k.a. "Ahoy, telephone!") shows that this isn't insoluble.)

On an unrelated note, will goggles like this make wall-o'-glass displays -- like NASA Mission Control, or the NORAD Command Center in "WarGames" -- a thing of the past?
posted by wenestvedt at 6:22 AM on January 23, 2015


No, wenestvedt, because wall-o'-glass displays are intended for quick monitoring of vital information. In DevOps offices, for example, you might have a giant display of your response times across a hundred servers. This is intended for "we are having a meeting in the conference room, but we can still glance out the window and see the giant map is red" uses, which this would not affect.
posted by sonic meat machine at 6:28 AM on January 23, 2015 [1 favorite]


If this thing comes out and you can't control it with your phone, Microsoft will have fucked up hard. Right now both of the proposed control mechanisms (voice and gesture) are nascent technologies that haven't gotten good enough to be useful in most use cases. If it comes out this year (like they are hinting), large parts of its functionality will likely be broken.

That said, I'm incredibly excited about it. Even if it's terrible when the first revision comes out, it's going to blow people's minds. Most of the important technologies just aren't good enough yet, but might make the leap to viability within the next few years. If the hololens manages to fulfill half of its promises, it will be incredible, and it will only get better.
posted by Rugglution at 6:33 AM on January 23, 2015


Kids. Lawn. Off. Meh.
posted by flabdablet at 6:35 AM on January 23, 2015 [1 favorite]


it's not like I go through life wishing I had a hologram-style overlay on the world around me

I have to wonder if maybe this is the difference between excited/not excited, because I totally go through life wishing that.
posted by jason_steakums at 6:37 AM on January 23, 2015 [18 favorites]


I put it in the same pile as IMAX, 3D TV and augmented reality as a whole - it's authentically impressive, it feels like the future, and it does nothing for everyday life. You can pile up a lot of verticals for it - design, art, entertainment, training, medicine - which have been there for VR for as long as VR's been around (I think my first experience was around twenty years ago, on a visit to Georgia Tech's VR lab, which was profoundly impressive at the time. I wrote a short story about how it could be used in torture, which is quite the most gruesome thing I've ever written.)

You can say the same was true for tablets, which existed in various niches for a long time before breaking out, but while tablets do fit into everyday life in a reasonable fashion now that they're small and cheap enough, they're not intrusive. You don't have to change modes to use them. It's telling that Microsoft's Shiny Future video that preceded the reveal of the Holowotsit - and I for one call fire and brimstone upon the head of Microsoft marketing for traducing a scientific term, and will not forgive that easily - was heavily biased towards the magic and pixie-dust aspect of having apparently solid 3D objects projected into free space for unaided viewing. Not the subsequent demo, which was the same old slow-dancing-in-silly-face-furniture VR demonstration as we've had for decades. All it needed was day-glo pastels and a DX-7 synth-sax solo over Linndrums.

I don't doubt it's fantastic. So was flying on Concorde. I do doubt it'll be a significant part of everyone's life on a par with other personal tech for a very long time - and Microsoft has as much as admitted it with that promotional video. Until we get magic pixie-dust volumetric solid displays, this is going to be a need-to-have rather than a want-to-have, no matter how much we want to want it.

Having said that, I do expect to see some really good verticals in structural engineering, maintenance, surgery, remote sensing and data analytic visualisation. It'll be piecemeal, it'll be inaccessible and it'll be expensive, like a lot of niche tech, but a lot of people will have well-paid fun there.
posted by Devonian at 6:38 AM on January 23, 2015 [1 favorite]


Step 1: HoloLens
Step 2: "Dream Park"
Step 3: ???
Step 4: Profit!
posted by Standeck at 6:45 AM on January 23, 2015 [2 favorites]


"Today, when people say they want a “tablet” what they really mean is an iPad..."


Just the opposite: people often seem to refer to Android and Windows tablets as iPads. The name seems to be used generically for any kind of tablet.
posted by octothorpe at 6:53 AM on January 23, 2015 [1 favorite]


Before you turd or meh all over this... look back at a post that is a little less than five years old. In January 2010 another company announced an innovative new product.

If a significant number of metafilter commenters think a new piece of tech is ridiculous, that's been my clue for years that it will be epochal. From that thread. I'd dig back further, but I recall similar poo-pooing about the ipod, and I bet if we went back far enough MeFites would be pointing at the sky and mocking airplanes.
posted by maxsparber at 6:58 AM on January 23, 2015 [2 favorites]


For me it's just that there has been so much excitement about VR/AR prototypes in the last few years, particularly in the gaming space, that I've just reached the point where I want to see at least one product actually hit retail shelves (and gauge the reaction to that) before I get excited about it.

And then of course there's the history of motion control to look at... from revolution to fad to irritant in a few short years.
posted by selfnoise at 7:03 AM on January 23, 2015 [1 favorite]


I will say as a white collar office drone I actually think something AR would be a really cool replacement for my monitor... I could dynamically organize virtual monitors all around my desk instead of papers!

It's terrible but that actually gets me more excited than the gaming applications.
posted by selfnoise at 7:07 AM on January 23, 2015 [5 favorites]


I'd dig back further, but I recall similar poo-pooing about the ipod, and I bet if we went back far enough MeFites would be pointing at the sky and mocking airplanes.

And do you think threads about tech that actually failed would be free of poo-pooing?
posted by escabeche at 7:12 AM on January 23, 2015 [5 favorites]


Oh, that reminds me. I'm sick of people reading my screen as they walk by. With this, nobody would see my screen (because I'm password-protecting it, that's why, so they can't see it with their Holowhatsits)!
posted by Mogur at 7:13 AM on January 23, 2015 [1 favorite]


I've actually used the Oculus DK2 and it's beyond impressive, though I don't think it will be a big mainstream success in its current state. It's too heavy and the graphics aren't quite high resolution enough, but those are both solvable problems. Drop the weight by 30% and double the resolution, and you have, at the very least, the next Wii, and at best the next iPhone.

These AR and VR goggles are going to be world-changing, eventually. Maybe on the scale of 5 years rather than 2, but definitely soon. And I don't think anyone here has the vaguest clue what they'll be used for beyond porn and video games.

I'll tell you this much, though. I'll pay $1000 for a hololens if it lets me play tabletop 3D Starcraft.
posted by empath at 7:14 AM on January 23, 2015 [1 favorite]


Also, one of the things that hololens can do, which I thought was pretty neat, is exempt certain things in the world from being 'blacked out', so you can have a computer desk on Mars, or whatever. One of the articles I read about it described how the lens actually projected a mouse cursor onto the computer screen, which he could control with an actual mouse, but when he moved the mouse cursor off the edge of the monitor, it just kept going into the real world.
posted by empath at 7:18 AM on January 23, 2015 [3 favorites]
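
(A sketch of how that cursor hand-off could plausibly work; the Monitor fields and render_hologram callback below are invented placeholders, not Microsoft's API. The idea: treat the physical panel as a known plane in room coordinates, and once the desktop cursor leaves the panel's pixel bounds, draw a holographic cursor at the corresponding point on that same plane.)

    # Hypothetical sketch only: hand the mouse cursor off from the physical
    # monitor to a world-space hologram when it leaves the panel.
    from dataclasses import dataclass

    @dataclass
    class Monitor:
        origin: tuple      # room-space position of the panel's top-left corner (x, y, z), metres
        right: tuple       # unit vector along the panel's width
        down: tuple        # unit vector along the panel's height
        width_m: float
        height_m: float
        width_px: int
        height_px: int

    def cursor_world_position(mon, cursor_px):
        # Map desktop pixel coordinates onto the monitor's plane in room space;
        # coordinates outside the panel simply continue along the same plane.
        u = cursor_px[0] / mon.width_px * mon.width_m
        v = cursor_px[1] / mon.height_px * mon.height_m
        return tuple(o + u * r + v * d
                     for o, r, d in zip(mon.origin, mon.right, mon.down))

    def update_cursor(mon, cursor_px, render_hologram):
        on_panel = (0 <= cursor_px[0] < mon.width_px and
                    0 <= cursor_px[1] < mon.height_px)
        if on_panel:
            render_hologram(None)  # the real screen is already drawing the cursor
        else:
            render_hologram(cursor_world_position(mon, cursor_px))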


The Future: Fuck You if You Can't See in 3D!
posted by entropicamericana at 7:20 AM on January 23, 2015 [1 favorite]


In the future, instead of wiping your feet or taking off your shoes when you enter someone's house, you will instead accept and install their "beautiful, clean house interior" patch.
posted by 445supermag at 7:22 AM on January 23, 2015 [12 favorites]


I recently picked up Vernor Vinge's Rainbows End, which is more or less several hundred pages of "what could you do if everyone had one of these?" One of the more moving sections involves a teenager who designed a personal overlay for her grandmother's nursing home, transforming it into an ancient forest with her nana as witch-queen.

Personally I'm pretty excited about the holodog that shows up around 2:00 in the promo video.
posted by theodolite at 7:43 AM on January 23, 2015 [1 favorite]


So, will this make cubicle life worse or better? Even more like battery hens, or spirits flying free from their corporate cages?
posted by Devonian at 7:48 AM on January 23, 2015


The Future: Fuck You if You Can't See in 3D!

Shouldn't impact your enjoyment of it at all. Headtracking is the real magic here, not depth perception.
posted by empath at 7:48 AM on January 23, 2015


Games. I think I would like 3D games. Anything else? I dunno. Have Apple look into it.
posted by uraniumwilly at 7:52 AM on January 23, 2015


fwiw, oliver kreylos -- an 'immersive 3D computer graphics' developer/researcher -- provides a nice taxonomy of holographic displays: "A holographic display is a system that creates the visual illusion of solid three-dimensional objects by recreating depth cues 1 through 5 for at least one viewer at a time."
  1. Perspective foreshortening: farther away objects appear smaller
  2. Occlusion: nearer objects hide farther objects
  3. Binocular parallax / stereopsis: left and right eyes see different views of the same objects
  4. Monocular (motion) parallax: objects shift depending on how far away they are when head is moved
  5. Convergence: eyes cross when focusing on close objects
  6. Accommodation: eyes' lenses change focus depending on objects' distances
also btw, re: input methods, check out his CAVE!
Messing around with 3D video - "We had a couple of visitors from Intel this morning, who wanted to see how we use the CAVE to visualize and analyze Big Data™." :P
posted by kliuless at 7:56 AM on January 23, 2015
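
(For the curious: cues 1 and 4 in that list fall straight out of an ordinary pinhole projection once the renderer knows the tracked head position. A toy illustration, not Kreylos's code:)

    # Toy pinhole projection: slide the tracked head sideways and a
    # room-anchored point shifts on the image plane by an amount that depends
    # on its depth (motion parallax); the 1/depth scaling is perspective
    # foreshortening.
    def project(point, head, focal=1.0):
        # head-relative coordinates; z is distance in front of the viewer
        x, y, z = (p - h for p, h in zip(point, head))
        return (focal * x / z, focal * y / z)

    near, far = (0.5, 0.0, 2.0), (0.5, 0.0, 10.0)    # same direction, different depths
    for head in [(0.0, 0.0, 0.0), (0.1, 0.0, 0.0)]:  # head slides 10 cm to the right
        print(head, project(near, head), project(far, head))
    # The near point shifts about five times more on screen than the far one.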


And I don't think anyone here has the vaguest clue what they'll be used for beyond porn and video games.

What else do we need?
posted by davros42 at 8:02 AM on January 23, 2015 [1 favorite]



I'm pretty excited about this, or at least the possibilities as I can see them. I would love, love, love to work in some sort of 3D environment. I constantly have to translate the 3D pictures and systems in my head to 2D for work. If they can create some sort of 3-dimensional mind-mapping-like tool I will pee my pants. I pee my pants just thinking about even the possibility of being able to work and create in such an environment. Also if this sort of work can be done standing up and even moving around....

I am a proprioceptive learner and thinker and do, so, so much better when I'm able to move around.
posted by Jalliah at 8:25 AM on January 23, 2015


Oh and Sherlock's Mind Palace type tool.

Please, pretty please?

Anyone want help with some visioning of this because I'm all in, if it's possible.
posted by Jalliah at 8:28 AM on January 23, 2015 [1 favorite]


If Microsoft can get the price of HoloLens right, it could become the must-have Minecraft accessory at Christmastime.
Me? I'm just looking forward to the Black Mirror episodes about this.
posted by fullerine at 2:04 AM on January 23 [18 favorites]


What if phones, but too much?
posted by emelenjr at 8:43 AM on January 23, 2015 [3 favorites]


The comparison to google glass and glassholes is mistaken - this is clearly a tool for home or office, not something you'd wear all the time.

I'm curious to know whether it's a 3D effect or a flat transparent LCD. Either way, making a transparent screen onto which arbitrary things can be projected is no mean feat.
posted by memebake at 8:56 AM on January 23, 2015 [2 favorites]


Shouldn't impact your enjoyment of it at all. Headtracking is the real magic here, not depth perception.

This is what I was wondering. My dad has sight in only one eye, so he can't enjoy things like 3D movies. Is this a different technology entirely that wouldn't have the same limitations?
posted by SpacemanStix at 9:02 AM on January 23, 2015


Although I can't imagine how gestures would possibly not suck, I would be all over augmented reality Minecraft. Like Legos* but with flowing lava and critters and monsters and mine carts running round!

But watching the HoloLens promo video I'm having a hard time getting past the asshole wearing a big goofy visor in his kitchen so he can watch a fucking virtual television screen. They didn't add the part where augmented reality TV commercials float 12 inches in front of your face, block your entire field of vision, and can't be dismissed.

* Yes, pedants, I said Legos.
posted by usonian at 9:13 AM on January 23, 2015


Tweaking depth of field/focus differentially in a moving image can produce a pretty convincing illusion of 3d, especially if reinforced by some visual depth cues.

Split depth gifs can trick even single eyed folks, apparently.

Some examples: Cat, Seal.
posted by bonehead at 9:15 AM on January 23, 2015


At first it was only a game. How naïve.

GoGGle (tm) me to your dream, then send us to the Large Array.

Broadcast us to the La Grange transponder, then repeat, repeat and repeat.

We will meet us there.
posted by mule98J at 9:17 AM on January 23, 2015


This is what I was wondering. My dad has sight in only one eye, so he can't enjoy things like 3D movies. Is this a different technology entirely that wouldn't have the same limitations?

I mean he obviously wouldn't see it in true 3d, but it would be the same as looking at the real world with those objects in it. He can use parallax and size cues as he moves to determine depth, the way he does for anything else.
posted by empath at 9:21 AM on January 23, 2015


I would LOVE this for music production.
posted by feckless fecal fear mongering at 9:29 AM on January 23, 2015 [2 favorites]


I would love this for music consumption, streamed live shows would be rad, the illusion of just you and the performer in a room when it's going out to thousands of people.
posted by jason_steakums at 9:35 AM on January 23, 2015 [3 favorites]


Fantastic! I was just thinking how desperately hungry the world is for dazzling new entertainment technologies that blur the boundaries between fiction and reality. This is going to make so much difference to so many real people urgently in need of distraction from how their leaders are selling them out.

Sorry I can't help snarking. The tech looks neat, but what does it actually amount to? A new way of looking at digitally rendered images? We're getting a little too on the nose with the whole valuing appearances over reality thing aren't we?

Ah hell, probably just this cold bug making me cranky, but I just can't get excited about this stuff anymore. Tech burnout syndrome, I suppose.
posted by saulgoodman at 9:36 AM on January 23, 2015


Absolutely no holography involved in this thing either as far as I can tell.

Sheesh. The word "hologram" has been widely used to refer to a projected image that appears to be a 3D object in the world for at least 30 years. Unless you've got a better, catchier word for Princess Leia begging for Obi Wan Kenobi's help, people are gonna call them holograms.
posted by straight at 9:38 AM on January 23, 2015 [1 favorite]


Ooh and the performance art and theatre you could see with this. Oh man. It has the potential to do some really good things for all kinds of intimate physical artistic performances. You get like a ring of Kinects around the performance space and broadcast all of that data so that anyone watching can walk in and around it.
posted by jason_steakums at 9:40 AM on January 23, 2015


Sorry I can't help snarking. The tech looks neat, but what does it actually amount to? A new way of looking at digitally rendered images? We're getting a little too on the nose with the whole valuing appearances over reality thing aren't we?

Imagine saying that while looking at us all sitting in the same room together. Or shooting us with lasers. The snark possibilities are endless.
posted by SpacemanStix at 9:40 AM on January 23, 2015


I hope this comes out. And I hope MS aren't total dicks and lock it to Windows only.
posted by rifflesby at 9:42 AM on January 23, 2015


It's not exactly clear that there isn't holography happening here. At the very least, there seems to be something going on in the lenses beyond an LCD screen:

Each lens has three layers of glass—in blue, green, and red—full of microthin corrugated grooves that diffract light. There are multiple cameras at the front and sides of the device that do everything from head tracking to video capture. And it can see far and wide: The field of view spans 120 degrees by 120 degrees, significantly bigger than that of the Kinect camera. A “light engine” above the lenses projects light into the glasses, where it hits the grating and then volleys between the layers of glass millions of times.
posted by empath at 9:48 AM on January 23, 2015 [1 favorite]


They didn't add the part where augmented reality TV commercials float 12 inches in front of your face, block your entire field of vision, and can't be dismissed.

I feel like a whole lot of SF authors missed (and are still missing) the cellphone/Facebook phenomenon.

Yes we'll have all that cool technology, but we won't own it, and it won't be designed with our needs, interests, and desires as the first priority.
posted by straight at 9:49 AM on January 23, 2015 [5 favorites]


500 foot tall projections of luchadores stomping all over your city as they wrassle (broadcast live from halfway around the world), random encounters with dragons or aliens or Batman or whatever to spice up your daily life, watching a movie on THE ENTIRE SKY, seeing strange gas giants hovering over Earth as if we're on a distant moon, and only you can see any of it unless you want to share it with someone. Tap into collaborative art feeds where people virtually change the environment around them. Have grocery stores upload models of their layouts so you can find the things you're looking for with x-ray vision. Play as Han or Luke in the Millennium Falcon turrets shooting down TIE fighters with the aid of a swiveling office chair. I'd wear doofy goggles in public and wave my arms around for that.
posted by jason_steakums at 9:53 AM on January 23, 2015 [5 favorites]


One thing to note is that they seem to be using some kind of dynamic zone plating so that your eye focus works the way it works in the real world, rather than focusing on the screen, you focus on where the virtual object is supposed to be.
posted by empath at 9:54 AM on January 23, 2015 [2 favorites]


Yes we'll have all that cool technology, but we won't own it, and it won't be designed with our needs, interests, and desires as the first priority.

No, this was explicitly covered in Gibson's first three books - the technology has its own purposes that are sometimes aligned with humanity's, most often orthogonal, and on the rare, terrifying occasion, at odds. In these strange days, we're all horses for the loa to ride, embedded in our deepest cookies.
posted by Slap*Happy at 9:55 AM on January 23, 2015 [2 favorites]


I think that whole line of reasoning falls under the heading refusing to "own it."
posted by saulgoodman at 10:02 AM on January 23, 2015


Sure, and some people refuse to own TVs, and computers, and cars, etc, etc. Doesn't mean their life isn't impacted by it.
posted by empath at 10:03 AM on January 23, 2015 [1 favorite]


I hate using touchscreens for anything but simple tasks and I'm sure I'll hate gestures for all the same reasons. I need haptic feedback to compensate for the uselessness of my fat, clumsy fingers.

But in theory, for a smart enough system, any physical object could be used as a haptic input device. For example, place a pen on a desk and have the system lock in on it. Rotating the pen rotates the virtual object onscreen. Or pick up the pen and use it to "draw" on the desk. Hell, you could use a traditional keyboard and mouse (unplugged!) if you really wanted to. As long as the system can identify an object and interpret your interaction with it, it'll work as an input device.
posted by dephlogisticated at 10:43 AM on January 23, 2015 [3 favorites]
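
(A minimal sketch of that "any object as a controller" idea, assuming a hypothetical tracking call; track_object and set_model_rotation are invented names, since no public SDK existed at this point:)

    # Drive a virtual object's rotation from a tracked physical prop (say, a
    # pen on the desk). track_object() and set_model_rotation() are
    # hypothetical placeholders for whatever the headset's SDK ends up exposing.
    def couple_prop_to_model(track_object, set_model_rotation, prop_id="pen"):
        baseline = None
        while True:
            pose = track_object(prop_id)  # e.g. {"yaw": ..., "pitch": ..., "roll": ...} in degrees
            if pose is None:
                continue                  # prop occluded or out of view this frame
            if baseline is None:
                baseline = dict(pose)     # first good frame defines "zero"
            # apply the prop's rotation relative to where it started
            delta = {axis: pose[axis] - baseline[axis]
                     for axis in ("yaw", "pitch", "roll")}
            set_model_rotation(delta)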


No, this was explicitly covered in Gibson's first three books - the technology has its own purposes that are sometimes aligned with humanity's

The tech finds its own uses for...itself.
posted by The Tensor at 10:49 AM on January 23, 2015


the technology has its own purposes that are sometimes aligned with humanity's

I'm thinking of the more mundane sense where it's owned and serves the purposes of the company that makes it rather than by and for the end user. Like that Chiang short story I linked above that assumes constant searchable video of one's life will be owned by the user like a camera rather than some service Google offers for "free" when we agree to let their ad robots search it.
posted by straight at 10:51 AM on January 23, 2015


I'm beyond excited for VR gaming, and unless reviewers pan the commercial models, I'll almost certainly be an early adopter of the Rift, or the HoloLens, or whatever device is most viable/affordable.

But this kind of thing does give me pause. It's getting easier and easier for people to retreat into their own virtual worlds, and away from the real one (and the other people that live in it). That scares me a little bit.

If wearable AR becomes affordable, unobtrusive, and socially accepted enough to become as common as smartphones—and there's no reason to suppose it won't—you'll have large numbers of people whose daily life isn't about experiencing reality, but about experiencing reality as mediated by Google, Facebook, etc.. And by a host of their own apps and settings that customize the world to their liking.

Which is arguably already the case, to some extent—but AR will take it to a much more primal, fundamental level.

(You know that scene in Futurama where Fry has advertisements beamed into his dreams? Yeah. Not that farfetched. Except the advertisers will have unprecedented detail about our behavior to data-mine, down to our unconscious eye movements. Did your eyes linger on a certain pair of pants at the store? Did you stop for 40 seconds next to a poster for an upcoming concert? You'll get ads for it the next day—and it'll be added to an ever-growing profile of your tastes, habits, and demographics.)

On the other hand, it'll be cool to install the app that makes everyone else on the sidewalk look like they have duck heads, or whatever.
posted by escape from the potato planet at 11:03 AM on January 23, 2015


Did your eyes linger on a certain pair of pants at the store?

I have to confess, an overlay that shows arrows pointing at things with information like "this will actually fit you" and "this pair won't make your ass look fat" would be extremely useful.

Grocery shopping, too.
posted by feckless fecal fear mongering at 11:06 AM on January 23, 2015 [1 favorite]


Being able to walk around with these is a long, long way off, I think. A lot of the functionality depends on scans of the room being pre-loaded. I don't think it can do stuff like table-top minecraft on the fly.
posted by empath at 11:11 AM on January 23, 2015


It does read from the Wired article that they may be locking it to the Windows system, however you choose to define that. Which will be interesting to watch pan out, if it's successful, in terms of what IP they have and how they choose to use it.
posted by Devonian at 11:11 AM on January 23, 2015


sonic meat machine: ...wall-o'-glass displays are intended for quick monitoring of vital information.

Oh, no, I get that: I have plenty of real estate devoted to Nagios and Orion and MRTG droppings. :7)

I just mean that you could offer a lot of the same data to those people, without needing the stadium-sized room, if every NOC staffer was in his little beige cubicle with the big displays only virtually present.
posted by wenestvedt at 11:21 AM on January 23, 2015


Imagine if it eventually shrinks to the size of contact lenses.
posted by umbú at 11:33 AM on January 23, 2015


It does read from the Wired article that they may be locking it to the Windows system, however you choose to define that.

The Hololens itself runs a branch of Windows 10 as its OS. With the amount of custom silicon that's likely to be in the Hololens, I would think it would probably take a herculean effort to port another OS to it. It's not like the Rift, which is essentially a fancy heads-up display attached to your PC - it's a self-contained computer.
posted by Fidel Cashflow at 11:35 AM on January 23, 2015


The Hololens itself runs a branch of Windows 10 as its OS.

The Hololens is years away from release, it's not like this is set in stone. Also, I seriously doubt that they're going to be using much custom silicon-- even their flagship Xbox is now made out of essentially PC hardware. The "holographic processing unit" sounds like marketing nonsense and is probably just an NVidia or ATI GPU. So I do not see any fundamental technical obstacle to being able to run e.g., Linux on it.

As I said about the Magic Leap, there's a big gap between what you can make if you aren't concerned about cost or size or power (like their current demo unit which is tethered to power and has basically a laptop that you have to carry), and what you can make (profitably) for less than $1000 and which has to run on batteries (keep in mind it also has to power a kinect) and fit comfortably on a head for periods of time longer than a fifteen minute demo session. You can get this type of display (a holographic waveguide) from Vuzix today, but it'll set you back $6k for one eye. Maybe Microsoft will be able to get this price down, but precision optics have always been expensive.
posted by Pyry at 11:59 AM on January 23, 2015


The Hololens is years away from release, it's not like this is set in stone.

I'm just going by what Microsoft said at the press conference - it runs a version of Windows 10.

The "holographic processing unit" sounds like marketing nonsense and is probably just an NVidia or ATI GPU. So I do not see any fundamental technical obstacle to being able to run e.g., Linux on it.

I had read that Microsoft claimed that the HPU is separate from the CPU and the GPU. Doing a little more googling, some people seem to think that the HPU is just an Intel Atom 'Cherry Trail' chip. If that's the case, then you're right, there really shouldn't be any massive technical challenges to getting linux to 'run' on the thing, but getting a comparable user experience would probably be a herculean effort.
posted by Fidel Cashflow at 12:34 PM on January 23, 2015


The Hololens is years away from release

They're shooting for dev kits to be out this year, and it's supposed to be released "within the Windows 10 timeframe," which seems to point to this year or next (assuming no delays).
posted by kethonna at 12:38 PM on January 23, 2015


The "holographic processing unit" sounds like marketing nonsense and is probably just an NVidia or ATI GPU.

No, I don't think that's the case. There is more to this than projecting an image on a screen. It appears to be using waveguides and diffraction gratings to actually alter the apparent source of the light so it doesn't look like it's floating in space in front of you. It's also altering the opacity of the screen to block out incoming light selectively. I don't think that's the kind of thing that you can just do with an off-the-shelf GPU. It's probably using an Nvidia board or whatever to generate the actual 3D graphics, but there's more going on with these glasses than that.
posted by empath at 1:39 PM on January 23, 2015 [1 favorite]


They're shooting for dev kits to be out this year,

Yeah, but from all the descriptions, the gear they have now is huge, bulky, uncomfortable and hot. I doubt they'll be able to miniaturize to the point that consumers can do something with it for years. Comparatively, oculus is much simpler tech, and they're still probably 9 months to a year away from release.
posted by empath at 1:41 PM on January 23, 2015


I hope I'm just being cynical or paranoid, but I can't help but feel that all the cool consumer demos of AR, VR, and wearable technologies are Trojan horses for the business use cases that would eventually drive the real profits on these devices: extending user/consumer tracking and analytics from the online and point-of-sale spaces to all face-to-face interactions.

How much do you suppose that, for example, Best Buy would pay if they could have every salesperson wear fairly discreet glasses that recognize your face and superimpose your name, recent purchase history, perhaps any notes dictated by the last salespeople you spoke with? If they could see that the last five times you visited the store across town you really were just looking, but the last time you were in this one you spent $1,500 on a new iMac? How much more would they pay if they could add similar data from other organizations, physical and online, as provided by a big data broker?

Heck, I'm just a consultant, but I'd drop a big chunk of a year's salary on a device that could attach my notes about a person to their face and voice better than my memory can in the first weeks of a project.

The reactions to Google Glass demo units make me hopeful that people really do want some degree of privacy in their face to face interactions. Maybe wearable privacy is the line in the sand that consumers will finally, actually defend against the incursions of marketers. But I had the same hopes seven or eight years ago about the commercialization of online social networks: people concerned about Facebook privacy are largely dismissed as paranoid. Seven or eight years before that I thought that DoubleClick trying to snoop on everything you read online would clearly be intolerable to anyone with sense: now they essentially own Google.

It's clearly no coincidence that one of the biggest organizations to display interest in current AR tech is Salesforce. I know that the current version of Oculus is all screen and no input, but it's very hard to believe that they were bought by the company with the most mature face-recognition technology (and by far the biggest database of faces) in the world without that fact entering into anyone's business plans. I hope I'm wrong about this, and that Minecraft and NASA telepresence turn out to be the AR technologies that change the world. But I really doubt it.
posted by CHoldredge at 2:09 PM on January 23, 2015


People aren't going to walk around with these things or google glass anytime soon (within the next ten years)
posted by empath at 2:23 PM on January 23, 2015


Yeah, but from all the descriptions, the gear they have now is huge, bulky, uncomfortable and hot.

I'm not so sure; these hands-on reports you're seeing from Ars Technica and Wired seem to be from a while ago (Wired mentions their demo happening last October) and have presumably been embargoed until now. The on-stage demo they did at the Windows 10 event this week shows someone wearing the functioning glasses, and they look the same size as they do in the glitzy PR images.

The on-stage demo is a bit curious: you see some 'through the eyes of Laura' footage and it looks pretty good, but then they switch to a third-person view for most of the 3D-modelling demo. Not sure why they did that - was the first-person view not impressive enough or something? Perhaps because Laura's outstretched arm will always appear behind the holograms that are supposed to be several metres away, thereby ruining the effect? (You can see this happening when she launches the app while they still have the first-person view on.)
posted by memebake at 2:28 PM on January 23, 2015


They're shooting for dev kits to be out this year, and it's supposed to be released "within the Windows 10 timeframe," which seems to point to this year or next (assuming no delays).

The Google Glass consumer edition was supposed to be released at the end of 2013 for less than $1,500, and neither of those things came true. And the Glass is technologically much simpler than the Hololens.

There is more to this than projecting an image on a screen. It appears to be using waveguides and diffraction gratings to actually alter the apparent source of the light so it doesn't look like it's floating in space in front of you.

I think Microsoft is capitalizing on confusion about technical issues on this point.

I don't know of any better way to try to explain this than with a fake dialogue, so just bear with it.

Q: Why is the Oculus rift so bulky? Why couldn't it just put one screen very close to each eye? Why add weight and volume and complexity with those lenses?
A: Put your finger right up to your eye and try focusing on it-- even if you can do it, it's extremely tiring. The collimating lenses in the oculus rift optically project the screen so it seems much farther away than it really is.
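(For the curious, here's a rough back-of-the-envelope sketch of that in Python, using the ordinary thin-lens equation with made-up numbers -- nothing specific to any actual headset -- just to show how a lens can push a screen that's physically a few centimetres from your eye out to a comfortable apparent distance:)

    # Thin-lens sketch (illustrative numbers only, not actual HMD specs):
    # with the screen just inside the lens's focal length, the virtual image
    # appears much farther away, where the eye can focus comfortably.

    def virtual_image_mm(focal_mm, screen_mm):
        """Thin lens: 1/f = 1/d_o + 1/d_i. A negative d_i means a virtual image
        on the same side as the screen; we return its distance from the lens."""
        inv = 1.0 / focal_mm - 1.0 / screen_mm
        if inv == 0.0:
            return float("inf")  # screen exactly at the focal plane: collimated
        return abs(1.0 / inv)

    focal = 40.0                       # hypothetical 40 mm focal length
    for screen in (35.0, 38.0, 39.5):  # physical screen distances inside the focal length
        print(f"screen {screen} mm from lens -> appears ~{virtual_image_mm(focal, screen):.0f} mm away")

(Push the screen right up to the focal plane and the virtual image runs off toward infinity, which is why these setups get described as 'collimated'.)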

Q: Why do these AR glasses always have such complicated displays (Vuzix, the Glass, the Hololens)? Couldn't they just have an LCD screen without a backlight right in front of your eye?
A: AR glasses have the same focus problem as the rift, but worse, since you couldn't both focus on a nearby screen and the more distant real world. Either the real world or the display would be blurry. So they also have to optically project their screens farther away.

Q: Just put lenses between the screen/projector and the user's eye like the rift does.
A: That would move the screen optically farther away but it would also distort the image of the real world.

Q: Have the screen/projector and the collimating lenses off to the side or above, and then have a half-silvered mirror or prism in front of the user's eye to combine the real-world's light and the display's light.
A: This is the approach that many AR glasses take, including the Google glass.

Q: So what's the problem?
A: It's bulky, and it just gets progressively bulkier the larger you want the field of view of the display to be. This is why the Google Glass's display is so tiny.

Q: So what do these diffractive and holographic waveguides do?
A: It's a problem of moving light rays from the display/projector into your eye in the right configuration. Lenses (refraction) and mirrors (reflection) are two ways of modifying light rays, and diffraction is a third. Basically (and this is a gross oversimplification and a metaphor) imagine that you had a tiny set of mirrors for every pixel/ray of the source display, and you used that to individually direct every pixel/ray into exactly the right position and direction in front of your eye. Then the whole apparatus could be made as thin as a sheet of glass. (see the figures on this page to get a better idea).
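(If you want to put a number on 'diffraction as a third way to bend light', the standard grating equation d*sin(theta) = m*lambda will do it. A quick illustrative sketch -- the one-micron pitch is made up, nothing HoloLens-specific:)

    import math

    # Grating equation: d * sin(theta_m) = m * lambda. Illustrative values only --
    # the point is that a roughly micron-scale grating pitch bends visible light
    # through large, wavelength-dependent angles, which is what a diffractive
    # waveguide exploits.

    def first_order_angle_deg(pitch_nm, wavelength_nm):
        """First-order (m = 1) diffraction angle for normally incident light."""
        s = wavelength_nm / pitch_nm
        if s > 1.0:
            raise ValueError("no propagating first order for this pitch/wavelength")
        return math.degrees(math.asin(s))

    pitch = 1000.0  # hypothetical 1000 nm (1 micron) grating pitch
    for name, wl in (("blue", 450.0), ("green", 532.0), ("red", 635.0)):
        print(f"{name} ({wl:.0f} nm): first-order angle ~{first_order_angle_deg(pitch, wl):.1f} degrees")

(The angle depending on wavelength is also why, in the next answer, the waveguide handles the colour channels separately.)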

Q: What does this have to do with holograms (i.e., those flat images that have a 3d effect)?
A: Holographic waveguides use holographic gratings (which are just a type of diffraction grating) to split up the incoming light from the projector into different color channels on one end and then again to combine the color channels on the other end. They don't really have much to do with what are commonly thought of as 'holograms' except on a technical level.

Q: Then how does the hololens produce images that really seem to be there?
A: The same way the rift does, just regular old stereoscopy, presenting each eye with a different view of the virtual scene.
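(In rendering terms that's nothing exotic. A minimal sketch, assuming a generic engine and an average IPD -- this is not Microsoft's actual pipeline, just the standard idea: render the scene twice from two eye positions separated by the interpupillary distance:)

    import numpy as np

    # Minimal stereoscopy sketch (hypothetical values, not any vendor's API):
    # render the scene twice, with the camera shifted half the interpupillary
    # distance (IPD) left and right. Everything else is ordinary 3D rendering.

    IPD = 0.064  # metres; a commonly cited average interpupillary distance

    def look_at(eye, target, up=np.array([0.0, 1.0, 0.0])):
        """Right-handed view matrix for a camera at `eye` looking at `target`."""
        fwd = target - eye
        fwd = fwd / np.linalg.norm(fwd)
        right = np.cross(fwd, up)
        right = right / np.linalg.norm(right)
        true_up = np.cross(right, fwd)
        view = np.eye(4)
        view[0, :3], view[1, :3], view[2, :3] = right, true_up, -fwd
        view[:3, 3] = -view[:3, :3] @ eye
        return view

    head = np.array([0.0, 1.7, 0.0])        # head ~1.7 m above the floor
    gaze = np.array([0.0, 0.0, -1.0])       # looking down -Z
    right_axis = np.array([1.0, 0.0, 0.0])

    for label, sign in (("left eye", -1), ("right eye", +1)):
        eye = head + sign * (IPD / 2) * right_axis
        print(label, "view matrix:\n", np.round(look_at(eye, eye + gaze), 3))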

Q: Couldn't the Hololens have a light-field display?
A: Lightfield displays exist, so it's possible. For a VR/AR display, the main advantage a lightfield display would have is that you could optically focus (see the next question) your eyes on different parts of the scene, which is not possible with non-lightfield displays, which are optically at a fixed focus (usually infinity, for ergonomic reasons covered earlier). But the major issue is that a lightfield display has to trade off apparent resolution in order to get the lightfield effects. For example, the Lytro (which is a lightfield camera, but the tradeoffs are symmetric) has an 11 megapixel sensor (~3300x3300 pixels) but only produces 1.1 megapixel images (1080x1080). To make a 1920x1080 lightfield display of the same relative lightfield quality as the Lytro would require a backing display with a resolution of ~5700x3200. Selective optical focus would not seem to be worth such an enormous resolution cost.
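(The arithmetic behind that trade-off, for anyone who wants to check it -- the per-axis factor here is just inferred from the Lytro numbers above, so treat it as a rough estimate; it lands a little above the ~5700x3200 I quoted, but the point, roughly 3x per axis and ~10x the pixels, is the same:)

    # Rough arithmetic for the lightfield resolution trade-off described above.
    # The ~3x per-axis factor is inferred from the Lytro figures (~3300x3300
    # sensor -> 1080x1080 output); it's an illustration, not a spec.

    sensor_px_per_axis = 3300
    output_px_per_axis = 1080
    factor = sensor_px_per_axis / output_px_per_axis   # ~3.06 backing pixels per output pixel, per axis

    target_w, target_h = 1920, 1080                    # desired apparent resolution
    backing_w, backing_h = round(target_w * factor), round(target_h * factor)
    print(f"per-axis factor ~{factor:.2f}")
    print(f"backing display needed: ~{backing_w}x{backing_h} "
          f"({backing_w * backing_h / 1e6:.1f} megapixels) for a {target_w}x{target_h} lightfield")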

Q: Isn't being able to focus on closer and farther away parts of the scene important?
A: Here, unfortunately, the English language uses the word "focus" to mean two different but related physiological processes. One type of focus is what I'll call optical focus, which is controlled by changes in the shape of the eye's lens or, in a mechanical camera, by movement of elements within the lens. Close one eye, hold your finger up to the open eye, and focus on it; you'll notice the background becomes blurry. If you focus on the background instead, your finger will become blurry. This is optical focus. The other type of "focus" is where, with both eyes open, the eyes rotate in their sockets to bring the point of focus to the same spot in each eye. This is frequently called convergence. To see this effect, keep both eyes open, hold a finger in front of your nose, and "focus" on it. You'll notice that the background now splits into a double image. If you focus on the background instead, you'll get a double image of your finger.

The Oculus rift and equivalent displays which show each eye a different image will let you focus by convergence, but you need a lightfield in order to be able to optically focus. But it turns out that optical focus isn't especially important-- people have their lenses removed for cataracts all the time and live without significant difficulties. And in a bright scene where your pupils are contracted, your depth of focus is so large anyway that optical focus doesn't matter much (an ideal pinhole camera would not have optical focus at all-- everything would be perfectly in focus).
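(To make the convergence half of that concrete: vergence is pure geometry, theta = 2 * atan((IPD/2) / distance). A quick illustrative calculation, assuming an average 64 mm IPD -- nothing device-specific -- which also shows why anything more than a few metres away is effectively 'at infinity':)

    import math

    # Vergence ("convergence focus") is pure geometry: both eyes rotate to point
    # at the target, so theta = 2 * atan((IPD / 2) / distance).
    # Illustrative numbers only; 64 mm is just a commonly cited average IPD.

    IPD_M = 0.064

    def vergence_angle_deg(distance_m):
        return math.degrees(2.0 * math.atan((IPD_M / 2.0) / distance_m))

    for d in (0.25, 0.5, 1.0, 2.0, 6.0):
        print(f"target at {d:>4} m -> eyes converge by ~{vergence_angle_deg(d):.2f} degrees")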

Q: So is optical focus then completely useless?
A: Because we're used to focusing both optically and through convergence at the same time, having your optical focus and convergence focus in different places (as with the rift) probably causes some eyestrain. So it would be useful to have optical focus for that reason.

Q: I didn't read any of that. Can you summarize it succinctly?
A: The Hololens works on the same basic principles as the oculus rift, but making a semi-transparent display that is comfortable to put right in front of your eyes is much harder than the opaque display of the rift.

It's also altering the opacity of the screen to block out incoming light selectively.

Yes, I think this is true from the reports. But an LCD shutter doesn't require some kind of exotic computation.

I don't think that's the kind of thing that you can just do with an off-the-shelf gpu.

Even if they were actually doing a lightfield display, it's still the kind of parallel rendering that modern GPUs are exceptionally powerful at, and so an off-the-shelf GPU would actually be your best bet for doing that kind of thing at interactive framerates.
posted by Pyry at 2:57 PM on January 23, 2015 [5 favorites]


Let me rephrase part of that:

Q: What does this have to do with holograms (i.e., those flat images that have a 3d effect)?
A: Holography is about using light interference to encode light waves. Holograms use this to make a record of light reflected off an object so that that light can be recreated later. In a holographic waveguide, holographic elements are used to encode and decode light from a collimated projector so it can be more easily transported in front of the user's eye where it is recreated. The ultimate point of this is to present an image that seems to be 'at infinity' for ergonomic reasons.
posted by Pyry at 3:28 PM on January 23, 2015


Tablets have three big advantages right now: ease of use, battery life, and performance per dollar. A $130-whatever Nexus 7 gen 2 or a $200-250 iPad mini will outperform any new computer in that price range, run for far longer on battery, be easier to transport, and be easier to pick up and use or train/teach someone on.

I couldn't figure out what one of the big "silos" of this would be, other than gaming, but...

Employee training would be my killer app (yeah, okay, the schools can use it too -- take the kids inside a sailing ship and let them fire the virtual cannon). But actual training space set up with whatever fixtures and fittings will emulate the working space is hella expensive -- especially when the damn workspace changes completely every couple of years -- so employees in branch offices all have to be shipped to the nearest training center all the damn time. In northern Canada, this is both expensive and very season-dependent.

On one hand, this is fucking awesome. Imagine ifixit but all the manuals are formatted for this. You simply set the device you need to fix in front of you, and it auto loads a list of manuals for replacing various components. Select the one you want, and it overlays each step right on the device while keeping track of each piece you've removed and your tools.

On the other hand, i could see this turning many jobs into awful amazon warehouse type bullshit where the system doesn't want to trust the user. "Oh, you're saying that item isn't on this shelf? Now show me EVERY item on that shelf before you can proceed, and remember, the clock's running!".

It could make an awful lot of jobs essentially unskilled labor, or vastly lower the bar to entry... but it could also make a lot of jobs really shitty.

Oddly enough, what i'm most excited about now that i slept on this is art. Digital art is coming into its own as it is, and this could be an entire new medium unlike anything possible before. Art and games made for this that become classics will likely be stuff we still load up on our holodecks from the equivalent of goodoldgames in 20-40 years.

On the one hand yes, on the other... i can sort of see 'reaching out to touch the imaginary thing you can see in your visor' as a very natural haptic event.

Except you're not touching anything, you're the one holding your hand in that position rather than letting it "grip" something or just rest. If this was paired with some kind of force feedback gloves, or even gloves + some kind of under armor type sleeves it would make more sense to me.

I understand the arm position arguments made above, but if you're setting it to minimum movement anyways... how accurate can it really be? how is that superior to using a mouse, or an xbox controller, or something like a wiimote/playstation move type controller and just mostly using wrist or small arm movements?

I've played with kinect, and i've yet to see a system like this that wasn't jerky at detecting small movements. These systems seem well suited to larger movements, which are exactly the kind that are tiring and gorilla-arm-y. I know that the problem here isn't literally exactly the same as gorilla arm, but waving your arms around in the air with no haptic feedback is still not a super amazing input method.

It's a great secondary input method, but i view it as something like siri/cortana/google now. I really hope this isn't designed with the expectation of that being the primary way you'll use it, was mostly my point. I think it's quasi-DOA if it is, in the same way kinect was.
posted by emptythought at 4:08 PM on January 23, 2015


Oddly enough, what i'm most excited about now that i slept on this is art. Digital art is coming into its own as it is, and this could be an entire new medium unlike anything possible before.

Whichever Gibson novel it was that included Blue Ant was about exactly this technology doing exactly this thing, at least in part.
posted by feckless fecal fear mongering at 4:18 PM on January 23, 2015


I like the possibility that you could feel like you are hanging out with people or friends in the same room. Chat rooms and MMOs and Skype have shortened distances between people in a way that feels more tangible, but I think it would be really cool to feel like you are actually in the "same place" as other people, even if those representations were just avatars.
posted by SpacemanStix at 4:35 PM on January 23, 2015


Pyry, magic leap says they are using lightfields.
posted by empath at 5:16 PM on January 23, 2015


Pyry, magic leap says they are using lightfields.

It's definitely possible-- it just requires a regular display and a microlens array. But as I said earlier, a lightfield display has to trade away resolution, and I don't think that's a worthwhile trade at this point in time. If I had a choice between a 5700x3200 resolution HMD without lightfield and a lightfield 1920x1080, I'd take the 5700x3200 without hesitation. As it stands, the Rift DK2's 1920x1080 is just barely usable, and it's critically important to improve that. The variable optical focus that lightfields would give you is something that we can worry about in the future when HMD display resolutions have already hit the point of imperceptibility.
posted by Pyry at 6:04 PM on January 23, 2015


Strange Days indeed.
posted by ostranenie at 7:12 PM on January 23, 2015


Pyry, pixel density is just going to keep rising, and at some point we're going to use it not for higher resolution but for light fields. I saw an 8K autostereoscopic display at CES (110" Samsung) and it's the first one I've ever seen start to work -- no banding!
posted by effugas at 7:45 PM on January 23, 2015 [1 favorite]


Also, I don't see why you'd need to project the entire light field, but I might not understand how they work. Is there any reason you couldn't project each pixel at the correct depth in the light field?
posted by empath at 10:44 AM on January 24, 2015


empath,

Pixels are historically light sources that project the same light in all directions. Light fields require pixels that send different photons in different directions. Mirrors do this, which is why you look at a screen but see through a mirror. Generally the two strategies are either spatial multiplexing -- smaller pixels, each directed separately through microlens arrays -- or temporal multiplexing -- high framerates with something like an LCD filtering light to certain angles. I've seen a light field display doing the temporal approach and it's weird, there's just...a thing there...when you move, reflections move with respect to your head.
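(A tiny sketch of the spatial-multiplexing bookkeeping, with made-up numbers -- not any actual product: put an NxN block of pixels behind each microlens and you get NxN viewing directions, but you divide spatial resolution by N in each axis:)

    # Spatial-multiplexing trade-off sketch (hypothetical numbers, not a product):
    # an NxN block of display pixels sits behind each microlens, and each pixel
    # in the block feeds a different viewing direction.

    def microlens_lightfield(backing_w, backing_h, views_per_axis):
        spatial_w = backing_w // views_per_axis
        spatial_h = backing_h // views_per_axis
        return spatial_w, spatial_h, views_per_axis ** 2

    backing_w, backing_h = 3840, 2160       # a hypothetical 4K backing panel
    for n in (2, 3, 4):
        w, h, views = microlens_lightfield(backing_w, backing_h, n)
        print(f"{n}x{n} pixels per lenslet -> {views:>2} views, spatial resolution drops to {w}x{h}")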

The actual biggest problem with light field displays, as opposed to glasses, is that you stop being blind to the size of the projected object. A small 2D image is just a small 2D image but a small 3D car is Hot Wheels. To say nothing of people.
posted by effugas at 1:37 PM on January 24, 2015


I have a very hard time remembering names. It would be delightful to have AR glasses able to do facial recognition and remind me of that and other relevant info about a person. Preferably in a non-creepy "here is what they have personally told me" way and not a "here is what the Googles/Bing/whatever knows about them" way. It would also be of use in displaying information about products in stores (like the Google Glasses app) or keeping a running total of the cost of everything in your shopping cart in the grocery store.

On the road, it could recognize and highlight other vehicles about to cream you that you might not have noticed otherwise, provide reminders of speed limits, and other useful things. Other people might be able to use it to keep an actually reliable food diary. The real question is whether there's enough processing power to rapidly perform complex image recognition on board or whether that would have to be shipped off to the cloud. It will certainly be able to recognize bar codes and other patterns intended to be machine readable, but that drastically limits the use cases.

There is a lot of shit AR would be good for in day to day living, so I for one welcome our Microsoft HoloLens overlords. With luck, I won't even be the one who has to write the goddamned apps to make it work like I want, but if I have to so be it. I have been waiting decades for a decent and semi-affordable AR rig. TBH, I would use it daily in its present prerelease state.

So yes, where do I sign up? The rest of you can call me a glasshole all you like, I'll still be happy that now I can remember who the fuck you are.
posted by wierdo at 2:34 PM on January 24, 2015 [1 favorite]


Bleh, I meant the Google Goggles app, not Glasses. Silly me.
posted by wierdo at 6:25 PM on January 24, 2015


I have a very hard time remembering names. It would be delightful to have AR glasses able to do facial recognition and remind me of that and other relevant info about a person.

Actually the Google Glass developer policies explicitly prohibit you from doing this:

Don't use the camera or microphone to cross-reference and immediately present personal information identifying anyone other than the user, including use cases such as facial recognition and voice print.
posted by Pyry at 10:48 AM on January 26, 2015








This thread has been archived and is closed to new comments