Intel smart glasses
February 5, 2018 3:32 PM   Subscribe

Hands-on with Intel's new smart glasses. By shining a low-powered laser into your retina, the glasses can show you all sorts of information without you having to pull out your phone. previously
posted by adept256 (79 comments total) 14 users marked this as a favorite
 
"you can ignore people more efficiently that way"

hooray!
posted by numaner at 3:41 PM on February 5, 2018 [2 favorites]


Huh, interesting: Bloomberg - Intel Is Said to Plan Sale of Majority Stake in AR Glasses Unit
Intel has pared back some of its consumer product efforts after failing to make progress in the market for wearable technology. As part of that retreat, it shuttered the Recon augmented-reality goggles business that it acquired in 2015. Some former members of the Recon team are part of the division up for sale. It has about 200 employees in the U.S., Switzerland and Israel.

Intel intends to attract investors who can contribute to the business with strong sales channels, industry or design expertise, rather than financial backers.
posted by Existential Dread at 3:45 PM on February 5, 2018 [3 favorites]


Watching this type of technology from the sidelines, one of the big issues with miniaturizing these types of displays is how to efficiently get the information onto your retina from such close distances. One of the big risks is the power of light used, and whether you risk permanent damage from lasers/close-up LEDs/etc. Interesting that Intel has found a way to use layers with a single reflective surface direct to the retina; I wonder if they had to get regulatory approval from the FDA. A quick scan at fda.gov isn't turning anything up.
posted by Existential Dread at 3:57 PM on February 5, 2018 [2 favorites]


People were writing articles about goggles shining low-powered lasers at your retinas in the 1990s and I thought for sure we'd have them by now. If I remember correctly, they even had a rig back then that could project VGA (640x480 pixels). Seems like the engineering challenges were greater than anticipated.

Here's hoping we will see a product like this soon. Today's virtual-reality rigs are far too cumbersome to be used for augmented reality.
posted by Triplanetary at 3:58 PM on February 5, 2018 [1 favorite]


By shining a low-powered laser into your retina...

NopeNopeNope...
posted by Thorzdad at 3:59 PM on February 5, 2018 [10 favorites]


"you can ignore people more efficiently that way"

hooray!

numaner

This, but not ironically.
posted by Sangermaine at 3:59 PM on February 5, 2018 [3 favorites]


They're totally downplaying the retina issue. I can imagine how hard these questions are to study. Like the cellphone-radiation-and-head-cancer question; wasn't there some news on that last week?
posted by polymodus at 4:02 PM on February 5, 2018


Okay, so they're called "smart glasses" and they're an IT product. That means that they are fated to being fundamentally, unavoidably (a) stupid (b) ubiquitous.

In ten years I expect to be seeing arguments presented apparently in completely good faith that one has a moral obligation to wear the glasses that force-feed us the continuous targeted advertising that supports the "free" information sources we all desperately need.

But I swear, if anybody tries to put a pair of these on my face I WILL PUNCH THEM.
posted by flabdablet at 4:08 PM on February 5, 2018 [6 favorites]


This is the future I have been excited to have since I read cstross' Accelerando. The goggles that paint your retinas with lasers, that are low-power enough to not need big bulky battery packs, that are unobtrusive.

Where can I tell them to shut up and take my money. If for nothing else, this will make giving conference talks easier, as I can get my speaker notes straight into my eye.

The first few generations of this will be interesting, but it's going to be the smart city stuff that makes them really transformative, when cities are instrumented with lots of GPS-sited nodes for referencing your position, and things have caught up to provide good rotation tracking for "where are you really looking". AR without needing to have cameras or other Google Glass awfulness problems.

Or AR that knows I'm looking at my phone, and can draw extra UI based on that.
posted by aurynn at 4:16 PM on February 5, 2018 [5 favorites]


But I swear, if anybody tries to put a pair of these on my face I WILL PUNCH THEM.

I spent my childhood stuck with glasses and they are a MASSIVE pain in the arse. Every time I got on a bus in the winter I had to take them off and so stumble blindly to my seat and hope it actually was empty, and not already occupied with some vaguely seat coloured person. Same thing when going into any indoor space from outside - instant fogged up, blind as a bat fumbling.

Contact lenses at 18 were SO BLOODY FREEING that I am convinced that anyone trying to sell these has never actually *had* to wear glasses themselves. Mind you, I find the smart watches baffling and pointless, so maybe I really AM getting old.
posted by Brockles at 4:24 PM on February 5, 2018 [6 favorites]


It says it doesn't matter how bad your vision is. I wonder if there will be versions that do have cameras, that enable people with limited vision to see.
posted by aniola at 4:26 PM on February 5, 2018 [2 favorites]


Since these are monochromatic lasers and limited in display area, this specific version probably won't help much for vision reproduction, but certainly there are areas for future innovation.

In addition to the GPS-siting and location-specific information noted above, we're going to need much more robust anonymization and data protection laws to keep such things from being abused. Europe already has such laws in place, but this will be very difficult in the US.
posted by Existential Dread at 4:32 PM on February 5, 2018


Michael Abrash makes a lot of predictions about AR, describes the state of the art, and the problems to be overcome in this video.
posted by jeffamaphone at 4:34 PM on February 5, 2018


The business case was a bit dismal. 'Data is the new oil' and Yelp reviews. Can't they just say 'because it's frikkin cool'?
posted by adept256 at 4:36 PM on February 5, 2018 [2 favorites]


Fortunately, these have the "display" in the right eye, so I could wear them if I could get prescription lenses in them.

But if the display was in my left eye, I'd be fucked. I basically don't see out of my left eye. It works, but my eyes are misaligned, so I have no depth perception, and my brain basically ignores most of the signal in my left eye. VR and stereoscopic 3D don't work for me.

Every time I see something like this, I have to wonder how many of the people working on these products think about even minor visual disabilities and how their products work with them. I typically think the answer is "not at all."
posted by SansPoint at 4:40 PM on February 5, 2018 [9 favorites]


So: the vision researcher part of me really wants one of these as an experimental platform, after I can run a very extensive tech sheet past someone who understands the dangers more thoroughly than I do. There are a lot of interesting basic vision science questions we can answer with a rig like this (particularly if there's enough eyetracking onboard to do AR-type applications), but yeah, I'd want to make damn sure it's safe first. And given what I actually study these days (human visual perception in the context of self-driving cars), I'm betting on a certain lack of FDA oversight here.

I also think SansPoint has a damn good point: are these remotely usable by everyone? The visual system can vary in a huge number of ways, and if the engineers have only ever tested it on normal observers, with roughly normal color vision, and normal stereo vision, they're going to slam into a whole lot of variability when it becomes a real product. For that matter: how does it interact with aging visual systems? The old ophthalmologist's joke is "may you grow old enough to develop cataracts," and that's far from the only change that goes on as our visual systems age.
posted by Making You Bored For Science at 5:04 PM on February 5, 2018 [2 favorites]


By shining a low-powered laser into your retina...

You’ll put an eye out!
posted by octobersurprise at 5:05 PM on February 5, 2018 [2 favorites]


>> By shining a low-powered laser into your retina...
> NopeNopeNope...


perfectly saf... oh
user advised study lead, that she experienced discomfort in her eye and said she was able to see the laser flash at several points during the study.... Employee reported eye pain after working with new prototype, thought it may be associated with use.
posted by ASCII Costanza head at 5:06 PM on February 5, 2018


user advised study lead, that she experienced discomfort in her eye and said she was able to see the laser flash at several points during the study.... Employee reported eye pain after working with new prototype, thought it may be associated with use.

To be clear, this is from some Apple documents about internal Apple prototypes and has nothing obvious to do with the product in the OP.
posted by value of information at 5:11 PM on February 5, 2018


By shining a low-powered laser into your retina...

NopeNopeNope...
posted by Thorzdad


Damn right. Gigawatt X-Ray or get out!
posted by Splunge at 5:12 PM on February 5, 2018 [2 favorites]


Unpopular opinion: If they make these safe and pain free, fuck yeah, I’d wear them.

Unhackable, too. I mean, I don’t want some ass-clown to be able to shoot frickin laser beams in my eye.

So... I’m probably never wearing these.
posted by greermahoney at 5:13 PM on February 5, 2018 [3 favorites]


I work in eye imaging, so basically spend my entire life scanning lasers on the retina. There is a rigorous standard called ANSI Z136 that determines how much power we can put on the retina.

For a 1° (288 µm) patch of scanned light, we can put 500 µW of 770 nm light on the retina. Now, 770 nm is quite visible to the red-sensing cones, even though it's well outside their nominal sensitivity range. Like, it's moderately uncomfortable to look at that power of light. And yet, it's 10 times under the damage threshold for an 8 hour exposure.

So if you go down to the visible/red (~630 nm), your eyes are dramatically more sensitive. You can easily make out a 5 µW laser, which is approximately 100x lower than the ANSI limit there. I honestly doubt the laser could reach intensities that high, and even if it did it wouldn't be sustained.

In short, this is very, very safe.
posted by Maecenas at 5:41 PM on February 5, 2018 [51 favorites]
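[The margin arithmetic in that comment can be sketched in a few lines. All figures below are the ones quoted in the comment, not values read from the ANSI Z136.1 tables themselves, so treat this as an illustration rather than a safety calculation.]

```python
# Illustrative sketch of the exposure margins described above. The numbers
# are the comment's figures, not values taken from the ANSI Z136.1 tables.

def margin(limit_uW: float, power_uW: float) -> float:
    """How many times below a given limit an optical power sits."""
    return limit_uW / power_uW

# 770 nm, 1-degree scanned patch: 500 uW is allowed for an 8 h exposure,
# and that limit is itself stated to be 10x under the damage threshold.
damage_threshold_uW = 500.0 * 10

# ~630 nm visible red: a display needs only ~5 uW to be easily visible,
# roughly 100x under the ANSI limit quoted for that wavelength.
print(margin(500.0, 5.0))                # 100x under the quoted limit
print(margin(damage_threshold_uW, 5.0))  # further still from actual damage
```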


By shining a low-powered laser into your retina

Two words,

Hacked

No
posted by Beholder at 5:48 PM on February 5, 2018


By shining a low-powered laser into your retina...
The googles, they do ... something!
posted by pulposus at 5:48 PM on February 5, 2018 [2 favorites]


Hell, I just had lasers shot at my eyes so I wouldn’t have to wear glasses.
posted by Huffy Puffy at 5:53 PM on February 5, 2018 [9 favorites]


It's not a trailer for a remake of The Jerk?
posted by peeedro at 5:58 PM on February 5, 2018 [2 favorites]


By shining a low-powered laser into your retina...

No, no, no, you've got it backward. I want lasers shining OUT FROM my retina, thank you very much.
posted by hexaflexagon at 6:08 PM on February 5, 2018 [27 favorites]


I have major trust issues when it comes to wanting to shoot lasers into my eyes for fun and profit. But I can think of so many use cases for this for my job alone. I'm imagining having API references, system information, process information, hell, give me a terminal! And the ability to squint and drag text boxes around!

But as soon as I leave work, I don't want these things anywhere near me.

posted by Cat Pie Hurts at 6:10 PM on February 5, 2018 [1 favorite]


LASERS!! LASERS ON THE inSIDE OF MY RETINA!
posted by sexyrobot at 6:25 PM on February 5, 2018 [1 favorite]


I like how in the interview he says, "it's not just going to show my Twitter mentions, right? Cause that would be really annoying," and then the Intel guy says, "no, no, no. Instead, it will show you Yelp reviews whenever you're near a restaurant."
posted by whir at 6:26 PM on February 5, 2018 [13 favorites]


And you can answer your eyeball text message by moving your eyeballs. Left, right, up, up, left. Next letter...

Hope it has T9.
posted by ctmf at 6:38 PM on February 5, 2018 [2 favorites]
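[The T9 joke maps onto real code: each letter collapses to its keypad digit, and candidate words are grouped by that digit string, which is why one gesture sequence can stand in for several spelled-out words. A toy sketch, with a made-up word list:]

```python
# Toy T9-style lookup: words are indexed by their keypad digit sequence,
# so five eyeball gestures could select "hello" instead of spelling it out.
KEYPAD = {'2': 'abc', '3': 'def', '4': 'ghi', '5': 'jkl',
          '6': 'mno', '7': 'pqrs', '8': 'tuv', '9': 'wxyz'}
LETTER_TO_DIGIT = {c: d for d, letters in KEYPAD.items() for c in letters}

def encode(word: str) -> str:
    """Collapse a word to its keypad digit sequence."""
    return ''.join(LETTER_TO_DIGIT[c] for c in word.lower())

def build_index(words):
    """Group words by digit sequence; collisions are the classic T9 ambiguity."""
    index = {}
    for w in words:
        index.setdefault(encode(w), []).append(w)
    return index

index = build_index(['hello', 'gekko', 'help', 'ok'])
print(index['43556'])  # every word sharing the sequence 4-3-5-5-6
```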


I like how in the interview he says, "it's not just going to show my Twitter mentions, right? Cause that would be really annoying," and then the Intel guy says, "no, no, no. Instead, it will show you Yelp reviews whenever you're near a restaurant."

AND THEY SAY ENGINEERS HAVE NO SENSE OF HUMOR!

Full disclosure: I am a software "engineer"
posted by Slothrup at 6:50 PM on February 5, 2018 [4 favorites]


In short, this is very, very safe.

They said that about Facebook and look where we are now
posted by delfin at 6:54 PM on February 5, 2018 [6 favorites]


I would want some assurances that when it crashes, the laser won't just bore a hole through whatever bit of retina it was most recently pointed at.
posted by runehog at 6:56 PM on February 5, 2018


You could take them off.
posted by jeffamaphone at 6:58 PM on February 5, 2018 [1 favorite]


This is impressive coming from lumbering Intel. I would've thought the other fancy tech companies would've come up with something like this.

Not to diminish SansPoint's issue, but I think it's unfair to expect engineers to be mindful of people with any impairments when they are out building bleeding-edge stuff like this. No one built any device to be universally useful on the first go. Iterations, y'all.
posted by savitarka at 7:32 PM on February 5, 2018 [1 favorite]


This is the future I have been excited to have since I read cstross' Accelerando. The goggles that paint your retinas with lasers, that are low-power enough to not need big bulky battery packs, that are unobtrusive.

Where can I tell them to shut up and take my money. If for nothing else, this will make giving conference talks easier, as I can get my speaker notes straight into my eye.


So much of science fiction imagined things like Alexa except they were talking computers/robots/self-driving cars that belonged to you and were designed to be good tools for the things you wanted to do.

But by the time we got to imagining lasers painting images on eyes to alter how you see the world around you, SF seems to have mostly realized that we're not gonna be the owners of these devices, nor their masters.
posted by straight at 7:47 PM on February 5, 2018 [4 favorites]


As with Google Glass, I can imagine certain limited applications for them, all pretty nerdy (for example a heads-up GPS so I can keep my eyes on the road), but my educated guess is that I won't be even slightly interested in 98% of whatever their touted advantages/selling points end up being.
posted by Greg_Ace at 8:08 PM on February 5, 2018 [2 favorites]


(for example a heads-up GPS so I can keep my eyes on the road)

I'm pretty interested in technical information around this, actually. Like, there is a reason we don't already see HUDs in all cars everywhere, and it isn't cost.

I think it's because of focal distance. I think that in a plane, 99.9% of everything you'll ever look at out the window is focused at infinity. One of the dudes in the video talks about this intel thing being focused at infinity too. But in cars, you don't only focus at infinity, you are actually focusing in and out constantly. Refocusing isn't much easier than just looking down at the console, so cars simply don't benefit from HUDs.

Again, I'm very interested in technical discussion of this issue. I'm not that up on optics, and I'd love to learn more about the details I'm no doubt missing. As it stands though, it seems to me it is a HUGE problem with all notions of augmented reality. Yet somehow people aren't talking about it much?
posted by Chuckles at 8:51 PM on February 5, 2018 [4 favorites]
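[The refocusing cost being described can be put in rough numbers. Optometrists measure focus demand in diopters (1 / distance in meters); the distances below are illustrative assumptions, not measurements of any particular car.]

```python
# Accommodation demand in diopters (1 / distance in meters). The distances
# used here are illustrative assumptions, not real cockpit measurements.
def diopters(distance_m: float) -> float:
    return 1.0 / distance_m

def refocus_demand(d1_m: float, d2_m: float) -> float:
    """Accommodation shift needed to refocus between two distances."""
    return abs(diopters(d1_m) - diopters(d2_m))

# A road scene effectively at optical infinity (approximated as 100 m)
# vs. a dashboard console at 0.7 m: a shift of roughly 1.4 diopters,
# which is the work a driver's eyes do on every glance down.
print(round(refocus_demand(100.0, 0.7), 2))
```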


> Splunge:
"By shining a low-powered laser into your retina...

NopeNopeNope...
, Thorzdad

Damn right. Gigawatt X-Ray or get out!"


Only if I can flip the beam direction. Fuck IGNORING people. Start IGNITING PEOPLE!
posted by Samizdata at 9:53 PM on February 5, 2018 [3 favorites]


Look, I about had to...head to my bunk...when I saw these. I have been in glasses for 42 years, and I AM BLOODY SICK OF THEM JUST SITTING THERE ON MY GODDAMN NOSE DOING NOTHING BUT FOCUSSING ALL THE GODDAMN TIME!

Yes, I want the GPS. I want to be able to stream the book of the moment to it with a hand stealthily clicking the volume bar. I want Google Play Music to tell me what song I am streaming at work so I don't have to stop work, yank the tablet out of my back pocket, and wake up the tablet to figure out what I am listening to. (I also want the thumbs down option on the notification bar so I don't have to unlock the device to ditch a crappy song, but that's another thread.) I want my list of tasks displayed so I don't embarrass myself by checking a notebook and demonstrating my memory sucks. I want my appointment notifications to pop up instead of relying on a sound I might not hear.

I WANT ROOM SERVICE!

Ahem. Sorry.
posted by Samizdata at 9:59 PM on February 5, 2018 [7 favorites]


(Also, people, you are paying not enough attention to the device specs in your attempts at alarmist humor. Even if shit went REALLY pear-shaped, that tiny battery won't feed the laser enough power to really burn through anything AND that's assuming it just doesn't punch through the lens and grating. You lose a fair bit of retina-scorching power with the reflection you know (especially if it ends up as fingerprinted as my glasses do.))
posted by Samizdata at 10:02 PM on February 5, 2018 [4 favorites]


perhaps for some context, i guess?
-Google Glass 2.0 Is a Startling Second Act
-We Need to Talk About Magic Leap's Freaking Goggles
-Measuring the Effective Resolution of Head-mounted Displays

(finally keep up with megathreads! obvs ;)
posted by kliuless at 10:04 PM on February 5, 2018 [2 favorites]


also, the jerk
posted by kliuless at 10:10 PM on February 5, 2018


This would be enormously useful to physicians/healthcare providers. I actually considered getting the google glasses. My success in the office depends on being able to discreetly pull up bits of information, either from my brain, or my smart phone, or my computer to fill in the blanks on some vaguely recalled knowledge quickly on the fly, while remaining engaged with the person I’m talking to. It used to be that there was a finite amount of information that one could mostly master but the data we sift through now increases exponentially every year. There’s also the possibility that an embedded camera and microphone could provide a totally unobtrusive tool for documentation.

One could also naively imagine a non-evil use of this for law enforcement. If it were possible to be that naive.
posted by Slarty Bartfast at 10:21 PM on February 5, 2018


yank the tablet out of my back pocket

Where do you find pants with pockets that big??
posted by Greg_Ace at 11:23 PM on February 5, 2018


> Greg_Ace:
"yank the tablet out of my back pocket

Where do you find pants with pockets that big??"


Standard Levis? I only carry an 8 inch. That's enough for me to read, do some browsing and Bedflix. Plus it fits my broad hands and stubby fingers. I used to rock a 10 incher, but dropped it because it was awkward to carry.
posted by Samizdata at 12:33 AM on February 6, 2018


> kliuless:
"also, the jerk"

Damn it, now I have to watch it again, damn your non-strabismic eyes!
posted by Samizdata at 12:36 AM on February 6, 2018


My success in the office depends on being able to discreetly pull up bits of information

Which gets to perhaps the biggest obstacle these devices face: how exactly are you supposed to interact with them? Consider the possible input methods:

voice: highly conspicuous, slow
midair hand gestures: conspicuous, inaccurate, imprecise
frame touchpad: imprecise, limited expressiveness
gaze: slow, intrusive, frustrating
phone touchpad: obviates the purpose of the glasses in the first place

Here it's also interesting to look at how interfaces have developed for VR, because VR interfaces tend to place UI elements on the hands or floating in the world rather than having view-space HUDs, even though from a technical perspective a HUD is easier to implement. And it's easy to see why: having a UI fixed in relation to your view is really intrusive and difficult to interact with.
posted by Pyry at 1:38 AM on February 6, 2018 [1 favorite]


I think it's because of focal distance... (and etc posted by chuckles)

Heyo...I did take some optics in college. I'm pretty sure that if you're writing directly onto the retina with a laser then it's not possible to go out of focus because the 'screen' only exists inside of the eye. I'm not even sure it would need an adjustment knob for different laser-to-retina distance as any difference in distance would just make the screen larger or smaller.
posted by sexyrobot at 1:47 AM on February 6, 2018


> Pyry:
"My success in the office depends on being able to discreetly pull up bits of information

Which gets to perhaps the biggest obstacle these devices face: how exactly are you supposed to interact with them? Consider the possible input methods:

voice: highly conspicuous, slow
midair hand gestures: conspicuous, inaccurate, imprecise
frame touchpad: imprecise, limited expressiveness
gaze: slow, intrusive, frustrating
phone touchpad: obviates the purpose of the glasses in the first place

Here it's also interesting to look at how interfaces have developed for VR, because VR interfaces tend to place UI elements on the hands or floating in the world rather than having view-space HUDs, even though from a technical perspective a HUD is easier to implement. And it's easy to see why: having a UI fixed in relation to your view is really intrusive and difficult to interact with."


Well, then, perhaps another device? A simple watch with a capacitive screen to act as a touchpad?
posted by Samizdata at 2:02 AM on February 6, 2018


A ring? The possibilities are endless really.

Ear-waggling FTW
posted by merlynkline at 2:07 AM on February 6, 2018


I'm still not assuaged that these things would be almost surely safe for my eyes. But then I heard all that blue light from my phone really isn't that healthy, either.
posted by polymodus at 2:32 AM on February 6, 2018


By shining a low-powered laser into your retina...
NopeNopeNope...


1) LED's blue light damage
2) photo-biomodulation therapy

Note how one is light in the eyes and the other is light on the body. There is info about long-wave IR that the pro-paleo people like to trot out, along with comments about sitting around a fire, but I'll let you go look that up yourself while you are drinking raw water.
posted by rough ashlar at 2:43 AM on February 6, 2018


> merlynkline:
"A ring? The possibilities are endless really.

Ear-waggling FTW"


Damned ableists again. We need almost EVERYONE to be able to use this. Not just you elitists with your wobbly pinnae, okay?

(I shan't be able to afford this, like, ever, but I can dream.... Unless that dream involves looking like a flyblown horse.)
posted by Samizdata at 2:47 AM on February 6, 2018


Pyry: Which gets to perhaps the biggest obstacle these devices face: how exactly are you supposed to interact with them?

I remain hopeful we'll get our Vingean Rainbows End-style subvocalisation input devices that can detect unspoken speech. There are already a few proof of concept devices out there, but nothing approaching consumer-level. That would solve the 'conspicuous' part of voice interaction, and possibly the 'slow' part as well if we develop new slang.
posted by adrianhon at 3:54 AM on February 6, 2018


Bloomberg - Intel Is Said to Plan Sale of Majority Stake in AR Glasses Unit

Think of how much more efficiently something like this will work with a modern ARM core rather than an Intel CPU.
posted by acb at 4:02 AM on February 6, 2018


Measuring the Effective Resolution of Head-mounted Displays

Does it have to be a pixel grid? I want a vector display, like in those old Atari arcade machines.
posted by acb at 4:08 AM on February 6, 2018 [3 favorites]


I want one.

But then, I also saw nothing wrong with the original Google Glass.

Mostly, I just want to have a date/time stamp in my lower right field of vision. I'm a compulsive clock watcher so to me that'd be flipping awesome. If I can also get outside temperature for where I am and weather conditions, and maybe incoming text messages scrolling across that'd be a bonus.

Ultimately I'd like the whole augmented reality thing, but for now I'll settle for a small HUD.
posted by sotonohito at 6:11 AM on February 6, 2018 [2 favorites]


By shining a low-powered laser into your retina...
NopeNopeNope...

Unhackable, too. I mean, I don’t want some ass-clown to be able to shoot frickin laser beams in my eye.


I also wouldn't trust laptop screens: what if they get hacked and start frying people's faces by shooting 100 megawatts of pure, deadly white LED light at them? Or those intelligent light bulbs! What if they get hacked and start emitting X-rays!

I mean, we've been intentionally peering into cathode ray guns shooting a stream of electrons at our retinas for decades, with nothing but a thin screen of phosphor to shield us.

Seriously though, all this stuff is just an interlude to the real deal: direct feed into the visual cortex. Now that I'd like to try (but not become dependent on).
posted by Laotic at 6:46 AM on February 6, 2018 [3 favorites]


direct feed into the visual cortex. Now that I'd like to try (but not become dependent on).

Why stop there? Why not reëngineer the human neurology to have two visual cortices: one for input from the eyes, and one for synthetic input (from information displays to video/movies)? The other one would subjectively appear like a particularly vivid and effortless form of mental visualisation.
posted by acb at 6:54 AM on February 6, 2018


I'd have some questions about how people who have migraines or other neurological quirks might be affected.

But, really, what I want is facial recognition plus a directory of known persons, so I don't have to fumble around avoiding using anyone's name in case I've assigned the wrong name to the wrong person again. (I do it with town names and road names too, but that just makes me come off as confused as opposed to someone who thinks people are interchangeable. Which I don't. It's just... proper nouns, I misfile them.)
posted by Karmakaze at 7:07 AM on February 6, 2018 [1 favorite]


Karmakaze: As someone who is also shit with remembering names, smart glasses that remind me who I'm looking at would be awesome... with a caveat: all the facial detection should be done, locally, on the glasses CPU, or on the connected phone, using a database of stored photos, and NOT mediated by some third-party social media platform. Looking at you, Facebook.
posted by SansPoint at 7:27 AM on February 6, 2018
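[A minimal sketch of what that on-device matching step could look like: compare a face embedding against a locally stored database, with no network round-trip. How the embeddings are produced (some on-device model) is assumed and not shown, and the threshold is made up.]

```python
import math

# Hedged sketch: match a query face embedding against a local database,
# entirely on-device. Embeddings and threshold are illustrative assumptions.
def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def best_match(query, db, threshold=0.8):
    """Return the best-matching name, or None if nothing clears the threshold."""
    name, score = max(((n, cosine(query, e)) for n, e in db.items()),
                      key=lambda item: item[1])
    return name if score >= threshold else None

db = {'alice': [1.0, 0.0, 0.2], 'bob': [0.1, 1.0, 0.0]}
print(best_match([0.9, 0.1, 0.2], db))  # the lookup never leaves the device
```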


Someone at Intel's clearly been playing too much Metal Gear.
posted by aspersioncast at 7:30 AM on February 6, 2018


Why stop there? Why not reëngineer the human neurology to have two visual cortices: one for input from the eyes, and one for synthetic input (from information displays to video/movies)? The other one would subjectively appear like a particularly vivid and effortless form of mental visualisation.

I think the brain is plastic enough to figure on its own how to deal with the feed. My only big worry is that in case of an EMP lots of people would literally forget how to breathe, but I think the risk is there even now, should facebook and the rest go down.
posted by Laotic at 7:32 AM on February 6, 2018 [1 favorite]


Contact lenses at 18 were SO BLOODY FREEING that I am convinced that anyone trying to sell these has never actually *had* to wear glasses themselves.
I guess these things on my face aren't really glasses and the last 20 years have been a fever dream, then, because I flat out refuse to get contact lenses.
posted by inconstant at 8:02 AM on February 6, 2018 [3 favorites]


This is impressive coming from lumbering Intel

Hard to imagine the CIA/NSA isn't all up in this. No thank you.
posted by maniabug at 8:03 AM on February 6, 2018


It is not weird to point out that a lot of people who don't wear glasses flat out will not want to wear glasses, regardless of how many Yelp reviews they could see.

Honestly, I don't really understand the benefit of this at all. It seems like one more barrier to human interaction.
posted by agregoli at 8:07 AM on February 6, 2018 [1 favorite]


When thinking about what Intel has shown here, I would suggest keeping in mind this is a prototype. There are still significant challenges all around to building all-day wearable AR glasses. There are tons of variables to get the full fidelity experience. Intel has, with this prototype, thrown out a lot of those variables and focused on form factor and what they could get into that form factor with today's technology. They've solved some power and weight issues, and focused on making something socially acceptable. They've completely ignored all the other hard problems.

Another way to think about it:

Intel's Vaunt : Something you'd actually want to own :: Virtual Boy : Oculus Rift
posted by jeffamaphone at 9:14 AM on February 6, 2018


Now how would I fit these on my shark? (I asked for ONE thing, people...)
posted by kleinsteradikaleminderheit at 9:25 AM on February 6, 2018


Contact lenses at 18 were SO BLOODY FREEING that I am convinced that anyone trying to sell these has never actually *had* to wear glasses themselves.

Counterpoint: I tried contacts once, several years ago. It was an ordeal. It took approximately 70 minutes, during which I could not stop squeezing my eyes shut and crying like there was something seriously wrong with me (uh yeah, it was depression among other things), to get the goddamned thing back out of my eye. I will never try it again. I'll wear the glasses. Which I look better with anyway.
posted by Foosnark at 10:11 AM on February 6, 2018 [1 favorite]


modern ARM core rather than an Intel CPU

You are aware that Intel is now an ARM licensee?
posted by jkaczor at 10:28 AM on February 6, 2018


I can't wiggle my ears, but I do have voluntary control over whatever it is that opens my eustachian tubes (I can make them click) ... surely that's good for some sort of input device, friends who are divers tell me it can be learned
posted by mbo at 10:47 AM on February 6, 2018


I'm pretty sure that if you're writing directly onto the retina with a laser then it's not possible to go out of focus

That thought did occur to me, but as I said, the dude from Intel says the image is focused at infinity.
posted by Chuckles at 12:18 PM on February 6, 2018


Every time I see something like this, I have to wonder how many of the people working on these products think about even minor visual disabilities and how their products work with them. I typically think the answer is "not at all."

I really don't understand comments like this. We're talking about a bleeding-edge product that just barely does anything useful for people with perfect vision in ideal circumstances. Of course it's not going to work well for all people. Using that as evidence that the engineers behind it don't care enough about disabled people is just empty cynicism; it's about as helpful as calling Gutenberg a jerk because his printing press couldn't produce Braille.

Believe it or not, people who make technology are real live human beings. Most of us know and care about at least a few people with disabilities because they are our parents or other relatives. And those of us who work on products for end users absolutely do put thought into how we can make our products accessible to people with disabilities.
posted by shponglespore at 4:43 PM on February 6, 2018 [6 favorites]


> Karmakaze:
"I'd have some questions about how people who have migraines or other neurological quirks might be affected.

But, really, what I want is facial recognition plus a directory of known persons, so I don't have to fumble around avoiding using anyone's name in case I've assigned the wrong name to the wrong person again. (I do it with town names and road names too, but that just makes me come off as confused as opposed to someone who thinks people are interchangeable. Which I don't. It's just... proper nouns, I misfile them.)"


This too. Especially if I can tag them with quick notes and maybe a precis from their social media...
posted by Samizdata at 6:45 PM on February 6, 2018


> inconstant:
"Contact lenses at 18 were SO BLOODY FREEING that I am convinced that anyone trying to sell these has never actually *had* to wear glasses themselves. I guess these things on my face aren't really glasses and the last 20 years have been a fever dream, then, because I flat out refuse to get contact lenses."

I had contact lenses when I was, erm, 16ish? Between them being really new to consumers and the documentation having an error in it leading to corneal chemical burns, and the fact my eyes are really a little too dry to accommodate contacts comfortably (ask me how many teachers in high school were convinced I was stoned constantly!) I am not really up to trying again.

Although, come to think of it, we could skip ALL this crap if they would just make some decent cybereyes...
posted by Samizdata at 6:48 PM on February 6, 2018


You are aware that Intel is now an ARM licensee?

That's just for fabbing chips for third parties, and/or auxiliary processors, from what I understand, and doesn't touch their hero products. I suspect ARM would be too NIH for their flagship items, resulting in all the conspicuously sexy-looking stuff (future glasses, phones/tablets sold at a loss with big “Intel Inside” logos on the back, Raspberry Pi-alikes only $100 more expensive, &c.) being saddled with lumbering, power-guzzling x86 CPUs.
posted by acb at 2:15 AM on February 7, 2018


[...] my educated guess is that I won't be even slightly interested in 98% of whatever their touted advantages/selling points end up being.
You could make x-ray specs that really work ... in the sense that they overlay people with personalized porn. I bet that app would be a mover.
posted by Gilgamesh's Chauffeur at 4:05 PM on February 7, 2018




This thread has been archived and is closed to new comments