

"Help me out White Wanda."
December 20, 2009 11:07 PM   Subscribe

"Hewlett Packard computers are racist." [SLYT]

So far, this guy seems to be taking it fairly well. Although he's spoiled the Christmas surprise.
posted by sharpener (100 comments total) 34 users marked this as a favorite

 
I was all prepared to hate on this, but that's pretty fucking funny.
posted by empath at 11:11 PM on December 20, 2009 [9 favorites]


Okay, see, that just made me laugh. It's a contrast issue, right? Please tell me it's a contrast issue.
posted by anitanita at 11:11 PM on December 20, 2009


HP actually commented on the video with a link to their response.
posted by empath at 11:15 PM on December 20, 2009 [3 favorites]


I was about to delete this thinking it'd be really lame, but that shit was funny (and sad when you think about the engineering that went into it, and that not a single person on the team thought to maybe try it with a variety of people).
posted by mathowie at 11:17 PM on December 20, 2009 [5 favorites]


Thanks for the link, empath. I figured it was something like that, but I was struck with how well Desi was taking it. Pretty funny.
posted by sharpener at 11:20 PM on December 20, 2009 [1 favorite]


The litmus test would be moving a photo of Obama in front of the Windows box.
posted by Blazecock Pileon at 11:23 PM on December 20, 2009 [5 favorites]


I was struck with how well Desi was taking it.

Um... Why would he take it "badly"? I mean, I doubt that any rational human being would think there's a line of code in HP's software that says, "Ignore all black people!"
posted by Saxon Kane at 11:27 PM on December 20, 2009 [8 favorites]


The camera is probably just drunk and straight male. Under those conditions visual tracking is reserved for females.
posted by twoleftfeet at 11:27 PM on December 20, 2009 [3 favorites]


Maybe this is a strange thing to comment on, but another pleasant surprise from this post (the first being that I actually really enjoyed that) is that the YouTube comments on this video are kind of like Metafilter populated by 14-year-olds instead of the usual fare. Especially surprising given the open avenue for the usual slurs.

Oh, and my take on the video itself: LOL!!! It actualy only wokrs on white people! Thx 4 this sharpener. Desi is a funny dude :) :) :)
posted by battlebison at 11:30 PM on December 20, 2009


We believe that the camera might have difficulty “seeing” contrast in conditions where there is insufficient foreground lighting.

Holy shit, it really doesn't see black people.

You mean there's a ... Black Wanda?
posted by dirigibleman at 11:36 PM on December 20, 2009 [3 favorites]


Um... Why would he take it "badly"? I mean, I doubt that any rational human being would think there's a line of code in HP's software that says, "Ignore all black people!"

In my mind, the development work is done in the United States, and the QA testing is done in India, and QA keeps filing bugs that the face tracking software doesn't work, and development engineers keep closing the bugs unfixed with the resolution "WORKS FOR ME".
posted by davejay at 11:44 PM on December 20, 2009 [73 favorites]


Funny, but I'm going to guess that the camera would work fine under better conditions. The bright retail lights behind him are causing the camera's aperture to contract, making it hard to represent a dark foreground object. The reflection of the lighting off his forehead is the lightest area on his face, instead of the whites of his eyes, which is probably upsetting the face detection described on the HP blog.
posted by gngstrMNKY at 11:47 PM on December 20, 2009


Um... Why would he take it "badly"? I mean, I doubt that any rational human being would think there's a line of code in HP's software that says, "Ignore all black people!"

Of course. But when I originally found this, I was expecting OMG RACISM OUTRAGE and was surprised to see Desi and Wanda making a joke of an apparent engineering blunder. It was awesome. Look at the first few comments here: we were prepared to "hate" it and "delete" it. When you read the link it seems like someone's going to get all up-in-arms about it. When you watch the video you see people taking it in stride and looking for answers. Much nicer than a knee-jerk reaction.
posted by sharpener at 11:48 PM on December 20, 2009 [3 favorites]


To me, there's also a bit of this going on. I mean it's kinda crazy that computers can recognize faces at all, isn't it?
posted by empath at 11:52 PM on December 20, 2009 [4 favorites]


Thanks for the link, empath. I figured it was something like that, but I was struck with how well Desi was taking it. Pretty funny.
Um... Why would he take it "badly"? I mean, I doubt that any rational human being would think there's a line of code in HP's software that says, "Ignore all black people!"
Yeah, I'm a little confused about why anyone would expect him to take it 'badly'.
In my mind, the development work is done in the United States, and the QA testing is done in India, and QA keeps filing bugs that the face tracking software doesn't work, and development engineers keep closing the bugs unfixed with the resolution "WORKS FOR ME".
You can't really 'test' complicated machine-learning-based software like this; there's no way to test every possible input. The software is no doubt based on the same kind of face recognition software that was originally designed at a university somewhere. Integrators just take the standard library, and the only testing would be to check to see if everything was hooked up properly -- that the image was getting to the software and that the panning/zooming was being controlled properly.

The problems would go all the way back to the original researchers, if they didn't use enough examples of dark faces with light background, specular highlights, etc.
posted by delmoi at 12:03 AM on December 21, 2009


I love this. People being funny and human...at work. I guess I like watching people play. And I'm getting really tired of serious outrage filter all the time. This made me laugh.

It's also neat because it lets you into these two people's lives. You can imagine what the conversation was like before they recorded the video. And what it was like when Desi noticed this thing about the tracking in the first place. He probably pulled Wanda over and was all, "hey, check this out..." and then they killed 15 minutes talking and laughing about it. And decided to record. Then after, I'm sure they talked about it some more. Showed some coworkers, went back to work, whatever. You can also imagine a range of possibilities later. Are Desi and Wanda checking the YouTube view count? Has Desi's wife seen the video yet? Maybe he gave her the Christmas present early, and they've been messing with the tracking feature for hours now. And what's Wanda doing? Has she been reading all the comments? What about their boss? Friends? Coworkers? And meanwhile some folks over at HP are scrambling. Legal has been called. And R&D. Will a disclaimer go out? Has some PR person contacted Desi and Wanda yet?

I could go on forever, but everything about this has me in giggle fits. I'm imagining everything from them watching the video again with their friends, to HP getting into the expected corporate frenzy over this, and I'm just laughing and I can't stop.
posted by iamkimiam at 12:08 AM on December 21, 2009 [31 favorites]


Sorry for the phrase "how well Desi was taking it" everybody. I really didn't mean it as though Desi was a nitwit who wouldn't be able to figure out that HP computers probably weren't ACTUALLY racist. It was more a comment on his delivery than anything else. I will now allow this thread to run its course.
posted by sharpener at 12:11 AM on December 21, 2009


Or what iamkimiam said. Cheers.
posted by sharpener at 12:12 AM on December 21, 2009


Turns out that component is actually manufactured by Veridian Dynamics.
posted by qvantamon at 12:14 AM on December 21, 2009 [14 favorites]


Wow, it's life imitating art. On an episode of Better Off Ted, the building security system failed to recognize black people (because their skin was too dark); hilarity ensued.
posted by selenized at 12:15 AM on December 21, 2009 [2 favorites]


A telling example of why one needs to use a REPRESENTATIVE sample in testing!
posted by fallingbadgers at 12:16 AM on December 21, 2009


when i was 19 i got a job at hewlett packard in vancouver, washington. the factory that i worked in made printers. i don't remember the model numbers. we did a bunch and that was like 13 or 14 years ago. there was one black dude on my production line, greg, super funny dude. obsessed with his fantasy baseball league, cracked lots of jokes. he helped make mind-numbingly boring shifts ok sometimes. there was a smallish run of printers that we did for a while that had a black shell as opposed to the very light gray or whatever the standard was. when we started doing those black printers, greg immediately code-named them "Afro-500's". He also made lots of cracks along the line that as soon as someone bought one of these Afro-500's they'd be so lazy they'd never work for the consumer, and that they were gonna be a huge detriment to HP. funny stuff.
posted by rainperimeter at 12:18 AM on December 21, 2009 [1 favorite]


Um... Why would he take it "badly"? I mean, I doubt that any rational human being would think there's a line of code in HP's software that says, "Ignore all black people!"

It is unlikely that the code is explicitly racist, but it was probably developed and tested by light-skinned Caucasians and Asians. Racist by omission, rather than commission. The real mystery is why there weren't any Dravidians on the QA team.
posted by b1tr0t at 12:23 AM on December 21, 2009


I've only seen Desi and Wanda for 2 minutes, 14 seconds, but MAN, I really like them.
posted by letitrain at 12:32 AM on December 21, 2009 [15 favorites]


This just pisses me off. Because like so many things lately... if you would stop laughing at the system and set it up correctly, you could stop fucking laughing!
posted by autodidact at 12:43 AM on December 21, 2009 [1 favorite]


It seems like as good a metaphor for systemic racism as you're going to find. It's the story of a whole bunch of people who didn't set out to make anything racist (in fact, a bunch of people who'd do just about anything not to get dragged into a discussion of race), who still managed to make a product that's manifestly discriminatory.

I'm sure that it's not that it was never tested on different skin tones. But it does seem clear that it wasn't tested on all skin tones in all lighting conditions. That's the way bad software QA tends to work: you test rigourously for the cases you're worried about, and perfunctorily for the cases that instinctively seem unlikely. Assumptions and quality control - BFFs 4EVER! Next thing you know, Roger Ebert is sitting in Chicago issuing tweets about how your computers are racist.

When things go wrong, and you go back and look at what broke, the first thing that you notice are the faulty assumptions that led to the problem. I wonder which assumptions the team at HP is revisiting tonight?
posted by bicyclefish at 12:58 AM on December 21, 2009 [15 favorites]


When things go wrong, and you go back and look at what broke, the first thing that you notice are the faulty assumptions that led to the problem. I wonder which assumptions the team at HP is revisiting tonight?

Well, it's just an off-the-shelf algorithm; no one at HP wrote this.
posted by delmoi at 1:08 AM on December 21, 2009


It seems like as good a metaphor for systemic racism as you're going to find.

Ralph Ellison 2.0.
posted by Blazecock Pileon at 1:14 AM on December 21, 2009 [1 favorite]


This just pisses me off. Because like so many things lately... if you would stop laughing at the system and set it up correctly, you could stop fucking laughing!

I didn't interpret this as some fault in the system. It's just a software bug that someone made light of! (Pun intended) I actually find this post refreshing for being a very simple post that is truly best of the web and not outrage/news/politics-filter.

I guess it's all in how you interpret things, but I feel you should laugh with them instead of being mad for them.
posted by battlebison at 1:14 AM on December 21, 2009


I think I'm beginning to see where their test department went wrong.
posted by pracowity at 1:29 AM on December 21, 2009 [6 favorites]


I wonder how it handles bald people against various backgrounds, or people with hair colour very close in shade to their skin colour (very dark Africans or Papuans), beards, or eyepatches.

Hmm. This probably means it's prejudiced against PIRATES!

(No great surprise there, for an HP machine ...)
posted by aeschenkarnos at 1:34 AM on December 21, 2009


This reminded me of why I can't use speech recognition software.

(At least, not as of the last time I tried it ...)

I speak fluent English, but I have a BBC presenter's accent. To English ears, that's pretty neutral. But just about all commercial speech recognition software is written for an American drawl, and I just can't do one to save my life. Result: accuracy drops from around 95-99% to 50%. All because the developers ignored accents used by around 30-40% of English speakers worldwide, and focussed on English as she is spoke in the USA.

The pharmaceutical industry hit this one in the 1970s-80s, too, and still haven't gotten over it. Most of your medicines get a product license after human clinical trials that exclude children, geriatrics, and pregnant women -- even though all of these groups metabolize drugs differently from the "default" group of healthy adult men (who make up maybe 20% of the population).

This isn't even a matter of privilege; it's just about not testing your product adequately and being bitten on the arse when its defects meet the real world.
posted by cstross at 1:35 AM on December 21, 2009 [6 favorites]


But just about all commercial speech recognition software is written for an American drawl,
Which American drawl, though? US accents vary quite a bit, especially North/South but still noticeably East/West.
posted by aeschenkarnos at 1:43 AM on December 21, 2009 [1 favorite]


But just about all commercial speech recognition software is written for an American drawl, and I just can't do one to save my life. Result: accuracy drops from around 95-99% to 50%. All because the developers ignored accents used by around 30-40% of English speakers worldwide, and focussed on English as she is spoke in the USA.

What the heck are you talking about? That's not even how speech recognition systems work! And where are you getting these stats?
posted by iamkimiam at 1:50 AM on December 21, 2009 [1 favorite]


me, i wanna know what the user centered design crowd is gonna say, heck, I wanna know what would the norman/neilson gang say? ...

design for the next billion customers, indeed
posted by infini at 2:09 AM on December 21, 2009


That made me laugh, and it's good that HP, Youtube commentators and MeFi people are taking it for the joke it is. Like empath, I went into this thinking I was going to hate it, and came out pleasantly amused.
posted by seanyboy at 2:21 AM on December 21, 2009


Maybe if there was some way to use makeup to increase contrast then... oh wait! I just invented a new PR disaster!
posted by Artw at 2:22 AM on December 21, 2009 [3 favorites]


The first thing I noticed when the video started is that the lighting is terrible in that shot. I'm not surprised the software is having trouble recognizing a face in that murky backlit available light, and it's still only barely managing to get a fix on his more reflective friend. I bet installing a $7 Ikea floor lamp behind the camera would make it work perfectly.
posted by w0mbat at 2:37 AM on December 21, 2009


Nah, the IKEA lamps won't work. However, I'm pretty sure the lamp you need is sold right there in the store, over on aisle 7. HP approved and everything!
posted by iamkimiam at 2:55 AM on December 21, 2009


Ikea lamps are kind of racist. It had to be said.
posted by damehex at 3:05 AM on December 21, 2009 [1 favorite]


Maybe if there was some way to use makeup to increase contrast then... oh wait! I just invented a new PR disaster!

Or possibly some sort of white mask or hood... oh hang on...
posted by fearfulsymmetry at 3:28 AM on December 21, 2009


Hee! I giggled a lot at this. Thanks for sharing.
posted by NikitaNikita at 4:42 AM on December 21, 2009


The best part is how his *entering* the frame negates tracking her. That's the extra bizarreness to me.

Also, Desi and Wanda need to be in a mac commercial or something.
posted by NikitaNikita at 4:45 AM on December 21, 2009 [8 favorites]


Desi has a great voice.

I was going to say it must be a contrast issue, but somehow I suspect it works fine with blonde people. It would be interesting to try to debug exactly what is going on here. It looked like it snapped onto him at least once, but then it doesn't track. So maybe it recognizes him as a face but can't tell when he moves?
posted by DU at 4:52 AM on December 21, 2009




The best part is how his *entering* the frame negates tracking her. That's the extra bizarreness to me.

Or as Desi puts it, "as soon as my blackness enters the frame...."

And that pretty much sums up what's probably happening here. You can tell as he moves in and out that the computer is compensating, exposure-wise, for the "blackness" in the frame so it does "see" him--just not as a face.
posted by availablelight at 4:55 AM on December 21, 2009


I bet it's those overhead track lights behind their heads. There's one row right about at their eye level. With Desi in-frame, the track lights are the brightest things, so the software locks onto those; with Wanda in-frame, her skin reflects enough light that it's able to track her.

This is a guess, but I'm thinking that the face recognition part works on outlines, and then the tracking part works on highlights. When Desi pokes his head into frame, the camera decides that his darker outline is 'the face', but then the highlights make no sense, and it stops tracking properly.

As others have said, with a desk lamp instead, it would probably work fine. I suspect Mrs. Desi won't be upset on Christmas. :)
posted by Malor at 5:25 AM on December 21, 2009


I want Desi to be our second black president!!!!
posted by The Deej at 5:28 AM on December 21, 2009


The technology we use is built on standard algorithms that measure the difference in intensity of contrast between the eyes and the upper cheek and nose. We believe that the camera might have difficulty “seeing” contrast in conditions where there is insufficient foreground lighting.

Racists have been using this tired excuse for decades now.
posted by Marisa Stole the Precious Thing at 5:49 AM on December 21, 2009 [27 favorites]


The technology we use is built on standard algorithms that measure the difference in intensity of contrast between the eyes and the upper cheek and nose. We believe that the camera might have difficulty “seeing” contrast in conditions where there is insufficient foreground lighting.

I saw this in the official HP response as well. Will someone who knows more about the technology behind this explain why there wouldn't be higher contrast present (i.e. difference between eyes and face) with a person of color?

Also, I thought it was interesting that even when his face was close up to the monitor, and thus properly exposed with plenty of detail, the tracking function still didn't work.
posted by availablelight at 5:55 AM on December 21, 2009 [1 favorite]


THAT'S RACIST.
posted by Eideteker at 6:01 AM on December 21, 2009


there's something i'm not getting - why is all the print backwards as if the camera was some kind of mirror?
posted by pyramid termite at 6:17 AM on December 21, 2009


I really liked Wanda and Desi. They looked like they had a good time making fun of something weird they found out by accident. I bet they are really, really surprised at how popular the video has become. Or what iamkimiam said, I guess.
posted by gemmy at 6:25 AM on December 21, 2009


Will someone who knows more about the technology behind this explain why there wouldn't be higher contrast present (i.e. difference between eyes and face) with a person of color?

Watch the video. Because of the light placement, Desi's eye whites are in shadow while his cheeks and nose have an almost specular reflection. This puts them pretty close to each other.
posted by DU at 6:30 AM on December 21, 2009
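HP's blog statement and DU's explanation describe the same mechanism: a simple intensity-difference feature (the kind of Haar-like comparison used in Viola-Jones-style detectors) that compares eye-region brightness against cheek/nose brightness. A toy sketch of why backlighting defeats it, in Python; every pixel value and the acceptance threshold here are invented for illustration, not taken from HP's actual algorithm:

```python
def region_mean(img, rows, cols):
    """Mean grayscale intensity over a rectangular region."""
    vals = [img[r][c] for r in rows for c in cols]
    return sum(vals) / len(vals)

def eye_cheek_contrast(img, eye_rows, cheek_rows, cols):
    """Haar-like feature: cheek brightness minus eye brightness.
    A frontal face normally scores well above zero (bright cheeks,
    darker eye sockets)."""
    return region_mean(img, cheek_rows, cols) - region_mean(img, eye_rows, cols)

THRESHOLD = 25  # made-up acceptance threshold

# Well-lit face crop: eye region noticeably darker than the cheeks.
well_lit = [
    [ 60,  55,  58,  62],   # eye row
    [ 63,  59,  61,  57],   # eye row
    [140, 150, 145, 148],   # cheek row
    [142, 151, 147, 144],   # cheek row
]

# Backlit crop: eye whites in shadow while a specular highlight on the
# cheeks/nose pushes both regions toward similar values.
backlit = [
    [ 40,  38,  42,  41],
    [ 39,  43,  40,  44],
    [ 55,  52,  58,  54],
    [ 53,  57,  51,  56],
]

rows_eye, rows_cheek, cols = (0, 1), (2, 3), (0, 1, 2, 3)

for name, crop in [("well_lit", well_lit), ("backlit", backlit)]:
    score = eye_cheek_contrast(crop, rows_eye, rows_cheek, cols)
    print(name, round(score, 1), "face" if score > THRESHOLD else "no face")
```

The backlit crop still clearly contains a face to a human eye; it just no longer contains the one intensity relationship this kind of feature is looking for.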


My Packard Bell burned a cross on my lawn. Now that is racist.
posted by KevinSkomsvold at 6:57 AM on December 21, 2009 [7 favorites]


My research group did a bunch of work with facial tracking. It's usually done in monochrome instead of color, in the infrared band, as IR tends to be sensitive to the texture of skin rather than its brightness. For this, there needs to be IR in the room. We used IR emitters, but HP may be assuming the presence of incandescent illumination. The fluorescents in that room are going to be putting out very little IR.

Just a theory.
posted by rlk at 7:01 AM on December 21, 2009 [3 favorites]


there's something i'm not getting - why is all the print backwards as if the camera was some kind of mirror?

It's exactly for that reason. When the computer displays your video, you expect it to act like a mirror, so the video is flipped. Macs behave the same way with the "Photo Booth" software. Perhaps they should have flipped it in post-production, or maybe it doesn't matter that much.

Point being, it'd be weird to be watching yourself on the computer, raise your right hand, and see the "reflection" raise its "right hand" on your left side.
posted by explosion at 7:05 AM on December 21, 2009
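The mirroring explosion describes is just a horizontal flip of each video frame before display, which is why on-screen print reads backwards; a minimal sketch (treating a frame as rows of invented pixel values):

```python
# A tiny 2x3 "frame": each inner list is one row of pixels.
frame = [[1, 2, 3],
         [4, 5, 6]]

# Mirror-style preview: reverse each row so the display acts like a mirror.
mirrored = [row[::-1] for row in frame]
print(mirrored)  # [[3, 2, 1], [6, 5, 4]]

# Flipping again restores the original, which is why software could
# un-mirror the recording in post-production.
restored = [row[::-1] for row in mirrored]
```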


Why does White Wanda get tagged, but not Black Desi?
posted by mccarty.tim at 7:05 AM on December 21, 2009 [1 favorite]


Okay, I don't want to ruin it for anyone, but it turns out . . . Wanda is dead.
posted by The Bellman at 7:09 AM on December 21, 2009 [2 favorites]


The video is humorous and HP’s webcam’s failure to recognize Desi’s face is likely the unintended consequence of several factors including inadequate testing, non-ideal lighting conditions, and insufficient QA, blah, blah, blah.

What I find interesting is the response of MeFites in the thread. Many of us seem to be pleased Desi is not angry, that he's actually funny, that he is not outraged at yet another example of the non-person status his skin confers upon him in the eyes of others.

In other words, people are cool with Desi as long as he doesn’t get pissed. Harriet Beecher Stowe wrote a several-hundred-page novel about this phenomenon and those of us who disdain the angry oppressed might want to revisit the post-1960s critical reception of the “novel that started the Civil War” to understand why your breathless relief at Desi’s humor is yet another example of unconscious racism.

I’m part black. I’m not pissed at HP for this failure of technology. I am irritated, though, that many of you who seem intelligent, thoughtful people are basically saying “I’m so glad he’s not uppity.”
posted by mistersquid at 7:14 AM on December 21, 2009 [36 favorites]


thoughtful people are basically saying “I’m so glad he’s not uppity.”

Really? I don't think that's what people are saying.
posted by zippy at 8:10 AM on December 21, 2009


thoughtful people are basically saying “I’m so glad he’s not uppity.”

Would this be where sharpener made an off-hand comment that he later retracted and apologized for? Seriously -- where else did you see this? Projecting... what?
posted by cavalier at 8:14 AM on December 21, 2009


thoughtful people are basically saying “I’m so glad he’s not uppity.”

Really? I don't think that's what people are saying.


Can you losers drop this thread? You're really making the whole site look like a bunch of up-tight ninnies.
posted by xmutex at 8:17 AM on December 21, 2009


In other words, people are cool with Desi as long as he doesn’t get pissed.

Wow. Those are some pretty arrogant assumptions on your part. I would apply that principle to anyone I come in contact with, white or black.
posted by KevinSkomsvold at 8:18 AM on December 21, 2009 [1 favorite]


xmutex: "Can you losers drop this thread? You're really making the whole site look like a bunch of up-tight ninnies."

Sorry but that ship has sailed.
posted by KevinSkomsvold at 8:19 AM on December 21, 2009 [2 favorites]


The default reaction to a perceived slight need not necessarily be anger. Especially when you're smart enough to recognize that there may be factors, in this case technological, at work which you don't understand.

Desi and Wanda used this video to raise a question. The accusation he makes at the end is clearly tongue in cheek.
posted by hermitosis at 8:23 AM on December 21, 2009 [4 favorites]


metafilter: a bunch of up-tight ninnies.
posted by empath at 8:31 AM on December 21, 2009 [1 favorite]


HP - MADE BY THE MAN!
posted by louieyak at 8:41 AM on December 21, 2009


“I’m so glad he’s not uppity.”

Gaak.

The guy is obviously angry/offended/outraged. By making fun of the problem - HP's problem - he shifts the spotlight away from himself (as an individual) and onto the problem (this computer is racist - and any way you want to slice it, no matter how much they try to explain it away with "... insufficient contrast in the blahblah blah blahohshitohshitohshit..." the computer does not recognize the black guy (it's so insane I can't quite wrap my brain around it)). Far more damning than if he makes it about his hurt feelings or about re-stating the institutionalized racism endemic in America. We know his feelings are hurt, but he's also zinged the shit out of HP, who should be scrambling like mad and who, if their PR people have any brains at all, should hire Desi to be in their next series of ads. (Granted, they'll probably have him pop up out of a box and say, "No-sir-ee, no racists in this here box!" or maybe do a tap-dance number... well, look at their prior example.)

I don't really know what 'uppity' would look like, but I did not think he was denying or playing down any part of himself or his feelings in an effort to not put people off...
posted by From Bklyn at 8:43 AM on December 21, 2009


"Hi! It looks like you're trying to write a lett.... Um, hello? I could've sworn someone was... Oh hey, there you are! Hi! It looks like you're trying to write a letter."
posted by xedrik at 8:44 AM on December 21, 2009 [8 favorites]


I'm really amazed that the face tracking software worked for anyone! I spent a couple of hours playing with it, and I couldn't confirm that it wasn't just picking random bits of the image. And I'm very, very white.

If it were my job to do this -- presumably on a tight budget and without adequate time to research the literature -- I'd start with a library of a bunch of faces, crop them to just eyes+cheekbones, and convolve the input image with the library until I get a sufficiently high score. That would definitely have some problems when the person in the image isn't holding their head perfectly straight, when there are two faces in the image, or if the person had a skin tone that wasn't represented in the library. So I think Wanda and Desi may have a valid point.
posted by miyabo at 8:55 AM on December 21, 2009
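miyabo's brute-force idea - slide cropped face templates over the image and score every position - can be sketched with a sum-of-squared-differences match. This is a pure-Python toy, not any vendor's algorithm; real systems use normalized correlation or trained classifiers, and all the pixel values below are invented:

```python
def ssd(patch, template):
    """Sum of squared differences between two same-size patches.
    Lower score = better match."""
    return sum((p - t) ** 2
               for prow, trow in zip(patch, template)
               for p, t in zip(prow, trow))

def best_match(image, template):
    """Slide the template over every position; return (score, row, col)
    of the best-matching location."""
    th, tw = len(template), len(template[0])
    best = None
    for r in range(len(image) - th + 1):
        for c in range(len(image[0]) - tw + 1):
            patch = [row[c:c + tw] for row in image[r:r + th]]
            score = ssd(patch, template)
            if best is None or score < best[0]:
                best = (score, r, c)
    return best

# 2x2 "eyes+cheekbones" template: dark top (eyes), bright bottom (cheeks).
template = [[50, 50],
            [150, 150]]

image = [[90, 90, 90, 90],
         [90, 52, 48, 90],    # the "face" sits at row 1, col 1
         [90, 148, 152, 90],
         [90, 90, 90, 90]]

score, r, c = best_match(image, template)
print(r, c)  # 1 1
```

The failure mode miyabo predicts falls straight out of this sketch: if no template in the library has intensities anywhere near the face in the image, every position scores badly and the "best" match is effectively random.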


Why does White Wanda get tagged, but not Black Desi?

Added a few more tags (including desi). Post title is a (paraphrased) quote from Desi (see video).
posted by sharpener at 8:56 AM on December 21, 2009


What's fabulous about Desi and Wanda's reaction is not their anger or lack of it, but that they go the scientific method route, testing it out.

They just show the world the evidence and up til that last 'HP computers are racist' minute - let everyone else draw their own conclusion. That's scathingly funny in itself because, of course, an inanimate object can't be racist, but Nikitanikita's right - The way he tries to slowly slide back into the frame, which stops the tracking function, is killer. It's like that little racist computer is just saying, "Aw, hell no, Black man. Don't think I don't see you over there. Cause I don't".

And that's the moment, where if I was HP, well, that's the moment I'd be thinking "Oh, fuucccckkk...." and get myself to writing a response.

They could have both been super angry on camera, but if he'd still concisely and compellingly shown the nature of the problem, he probably still would have grabbed me. But both of them go the satire route, and by the time "white Wanda" enters the picture, you're kind of hooked.
posted by anitanita at 8:56 AM on December 21, 2009 [5 favorites]


I'd love to be a fly on the wall at the QA meeting for the facial recognition software where this video comes up.
posted by immlass at 9:18 AM on December 21, 2009 [1 favorite]


...more like Hewlett Wackard!
posted by orme at 10:05 AM on December 21, 2009


I was involved with someone who worked PR for a not to be named computer company. They were working on a promotion for MLK day. They drew up all sorts of slides and designs with various black people using their products. Evidently, it was then pointed out that almost none of their prior promotional material included any black people, and to include them on a holiday honoring a civil rights leader would be racist. So they went with a design that didn't include any people, just the products.

Interesting how racism can be so insidious and banal.
posted by elwoodwiles at 10:12 AM on December 21, 2009 [1 favorite]


Just watch. HP is going to fix it, but combine the face detector with a race detector, and whenever it detects a person it thinks is black, it will switch Windows over to Windows In AAVE (voiced by Stepin Fetchit) to better assist the user.
posted by ROU_Xenophobe at 11:16 AM on December 21, 2009


Hey - this is like the second time I've had an informed opinion on a Metafilter topic!

I've actually written racist software. It was a facial recognition system for a large corporation - one you've heard of. We were just doing the enrollment, gathering the initial photographs and facial templates (along with demographic information, etc); future applications were to be developed to actually use that data. Although, I suspect no use was ever made of the system - matching face images works so poorly that it's almost useless. Face-finding, on the other hand, usually works pretty well - we use it all the time to capture pictures that correspond to various ID photo standards. The software can tell if you're facing forward, your eyes are open, you're wearing glasses, the face is centered in the image, your teeth are showing, there's a regulation 10% (e.g. - I don't have the ICAO standard memorized or anything) of open space surrounding the face, etc.

As was pointed out above, you just buy the matching algorithm - there are a couple of commercial vendors out there - it's highly likely that HP uses the same algorithm that we did.

It's definitely a contrast issue. The face-finding algorithm needs to find facial features, and if it can't see them, it can't find a face. We calibrated the system on our developers and test staff, then tried it out on the entire company. And it couldn't even capture a template for our director of marketing, a darker-skinned African-American woman. You can't imagine how humiliating that was - I can still feel it.

It turns out, it's *really hard* to get adequate contrast for both dark skinned and light skinned people under normal lighting conditions. Even using a really good, professional quality flash, light-skinned people are going to be washed out or dark-skinned people are going to be a black blob. We eventually positioned insanely bright umbrella lights on either side of the subject, but that's obviously not a solution available to HP.

The problem isn't that Desi has dark skin - it's that his skin tone is markedly different from how the system is calibrated. White privilege is knowing that biometric systems are engineered with you in mind.
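To put rough numbers on the trade-off: one fixed camera gain has to serve faces whose reflectance differs by several-fold, inside an 8-bit sensor range. The figures below are made up for illustration - this is a toy model, not the vendor's algorithm.

```python
# Toy model of the exposure trade-off. All reflectance numbers are
# illustrative, not measured from real skin tones.

def captured(albedo, gain):
    """8-bit pixel value for a surface of the given albedo at a fixed gain."""
    return min(255, round(albedo * gain))

# Hypothetical reflectances on a 0-100 scale: skin vs. the darker facial
# features (eyes, brows) that a face-finder needs to distinguish.
light_skin, light_features = 90, 60
dark_skin, dark_features = 15, 8

# Pick the largest gain that doesn't blow out the lightest skin.
gain = 255 / light_skin

light_contrast = captured(light_skin, gain) - captured(light_features, gain)
dark_contrast = captured(dark_skin, gain) - captured(dark_features, gain)

print("light-skin feature contrast:", light_contrast)  # dozens of grey levels
print("dark-skin feature contrast:", dark_contrast)    # far fewer grey levels
```

Cranking the gain up to fix the dark-skinned case just clips the light-skinned one to solid white, which is why we ended up throwing more light at the subject instead of fiddling with the camera.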
posted by bonecrusher at 11:18 AM on December 21, 2009 [22 favorites]


We eventually positioned insanely bright umbrella lights on either side of the subject, but that's obviously not a solution available to HP.

Maybe they could add a 200W halogen light, mounted on top of the webcam? I'm just spit-balling here, of course. If HP wants to talk numbers, they know how to find me.
posted by Marisa Stole the Precious Thing at 11:27 AM on December 21, 2009


biometric systems are engineered with you in mind.

And things like bandaids & haircut places and a bunch of other stuff. The bandaid thing has been addressed in recent years, but I can still walk into many national haircut chains and get that same "we don't see you" look from a real live human being.
posted by cashman at 12:07 PM on December 21, 2009


Desi is charming. Wanda is game. The situation is funny.

Sincerely,
TG_Plackenfatz, Black tastemaker

Racism solved!!!
posted by TG_Plackenfatz at 12:25 PM on December 21, 2009 [2 favorites]


Thank you for the anecdote, bonecrusher. You know, the US has a long way to go on this issue, but it has also come far. There was a time when it was unthinkable that an African-American person could be a corporate director. Back then, the omission might never have been discovered. Even if it had been, nobody directly involved would have had to face an actual person triggering the omission as it happened. Maybe nobody would have felt that uncomfortable feeling.

Sure, maybe all HP did was pick a recognition algorithm out of a list, but caveat emptor. They put their name behind the product they sell. Let's not shift to their suppliers their responsibility for making a working product.

There is justification for upset here, whatever its magnitude. What must it look like to an average computer buyer? Average computer buyers aren't responsible for knowing about contrast, algorithms, industry standards, suppliers, and whatnot. The people in the video have the right to be upset. Maybe they are and maybe they aren't... if they aren't, good on them for maintaining their composure, but the option is available to them without aspersion.

White privilege (PDF) and white privilege discussions previously on Metafilter: 1, 2, 3, 4, 5, 6
posted by halonine at 12:35 PM on December 21, 2009 [1 favorite]


This just pisses me off. Because like so many things lately... if you would stop laughing at the system and set it up correctly, you could stop fucking laughing!
posted by autodidact at 12:43 AM on December 21

So, you're saying... you'd teach yourself how to do it properly?
posted by Sebmojo at 12:41 PM on December 21, 2009


>: I am irritated, though, that many of you who seem intelligent, thoughtful people are basically saying “I’m so glad he’s not uppity.”

Are you kidding?
posted by dunkadunc at 12:43 PM on December 21, 2009


Ever since they started putting cameras in labtops this has been an issue. I remember getting my Macbook Pro back in 2007 and realizing that half of the photobooth filters did nothing for people with dark skin.
posted by Rubbstone at 12:46 PM on December 21, 2009


It has less to do with "uppity" and more to do with everyone feeling they need to rage on the internet in order to get attention. Color does not apply.
posted by june made him a gemini at 12:58 PM on December 21, 2009


Clearly these computers need to be fitted with face-reading sonar.
posted by Artw at 1:05 PM on December 21, 2009


And things like bandaids

I've never really understood this claim. Yeah, Band-Aids are closer to "white" than "black," but they're not particularly close to actual skin tone, unless you count "awful orange fake tan."

I had always just assumed that the color of Band-Aids was due to the plasticky material they're made of - if they actually wanted it to be "skin color," it'd be closer.
posted by explosion at 1:07 PM on December 21, 2009 [1 favorite]


What the hell is a labtop, anyway?
posted by dunkadunc at 1:15 PM on December 21, 2009


I had always just assumed that the color of Band-Aids was due to the plasticky material it's made of, just because if they actually wanted it to be "skin color," it'd be closer.

These products used to be marketed as "Flesh-toned". But they only came in one color.

Same with the Crayola crayon color, "Flesh".
posted by hermitosis at 1:16 PM on December 21, 2009 [1 favorite]


You can't really 'test' complicated machine learning based software like this, there's no way to test every possible input.

Was joking, FYI.
posted by davejay at 1:38 PM on December 21, 2009


seen today: levi's, toyota
posted by infini at 2:02 PM on December 21, 2009


I think the larger issue here is why you'd *want* the camera to "follow" you the way it's apparently supposed to. I found the constant panning and zooming while Wanda was in the frame extremely irritating and mildly queasifying.

Although in fairness, I guess most people wouldn't be moving around like that when using the thing.
posted by Target Practice at 2:11 PM on December 21, 2009 [1 favorite]


I wish that software wouldn't recognize me, in that most facial recognition applications are so potentially intrusive and Big-Brother-ish. I'd say Desi scored.
posted by tkchrist at 2:27 PM on December 21, 2009 [1 favorite]


Same with the Crayola crayon color, "Flesh"

I always thought that colour should be much redder, perhaps with some white marbling ...
posted by nonspecialist at 4:37 PM on December 21, 2009 [2 favorites]


MSNBC: Is facial recognition software discriminatory?
posted by ericb at 5:32 PM on December 21, 2009


Photons are the most racist sub-atomic particle.
posted by Sparx at 5:37 PM on December 21, 2009 [2 favorites]


Photons are the most racist sub-atomic particle.

Shit. They ain't even a real particle.
posted by tkchrist at 5:41 PM on December 21, 2009 [4 favorites]


OK, now let's see if it works on Paul Karason.
posted by aeschenkarnos at 6:22 PM on December 21, 2009


MSNBC: Is facial recognition software discriminatory?

Nice of MSNBC to leave out any and all indication that Cryer was being facetious in calling the cam "racist". At least they linked to the YouTube vid in question, but come on.
posted by Marisa Stole the Precious Thing at 6:32 PM on December 21, 2009 [1 favorite]


I read this review of Sony's Robot Cam yesterday, which also compares facial recognition to flaky voice recognition and notes its "colour-blindness". From the article:

Nor was its face-recognition software foolproof. The robo-cam was thrown by decoys such as posters of TIME magazine covers, and it had an almost offensive tendency to ignore human subjects with dark skin tones. The WX1 in particular had trouble establishing what Sony refers to as "optimal picture composition," zooming in and out repeatedly on a motionless subject, like a morally divided Peeping Tom. And it can have fickle taste, sometimes snapping 20 shots of one target, sometimes ignoring someone standing right in front of it.

posted by goshling at 4:47 PM on December 23, 2009




This thread has been archived and is closed to new comments