As if we all have the same online experience
December 22, 2015 7:15 AM

One day Harvard professor Latanya Sweeney googled herself with a reporter friend sitting next to her. An ad popped up inquiring about her arrest record. She had never been arrested. "It must be because you have one of those Black Names!" the friend said. "That's impossible," she replied, "Computers can't be racist." But then she started doing research.

Full presentation by the Ford Foundation on algorithmic bias.
posted by Potomac Avenue (60 comments total) 44 users marked this as a favorite
 
This is such a great FPP, thank you. My only concern is that it shouldn't turn out to be a double of some sort because the subject feels familiar? Was there a previously?
posted by infini at 7:21 AM on December 22, 2015


Kids, when I was your age we would argue with a straight face that algorithms were politically neutral and correlation wasn't causation.
posted by mhoye at 7:27 AM on December 22, 2015 [7 favorites]


You may be thinking about this post about Black Box Society.
posted by jedicus at 7:27 AM on December 22, 2015 [1 favorite]


Previously: Equations can't be racist
posted by andoatnp at 7:29 AM on December 22, 2015 [3 favorites]


Yes, this is great, thanks.

A few years ago, out of nowhere, I started getting ads like this on Facebook - asking me about arrest records, and advertising grills (not the method of preparing food, but the blinged out gold teeth). It was super weird and a dramatic deviation from the ARE YOU INFERTILE?? ARE YOU SURE?????? AND ALSO HOW ABOUT THIS ENGAGEMENT RING ISN'T IT PRETTY???? ads that I've been getting ever since I turned thirty. I hit the 'hide this ad' button every time they popped up, and eventually they went away. Recently, I did that thing where you can find out everything Facebook associates with you - and it turned out that Facebook had me listed as African-American, which I am not.

Can computers be racist? Fuck yeah, they can. Exactly as racist (and sexist and ageist) as the people who program them.
posted by pretentious illiterate at 7:35 AM on December 22, 2015 [55 favorites]


Recently, I did that thing where you can find out everything Facebook associates with you -

Please advise.
posted by infini at 7:38 AM on December 22, 2015 [4 favorites]


Probably both, jedicus and andoatnp, since I notice my ...ahem... participation in those threads.
posted by infini at 7:41 AM on December 22, 2015


People who program computers can be racists, so it's no stretch that a human-produced computer algorithm can be racist.

Computer searches and big data can do "fuzzy logic" searches, pattern matching, etc., but it's always humans driving computer decisions.

That said, it's not perfect. I was getting ads sent my way about reducing the costs on my executive helicopter, and I am middle class.
posted by Deep Dish at 7:44 AM on December 22, 2015


The Q&A in the video is amazing.
Someone asks what the reaction is among engineers when confronted by these kinds of facts.
Alvaro Bedoya says "Oh they often change things when confronted by them. Facebook immediately removed the payday ads that get served to their black users..."
But then Prof. Sweeney goes... "Sure, because payday loans aren't really a piece of their revenue stream. They are a lot less excited about letting you hide your location."

Really annoys me how up in arms people get about the government storing their email somewhere but think it's totally fine that everyone has a device in their pocket that businesses can (and do) use to discriminate against various groups, including for instance, sick or disabled people.
posted by Potomac Avenue at 7:48 AM on December 22, 2015 [6 favorites]


Facebook's guesses about you: Ad Preferences
posted by stobor at 7:49 AM on December 22, 2015 [18 favorites]


A few names did not follow these patterns: Dustin, a name predominantly given to white babies, generated an ad suggestive of arrest 81 to 100 percent of the time.

I wonder what the story with that is. Have there been some troublemaking Dustins skewing the statistics to the point that ad-targeting algorithms picked it up?
posted by acb at 7:50 AM on December 22, 2015 [1 favorite]


Yes, Infini, stobor's got it.

More info here.
posted by pretentious illiterate at 7:52 AM on December 22, 2015 [4 favorites]


Have there been some troublemaking Dustins skewing the statistics to the point that ad-targeting algorithms picked it up?

Dustin Diamond.
posted by maryr at 7:53 AM on December 22, 2015 [3 favorites]


Um, Facebook might think I am an astronaut?
posted by maryr at 7:58 AM on December 22, 2015 [3 favorites]


Another previously: Search algorithms have learned our nefarious ways, about gender differences in ads for CEO jobs.
posted by Dashy at 7:59 AM on December 22, 2015


There should be a paid app you could sign up for that periodically spews false data about you onto the internet. Once enough people have signed up for it, data brokering becomes useless, or at least so expensive that it's no longer ridiculously simple to pinpoint someone's demographic and use it against them.
posted by Potomac Avenue at 8:00 AM on December 22, 2015 [29 favorites]


Nothing racist about Adblock.
posted by straight at 8:00 AM on December 22, 2015 [7 favorites]


Watch the video, they get asked about that. Adblock blocks you from seeing the ad, you're still being profiled and your activity tracked. It's just hiding it from you.
posted by Potomac Avenue at 8:02 AM on December 22, 2015 [5 favorites]


I'm trying to remember if I've ever met a Dustin that DIDN'T have an arrest record...
posted by elsietheeel at 8:02 AM on December 22, 2015 [1 favorite]


Huh. Under Ad Preferences, Facebook has me as "Politics (Very Liberal)"

This is interesting to me, mostly because I don't think I've liked anything on Facebook beyond a few friends' posts and some bands I liked on the first day I set it up.

Don't get me wrong, (Very Liberal) is accurate, I just don't get how Facebook knows that.
posted by Mooski at 8:03 AM on December 22, 2015 [1 favorite]


Private Eye magazine runs a short feature each issue ironically entitled "Malgorithms - excellence in contextual advertising." From this fortnight's issue:

CBS News Headline: "Arrests after 'act of terrorism' against Black Lives Matter."
Accompanying ad for eBay: "Black Friday. Great deals all week, #MyLittleBigVictory"
posted by marienbad at 8:06 AM on December 22, 2015


Don't get me wrong, (Very Liberal) is accurate, I just don't get how Facebook knows that.
Probably inferred from who you're friends with.
posted by LogicalDash at 8:06 AM on December 22, 2015 [3 favorites]


But if everyone used Adblock, that profiling data would be worthless, and there would be much less incentive for corporations to collect it.
posted by straight at 8:08 AM on December 22, 2015


Of course computers can be racist.
posted by delfin at 8:10 AM on December 22, 2015


But this is the same inane process that produces Netflix's suggestions - "Hey, you watched 'Sherlock', so you're going to love 'Transformers'!" I'm not sure it actually means anything.

pa: There should be a paid app that periodically spews false data onto the internet about you that you could sign up for.

Sorry I can't find it now, but I heard a radio piece about two apps that do something like this. One clicks on every ad that shows up in your browser even if they're hidden, and the other spews recycled data everywhere you go.

And then there's this guy: "But online advertising is a train wreck waiting to happen."
posted by sneebler at 8:12 AM on December 22, 2015 [3 favorites]


Facebook has listed one of my hobbies as "stress". About right.
posted by one of these days at 8:20 AM on December 22, 2015 [8 favorites]


Added point of info: if you go to a page with a "like us on Facebook" button, Facebook knows you were on that page.
posted by idiopath at 8:24 AM on December 22, 2015 [7 favorites]


What matters is whether it's fair to just leave things functioning in a way that makes a whole lot of people feel alienated and angry. The question of whether algorithms can be fair or unfair is not that interesting.

Maybe it's time we started to really think hard about what it means to be "fair". The idea of fairness is so important, so fundamental, that studies with (I think) dogs and primates have demonstrated that even they react to perceived unfairness.

At the same time, nature isn't "fair". Some people, and some animals, are just born in a better (seeming) position than others: healthier parents, beauty, straighter teeth, better prenatal nutrition. That's extremely upsetting whether you are among the more fortunate or the less fortunate. In humans, that kind of upset can lead to depression, anger, hate, and just completely giving up. Our whole way of life depends on believing that most people at least wish life to be fair, and that they are willing to go at least a little out of their way to make it fair.

It can be infuriating to hear an authority figure saying, either to you or to someone else in front of you, "Well, honey, life's not fair." The implication is that you have to just accept this, swallow your anger, do what other people think you should do, and be grateful that you're allowed to exist and to have what little joy you're granted. I've thought a great deal about this. Maybe nature isn't fair, but we are powerful creatures and we don't just have to accept that.

Perhaps the purpose of civilization is to work as hard as we can to make things more fair. Perceived or real unfairness can lead to extreme anger, conflict, even wars of conquest. Life, nature, and some people are so unfair, so incredibly unfair, that any space without that kind of conflict is a kind of miracle -- and yet we've mostly trusted each other, and even loved each other, and built things together. How have we managed to do this in the face of massive unfairness?

I think that what's helped is the efforts of countless people -- teachers, friends, acquaintances, strangers -- to impose fairness on an unfair world. Is that kid too shy to push her way to the class cookie plate? Make the kids go in an orderly fashion. Is that other kid from a home that hasn't taught him to read before school starts? Get him extra help. There are so many reasons this is important, even though the reasons aren't always articulated well.

All that is to say that it does take extra effort to be fair, and that extra effort is necessary. Whether algorithms can be racist or not, it's probably not _fair_ to make people feel bad just because of their name, or their friends, or their interests. We may have to add something extra to make life more fair, to make the algorithms treat people fairly. But it matters a lot.
posted by amtho at 8:36 AM on December 22, 2015 [6 favorites]


Garbage in, garbage out.
posted by rtha at 8:39 AM on December 22, 2015 [2 favorites]


And for the first time in a very long time, I'm taking a look at my facebook ad preferences, and under Business and Industry, listed orgs are split pretty evenly between queer stuff and bird stuff. Surprise!
posted by rtha at 8:42 AM on December 22, 2015 [2 favorites]


Just looked at my ad preferences on Facebook, and holy shit are they wrong on me, but man that explains why the ads are always bizarrely disconnected from anything I'd like. For some reason Facebook thinks I'm conservative and am interested in the Sean Hannity show.. nope.
posted by KirTakat at 9:05 AM on December 22, 2015


Added point of info: if you go to a page with a "like us on Facebook" button, Facebook knows you were on that page.

Privacy badger helps with that.
posted by infini at 9:06 AM on December 22, 2015 [3 favorites]


I looked up that FB stuff. It's so funkily out of whack - it's got my primary browser wrong and I've never downloaded the one they have listed for me - that I'm just going to leave it be that way.
posted by infini at 9:09 AM on December 22, 2015


Putting myself back on track, I'm not entirely surprised to find that a learning algorithm picked up racist associations. Machine learning algorithms are more or less actively trying to confirm biases. There's a kind of positive feedback loop where you are suggested mostly things that are similar to what you already like or think, since that's the best way to get you to click something. Then the machine brain uses your liked or clicked-on stuff to determine whether to offer that same correlation to another person. So you get these closed-in groups of strongly-associated terms that cluster together more like discrete bubbles than a wide web. The thing is that the strengths of these correlations are directly and indirectly influenced by the suggestion algorithm itself.

So if we have an algorithm learning from reading data or human activity that shows a racist bias, and people engage with that bias, the algorithm will learn that confirming this bias is an effective way to get clicks. Then it'll display this sort of thing more often to other people, further entrenching the bias in others. It's both a symptom and a cause at once.
posted by one of these days at 9:12 AM on December 22, 2015 [8 favorites]
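The positive feedback loop described in that comment can be sketched in a few lines. This is a toy model with invented numbers, not any real ad system: an ad server that reallocates impressions in proportion to each group's clicks turns even a small initial difference in click rate into a near-total skew.

```python
# Toy model of a click-driven serving loop (all numbers invented).
# Impressions are reallocated each round in proportion to clicks,
# so a small initial click-rate gap compounds round after round.

def serve(rounds=20, ctr=None):
    if ctr is None:
        # Hypothetical per-group click-through rates for a biased ad:
        # group_b clicks just 20% more often at the start.
        ctr = {"group_a": 0.010, "group_b": 0.012}
    weight = {g: 1 / len(ctr) for g in ctr}  # start with an even split
    for _ in range(rounds):
        clicks = {g: weight[g] * ctr[g] for g in ctr}
        total = sum(clicks.values())
        weight = {g: clicks[g] / total for g in ctr}  # reallocate by clicks
    return weight

final = serve()
# After 20 rounds, group_b's share is (1.2**20) / (1 + 1.2**20), about 0.97:
# a 20% behavioral difference has become roughly 97% of all impressions.
```

Nothing in the loop "knows" anything about the groups; the skew emerges purely from optimizing for clicks, which is the symptom-and-cause dynamic the comment describes.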


Heh. Facebook apparently knows next to nothing about me. Aside from OS/browser, it has me as Gen X (correct), family-based household (nope!) and newlywed (nope!)

Facebook shall continue to know little about me, because these days I see a lot fewer updates about my friends' and families' lives, and a whole ton of reposted shit from friends of friends. It just isn't worth visiting FB any more.

Edit: FB and YouTube both serve up the criminal record expunging ad frequently. I think they believe everyone is a criminal by default.
posted by five fresh fish at 9:18 AM on December 22, 2015 [1 favorite]


The people Facebook associates with me:
Woody Guthrie
Akbar (as in the 3rd Mughal emperor, not the Admiral from Star Wars)
And no one else
I am both pleased and very confused.
posted by vorpal bunny at 9:24 AM on December 22, 2015 [3 favorites]


Mod note: Totally fine to talk about but at this point, maybe folks' findings from "what does Facebook think I'm into" can go over to the open thread about that, and we can kind of nudge this one back onto the subject of the links.
posted by LobsterMitten (staff) at 9:27 AM on December 22, 2015 [5 favorites]


Can computers be racist? Fuck yeah, they can. Exactly as racist (and sexist and ageist) as the people who program them.

One of these days got this, but while you often see the issue framed as you did, arguably the scarier possibility is that these days the algorithms are "learning" to be racist (just like the rest of us).
posted by atoxyl at 9:42 AM on December 22, 2015 [2 favorites]


Great, so just like with people, no matter how much computers claim to not even see color because they were raised to treat everybody the same, the truth is that they sure as hell see it and make all sorts of judgements as a result.
posted by lord_wolf at 9:43 AM on December 22, 2015 [1 favorite]


I looked up that FB stuff. It's so funkily out of whack - it's got my primary browser wrong and I've never downloaded the one they have listed for me - that I'm just going to leave it be that way.

Probably should change your password if you haven't recently!
posted by mantecol at 9:44 AM on December 22, 2015 [2 favorites]


Well, if police dogs can learn to bark at black people (presumably from conscious or subconscious feedback from their handlers), it's not so far-fetched.
posted by acb at 9:44 AM on December 22, 2015 [2 favorites]


So if we have an algorithm learning from reading data or human activity that shows a racist bias, and people engage with that bias, the algorithm will learn that confirming this bias is an effective way to get clicks. Then it'll display this sort of thing more often to other people, further entrenching the bias in others. It's both a symptom and a cause at once.

I wonder if this is actually a primary driver behind all racism. Children are pattern-recognition engines, just like machine learning algorithms are.
posted by mantecol at 9:51 AM on December 22, 2015 [4 favorites]


I suspect there's a slight innate tendency to 1) group things and people by obvious visual similarities, and 2) classify people into My People and Other People. That doesn't turn anyone into a rabid racist right off, but if every generation learns a bit from the one before (and they can't help but do so), pretty soon you've got a self sustaining system.
posted by echo target at 10:13 AM on December 22, 2015 [1 favorite]


That full presentation video is just fantastic, and while the topic is chilling, I found it comforting knowing that people like Latanya Sweeney are on this. She is really great.

Superficially, you could say that no, computers are not racist or sexist or ageist or whatever. By the same token, you could say that neither is redlining. Neither is explicitly discriminating based on protected categories, but rather on factors that just so happen to strongly correlate to them. Having machine heuristics do the discrimination adds an extra layer of deniability, though. You could probably prove undeniably that a computer was not making decisions based on those categories, because you can explicitly exclude them, which you can't with humans. The problem is that learning systems learn from us, and they are simply codifying the bigotries inherent in the systems they're learning about.

However, intelligent systems, as far as I'm aware,* still require a fair amount of human oversight and control. Ad networks obviously are pretty dependent on prescriptive controls still, but the trend toward descriptive models has been progressing very rapidly and with almost no regulatory oversight at all.

The systems need explicit legal controls, and very very importantly, they need smart, diverse people overseeing and evaluating those systems and the systems they control. Homogeneous groups of people always miss all kinds of stuff, in both natural and artificial systems, and it's often not intentional. Sometimes, you need someone named Latanya to even notice it in the first place.

* I've done work in narrow AIs doing natural language processing and predictive modeling, but it's been a minute and it was mostly in a heavily regulated industry with a lot of close judicial oversight, so if someone more familiar with the state of the art right now says I'm wrong, they probably win.
posted by ernielundquist at 10:30 AM on December 22, 2015 [3 favorites]
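The redlining analogy above can be made concrete with a tiny sketch. Everything here is invented data: a decision rule that never takes race as an input, but is learned from historically biased outcomes, still splits its decisions exactly along racial lines because another feature (here, a made-up zip code) stands in as a proxy.

```python
from collections import defaultdict

# Invented data: two zip codes with different historical approval
# rates, where zip code happens to correlate perfectly with race.
people = [
    {"zip": "00001", "race": "A", "approved_historically": 1},
    {"zip": "00001", "race": "A", "approved_historically": 1},
    {"zip": "00002", "race": "B", "approved_historically": 0},
    {"zip": "00002", "race": "B", "approved_historically": 0},
]

# A "race-blind" rule learned from history: approve whatever the
# majority outcome was in each zip code. Race is never an input.
history = defaultdict(list)
for p in people:
    history[p["zip"]].append(p["approved_historically"])
rule = {z: round(sum(v) / len(v)) for z, v in history.items()}

# Yet the rule's decisions split exactly on race anyway:
decisions = {p["race"]: rule[p["zip"]] for p in people}
# decisions == {"A": 1, "B": 0}
```

You could audit this rule, prove race is excluded, and it would still reproduce the bias baked into the outcomes it learned from - which is the "extra layer of deniability" point.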


But if everyone used Adblock, that profiling data would be worthless,and there would be much less incentive for corporations to collect it.

When you go to WebMD to look up heart disease symptoms, WebMD tells health insurance companies you're doing so, because capitalism.
posted by sebastienbailard at 11:13 AM on December 22, 2015 [3 favorites]


So online ad networks have large automated algorithmic components, but they're also driven by manually set targeting elements that are definitely super racist. People run online ads for all sorts of offensive stuff, and the crypto-racist stuff like "grills" is just the stuff that skirts the line enough to get by. Having worked in online ads, I can say there's a lot of really offensive stuff that does get removed by policy, and lots of other offensive stuff, like human trafficking, etc.
posted by GuyZero at 11:39 AM on December 22, 2015


Have there been some troublemaking Dustins skewing the statistics to the point that ad-targeting algorithms picked it up?

A white guy named Dustin burglarized my grandma's house. So uh... maybe?
posted by adecusatis at 11:54 AM on December 22, 2015


They discuss ad blocking. Short version: Blocking ads doesn't do shit about the real issues. It just keeps you from seeing ads.

Sweeney has a rare combination of technical and cultural understanding combined with the ability to explain both clearly, and the full talk is one of the best and most comprehensive discussions on these things I've seen so far.
posted by ernielundquist at 11:57 AM on December 22, 2015 [1 favorite]


Just to clarify my comment above - automated algorithms behind the scenes will try to optimize who to show ads to in order to get the best clickthrough rate, but usually the ads have targeting manually set to some group of people. The person selling "grills" probably manually clicked a box somewhere that said "show this ad to black people".

Now, Facebook making crazy guesses about who is and is not black is a whole other issue.
posted by GuyZero at 12:07 PM on December 22, 2015 [2 favorites]


crypto-racist stuff like "grills"

How is this so? You mean like barbecue grills? Teeth grills?
posted by amtho at 12:50 PM on December 22, 2015


One other thought: I'm interested in reading things that I _don't_ agree with, written by people I don't understand very well -- in order to try to understand them better. If other people are doing this (and I hope they are), that could have interesting effects -- potentially troubling effects -- on the massive dataset that we all seem to be creating about ourselves. Inquisitive people of good will could seem to have potentially troubling belief systems, for example.
posted by amtho at 12:52 PM on December 22, 2015 [1 favorite]


stobor: "Facebook's guesses about you: Ad Preferences"

Oh that's fun! Facebook thinks I like horse meat (food) and Sean Hannity. Also ventilation (architecture). And dental floss*.

*Shout out to bswindon!
posted by stet at 12:56 PM on December 22, 2015


Wow, Facebook's interest inference system is even worse than Google's.

Under "Fitness and Wellness" my 3 interests are:

Bike-to-Work Day
Illusion
Crying

It's poetry.
posted by GuyZero at 1:18 PM on December 22, 2015 [6 favorites]


IMO, you have to focus on outcomes. So they're just guessing about your race and your sex and family status and stuff, but the results are just a slightly less accurate version of your demographic profile. It's still real discrimination, though, and we ignore it at our peril.

And it needs to be taken in context with this kind of data collection. It's not just online ads and Facebook trying to guess things about you. There are massive data broker operations that are collecting huge amounts of data on you that you have no knowledge of or control over and making varyingly educated guesses about your personal life, and there are very limited controls over what they can do with this.

They're selling lists of people based on suspected race, age, gender, medical condition, and a whole host of other things, and there's precisely fuck all you can personally do about it. They're tying your online activities to your real world activities, and it's not just inconsequential stuff.

And as she points out in the talk, the internet of things is really going to have to be a tipping point. People now regularly walk around with listening devices in their pockets so that pretty much all of your friends are wearing wires (and I've wondered when cop shows are going to start adapting their storylines to that). Your TVs and your kids' toys are compiling data on you and reporting it back to random electronics manufacturers, who are then allowed to do whatever the hell they want to with it. You can take all the measures you know how yourself, but are you going to pat down anyone who comes to your house and confiscate their phones? Their kids' Barbies? Aw, hell, I have an idea. Pacemakers and insulin pumps are notoriously insecure and modifiable. How easy would it be to just throw a mic and a longer range antenna in there? (Yes, I am available for employment opportunities, and surprisingly affordable. Call me.)

Most people are pretty much overwhelmed by either the technical aspects or just the futility of it, but the fact is that you have very little power over the information that is floating around out there about you. The only effective way to address it is through collective action, oversight, and very tight regulation with lots and lots of teeth.
posted by ernielundquist at 1:32 PM on December 22, 2015 [5 favorites]


I'd think that the ads would be more likely to target people with the deadliest middle name.
posted by Halloween Jack at 2:22 PM on December 22, 2015 [1 favorite]


stobor: "Facebook's guesses about you..."

There's gold in these hills.

For hobbies & activities:
  • Guitar (no)
  • Reason (?)
  • Friday (??)
  • Wheel (??!)
  • Wave (???!)
  • Livestock (?!)
  • Plant (!?)
  • Duck (...)
  • Tertiary sector of the economy (wha?)
posted by flippant at 3:59 PM on December 22, 2015 [1 favorite]


hey everybody, it turns out i'm really into trucks and corporate finance so if you're still doing your last minute xmas shopping there you go
posted by cortex at 4:35 PM on December 22, 2015 [6 favorites]


Do androids dream of Martin Luther King?
posted by thetruthisjustalie at 4:54 PM on December 22, 2015


So the way AdWords works is kinda complicated, and I'm not surprised about the results. AdWords allows buyers to bid for clicks on ads presented for a given search, and presents ads (impressions) to searchers based on click-through rates. There's plenty of room here for the algorithm to reflect society's racism, both overt and unconscious. I find it interesting to reflect on how that might be, so that we can better improve the outcomes.

Firstly, the buyers, and their ads. They could be racist by only bidding for ads on black names. The CEO of arrestrecords.com could straight up tell a programmer to put in a bid for every black name combination there might ever be. But they could also form a sort of floor for names, bidding the same on some set of names, and simply lose to higher bidders in ways that break down along racial lines. For example, a luxury brand attempting to market itself to the upper quartile of income is also a racially discriminatory filter. In the context of names, you might imagine Ancestry.com bidding up names in their database, which primarily comes from their affiliation with LDS and would also skew white. So their white userbase would mostly attract more white users.

There's a second dimension, however. Unlike impression-based advertising, Google sells clicks. If you bid high and nobody clicks, your ad gets taken offline. It could be that click-through rates for white-name arrest records are lower. So if you search a name, and click through differently based on race, that'll be fed back into the system to show that ad more often. Reasons you may respond differently may be only indirectly related. For example, you might recognize Justin Tompson as a common name. But Laquesha Wyman? If you and the society you live in don't mix, you might not realize how common or rare the name is.

Those two dimensions determine how Google optimizes its revenues. There's a third dimension, which is how profitable a visitor is. If black names are more profitable, buyers will be willing to outbid competition. This depends on how these places make their money. As I understand it, they're a legal blackmail factory. If you have access to more expensive, effective means of fighting records, you may be less willing to pay their blood money to remove your name / photos.

The best way to really know what's going on among these possibilities is to have more market data. If we knew who bid on Prof. Sweeney's name, what other keywords they bought, and who else was bidding on her name, remedies would be easier to come by. And Google would be a hell of a lot less profitable.
posted by pwnguin at 6:41 PM on December 22, 2015 [2 favorites]
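pwnguin's second dimension can be seen in a heavily simplified version of the bid-times-predicted-CTR ranking that (roughly) determines which ad gets shown. The advertisers, bids, and CTR estimates below are all invented; the point is only that with identical bids, learned click behavior alone decides the winner.

```python
# Heavily simplified ad auction: rank = bid * predicted CTR,
# a rough approximation of how search-ad ranking works.
# All advertisers, bids, and CTR estimates here are invented.

def ad_rank(bid, predicted_ctr):
    return bid * predicted_ctr

# Both advertisers bid the same amount; only the CTR the system has
# *learned* differs, because past searchers clicked the arrest-record
# ad more often when it appeared next to one group of names.
candidates = {
    "genealogy_ad": ad_rank(bid=1.00, predicted_ctr=0.030),
    "arrest_ad":    ad_rank(bid=1.00, predicted_ctr=0.045),
}
winner = max(candidates, key=candidates.get)
# winner == "arrest_ad": inherited click behavior, not bids, decided it.
```

Which is why, as the comment says, you'd need the bid data to separate "racist buyers" from "racist click feedback" - the ranking output alone looks identical either way.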


the crypto-racist stuff like "grills" is just the stuff that skirts the line

What percentage of the target market for “grills” is African-American? If it's particularly high, is targeting ads for “grills” at black people more like targeting ads for, say, hair products specific to African-American people at black people, or more like targeting ads for Kid Rock albums at white people?
posted by acb at 2:51 AM on December 23, 2015




This thread has been archived and is closed to new comments