Is Siri, the new iPhone 4S voice recognition software, tone deaf?
November 29, 2011 11:18 PM   Subscribe

Siri Can't or Won't Search for Certain Things. Is this on purpose? You decide
posted by Splunge (293 comments total) 20 users marked this as a favorite
 
I just Googled "Open the pod..." and Google read my mind. Try it.
posted by weapons-grade pandemonium at 11:36 PM on November 29, 2011 [2 favorites]


This is a remarkably oblique title and summary (even, perhaps, a bit too much so?).

It's weird, because either they had to design in all the other medical references (like they did with all of the jokes, certainly), or they had to design out this one. I'm not sure which one is more likely.
posted by Han Tzu at 11:39 PM on November 29, 2011 [4 favorites]


This is pretty sad.

Reminds me of Apple's censorship of porn apps in their App Store, though this is obviously much worse.

Kind of ridiculous that we are going to need to litigate companies into behaving properly in these matters. This is effectively censorship of free speech, but at a corporate rather than government level.

Also, I wonder where this leaves all the post-feminists plodding away on MacBooks at my local coffee shop.
posted by sourbrew at 11:41 PM on November 29, 2011 [3 favorites]


Han Tzu,

I would wager the second. Apple probably rightly assumes that blowback from women's reproductive rights organizations will be less than that from family-values voters worrying over their tween getting factual advice about reproductive options.
posted by sourbrew at 11:42 PM on November 29, 2011 [18 favorites]


Maybe they got worried someone would ask Siri about abortion services and then use that info to go harass/attack providers. (I sort of doubt it).
posted by nat at 11:45 PM on November 29, 2011


Gah. Yeah, this looks to be intentionally blocked, which is awful. I imagine the "Really!" response to "I was raped" was a very, very unfortunate accident, however.

This is messed up.
posted by Navelgazer at 11:45 PM on November 29, 2011 [6 favorites]


I haven't made a lot of FPPs. I was trying to be as objective as I could, considering the subject.
posted by Splunge at 11:47 PM on November 29, 2011 [3 favorites]


This is effectively censorship of free speech, but at a corporate rather than government level.

It's a phone. They can do what they want with it. You don't have to buy it. It's not a free speech issue.
posted by jimmythefish at 11:52 PM on November 29, 2011 [32 favorites]


I wonder if the late Mr. Perfect Magic Genius had a hand in this political decision?
posted by univac at 11:52 PM on November 29, 2011 [12 favorites]


Maybe they got worried someone would ask Siri about abortion services and then use that info to go harass/attack providers. (I sort of doubt it).

Actually, I think this is probably not all that far off. I mean, how many people are relying on Siri for an abortion? Hey Siri, I need to get some wine and oh how about an abortion place on my way home?

You can get Google data from services on the phone - it's not a comprehensive blockage of information. I think they may have simply chosen to stay out of that one. A lot of clinics don't like to gather attention and it may have been a nod to that.
posted by jimmythefish at 11:59 PM on November 29, 2011


Taking a note from Google, Apple has stated that SIRI is in Beta.
posted by mrzarquon at 11:59 PM on November 29, 2011 [1 favorite]


There's another possibility: remember that iOS devices are getting used a lot in the medical field these days. And I don't know if y'all have noticed, but there are a heck of a lot of gag rules and laws regulating exactly what medical providers can and can't say to women seeking abortions...
posted by ubernostrum at 12:01 AM on November 30, 2011 [7 favorites]


I want to say I hope they fix this. Unfortunately, "fix" implies that they know it's a problem. It sure seems intentional, and it kind of disgusts me. Shame, apple. Shame.
posted by Weeping_angel at 12:01 AM on November 30, 2011 [1 favorite]


(and that's without getting into the mess of what happens when an iPhone is given to a girl under the age of 18, and even more laws come to bear)
posted by ubernostrum at 12:02 AM on November 30, 2011


Siri is in beta? Hasn't it been marketed quite heavily already? (I'm not in the US, but I've seen some TV spots on YouTube.)
posted by Harald74 at 12:04 AM on November 30, 2011 [1 favorite]


I suspect this is a mistake. I hate Apple as much as the next right-minded person, but I've no doubt that they're not deliberately withholding resources from rape victims. File a bug report, and wait for the update.
posted by seanyboy at 12:06 AM on November 30, 2011 [4 favorites]


"I was raped."
"Really!"

Wow. That made me pretty sick.

If it were just abortion services, I would be more willing to give them the benefit of the doubt. But it's also questions about any birth control at all, plus this.
posted by louche mustachio at 12:10 AM on November 30, 2011 [4 favorites]


The rather ugly-in-context response to "I was raped" ("Really!" "Is that so?") is one of the reasons why cutesy canned responses are a bad idea for this kind of software. This isn't Zork -- the response to this should be the same as the response to any other question the software doesn't understand... which should be "Sorry, I didn't understand". Is there really any use-case in which customers are delighted to receive snarky back-talk from their phone?

About the "family values" thing: it's worth noting that Siri will gladly help you find Viagra, dildos, strip clubs, or an escort service, among other things (ask about marijuana and it answers with head shop locations!) This does seem to be a pretty specific problem -- it won't even tell you where to get birth control. As in, an acceptable answer would be Walgreens.
posted by vorfeed at 12:11 AM on November 30, 2011 [57 favorites]
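
To make vorfeed's design point concrete, here is a minimal Python sketch. It is purely illustrative: the function names and canned replies are invented, and nobody outside Apple knows how Siri's fallback actually works. It just shows how a random canned-quip fallback produces responses like "Really!" to "I was raped", versus the single neutral fallback vorfeed is arguing for.

```python
import random

# Hypothetical canned replies for statements the assistant can't parse.
SNARKY_FALLBACKS = ["Really!", "Is that so?", "OK."]

def handle(utterance: str) -> str:
    """Placeholder for the real intent-handling logic (not sketched here)."""
    return "..."

def respond_snarky(utterance: str, understood: bool) -> str:
    """Mimics the 'cutesy' behavior: picks a random quip when parsing fails."""
    if understood:
        return handle(utterance)
    return random.choice(SNARKY_FALLBACKS)   # "I was raped." -> possibly "Really!"

def respond_neutral(utterance: str, understood: bool) -> str:
    """The alternative vorfeed argues for: one neutral, non-committal fallback."""
    if understood:
        return handle(utterance)
    return "Sorry, I didn't understand."

if __name__ == "__main__":
    print(respond_snarky("I was raped.", understood=False))   # may print "Really!"
    print(respond_neutral("I was raped.", understood=False))  # always the apology
```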


who cares if apple controls what's on iOS? it's much better than everything else.

meta: metafilter won't recognize my html ending sarcasm tag
posted by cupcake1337 at 12:12 AM on November 30, 2011 [3 favorites]


I posted this in the Carrier IQ thread when someone brought this up:
That's actually a really serious issue with these smartphone AIs. With unintelligent tools there isn't as much of a 'moral' dimension; they do exactly what you ask. But with smart agents, especially ones hosted in central 'clouds', it's more problematic. What happens if the phone detects you might be trying to commit a crime? And what about drug-related queries?

Politically sensitive queries are a whole other issue; abortion is probably an oversight. Suppose someone looks for prostitution in Nevada, what should Siri say? Should an intelligent agent simply try to help you accomplish whatever you're trying to do to the best of its ability, or steer your behavior to socially productive ends?
Anyway, I didn't think this was intentional; it seems like an oversight. One problem with this is that when you're generating new responses you have no way of knowing if the AI is going to generate anything offensive. You can't cover everything, and you can't expect it to have human-level understanding of context.

Looking at the screenshots, though, it seems like abortion providers are simply not included in the database, which is weird...
posted by delmoi at 12:18 AM on November 30, 2011 [5 favorites]


after a full read, it's pretty damning. it would take a long explanation from apple to convince me that this kind of functionality, searching for abortion, birth control, morning-after pill, rape resources, wasn't explicitly coded to be ignored.
posted by cupcake1337 at 12:19 AM on November 30, 2011 [26 favorites]


Maybe they got worried someone would ask Siri about abortion services and then use that info to go harass/attack providers. (I sort of doubt it).

They'd be crazy enough to bomb a clinic, but lack the wherewithal to type a search term into Google by hand? Seems a bit ludicrous.
posted by His thoughts were red thoughts at 12:19 AM on November 30, 2011 [5 favorites]


"It's a phone. They can do what they want with it. You don't have to buy it. It's not a free speech issue."

That's not quite true. It is their platform, but when you have a user base the size of theirs, they are censoring speech.

Sure, people have bought into that censorship, but that doesn't make it any less chilling. We had to fight governments for that right, and now it seems we will be fighting companies.
posted by sourbrew at 12:20 AM on November 30, 2011 [18 favorites]


This is pretty sickening.
posted by maxwelton at 12:25 AM on November 30, 2011 [1 favorite]


after a full read, it's pretty damning. it would take a long explanation from apple to convince me that this kind of functionality, searching for abortion, birth control, morning-after pill, rape resources, wasn't explicitly coded to be ignored.

Incidentally, it works with "define abortion", or "google abortion".
posted by His thoughts were red thoughts at 12:25 AM on November 30, 2011 [1 favorite]


I knew it.
posted by Blazecock Pileon at 12:26 AM on November 30, 2011 [3 favorites]


6. What to do if a hamster is caught in your rectum: in D.C., she’ll direct you to Charming Cherries Escort Service.
Heh....
posted by delmoi at 12:30 AM on November 30, 2011


That's not quite true. It is their platform, but when you have a user base the size of theirs, they are censoring speech.

The iPhone has less than a 20% share of the smartphone market.
posted by brightghost at 12:33 AM on November 30, 2011 [2 favorites]


"...it would take a long explanation from apple to convince me that this kind of functionality, searching for abortion, birth control, morning after pill, rape resources, wasn't explicitly coded to ignore."

Sure seems so. Siri isn't pre-loaded with extensive databases on everything like Wolfram Alpha. For queries like street addresses it must be searching public sources, so if it can't find abortion clinics those results must be blocked somewhere in the cloud before it replies.
posted by Kevin Street at 12:34 AM on November 30, 2011 [1 favorite]


This looks intentional. And if it is, good. We need to make Americans aware that it can indeed happen here. With global corporations, it doesn't matter what country you're in, information filtering can take place anywhere.

Should an intelligent agent simply try to help you accomplish whatever you're trying to do to the best of its ability, or steer your behavior to socially productive ends?

This is a really astute observation. We're going to need to deal with a new future where information technology censorship is more than just an on/off switch. Subtle incentives can be designed into these systems, in some cases using all the power of natural language, to encourage or discourage behaviors the designers want.
posted by formless at 12:36 AM on November 30, 2011 [5 favorites]


If I tell Siri "I need IVF", it replies "Sorry, I couldn't find any infertility specialists," so it won't help women get pregnant either. It obviously understands the query.

Oddly if I say "I need a condom", Siri does provide a list of drug stores.

I wonder if this is deliberate censorship or just an error. From my time working on web search, I remember that in every case where people were angry about what looked like deliberate bias in search results, it turned out to be the result of a bug or bad data in the system.
I can't really tell if Siri's database of female reproductive health services was deliberately or accidentally deleted, or in fact never existed because they haven't got around to importing the data yet.
posted by w0mbat at 12:36 AM on November 30, 2011


Siri uses Yelp for its location data.

Yelp doesn't know of any abortion clinics near Pittsburgh.

If you try changing the location, Yelp gives results in New York, for example.

This isn't some Apple pro-life conspiracy. The data just isn't on Yelp.
posted by amuseDetachment at 12:36 AM on November 30, 2011 [48 favorites]


"The iPhone has less than a 20% share of the smartphone market."

How many millions of users is that, and for how many of those millions of users is the iPhone their primary computer?
posted by sourbrew at 12:36 AM on November 30, 2011 [2 favorites]


#amazonfail part deux
posted by Blazecock Pileon at 12:39 AM on November 30, 2011


Interesting find, amuseDetachment! It suggests that this sort of censorship extends beyond just Apple.
posted by Kevin Street at 12:41 AM on November 30, 2011 [2 favorites]


Siri uses Yelp for its location data.

Yelp doesn't know of any abortion clinics near Pittsburgh.


If you search for "abortion" alone Yelp finds Planned Parenthood and Allegheny Reproductive, among others. Both of which were explicitly searched-for in the examples, including by street address.

Of course, Yelp also finds a bar and grill and a donut shop, but what can you do?
posted by vorfeed at 12:43 AM on November 30, 2011 [11 favorites]


So it's not Yelp. Time to stop posting for the night.
posted by Kevin Street at 12:48 AM on November 30, 2011 [1 favorite]


vorfeed: Interesting, it looks like Yelp has two categories for abortion. If you type "abor" in search there's autocomplete for "abortion clinic" and "abortion". It's possible that Siri defaults to the closest match, which is "abortion clinic". This is a common problem when you're doing machine learning with n-gram categorization (via k-means clustering or what have you): deciding whether a 1-gram and a 2-gram are differentiated enough to deserve separate categories. I'd still wager that it's probably Yelp dropping the ball here, rather than Apple.
posted by amuseDetachment at 12:48 AM on November 30, 2011 [3 favorites]
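
amuseDetachment's closest-match theory can be sketched in a few lines. This is speculation dressed up as code -- the category list, listings, and similarity scoring below are all invented -- but it shows how a query like "abortion" could get snapped to a narrow "abortion clinic" category that happens to be empty, even when related listings exist under other categories.

```python
from difflib import SequenceMatcher

# Invented category names; Yelp's real taxonomy isn't public in this form.
CATEGORIES = ["abortion clinic", "obstetricians & gynecologists", "pharmacy"]

# Invented index: category -> listings known near some location. The narrow
# "abortion clinic" category is empty even though related listings exist.
LISTINGS = {
    "abortion clinic": [],
    "obstetricians & gynecologists": ["Planned Parenthood", "Allegheny Reproductive"],
    "pharmacy": ["Walgreens"],
}

def snap_to_category(query: str) -> str:
    """Return the category whose name is most similar to the query string."""
    return max(CATEGORIES, key=lambda c: SequenceMatcher(None, query, c).ratio())

def search(query: str) -> list:
    return LISTINGS.get(snap_to_category(query), [])

if __name__ == "__main__":
    print(snap_to_category("abortion"))  # "abortion clinic" -- the closest match
    print(search("abortion"))            # [] -> "I don't see any abortion clinics"
```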


I was under the impression it used Wolfram Alpha also.

Just ran a search on "[my city] [my state] abortion services" and got nothing but linguistic data, including the "OMFG I'm pregnant, what do I do? Oh, look at that! Services is a 13 score in both International AND American English Scrabble!"

I think people need to remember that Siri is just a front end to data mining, not the data miner itself.
posted by Samizdata at 12:48 AM on November 30, 2011 [1 favorite]


I don't pretend to know much about this, but Yelp has a category for Obstetricians and Gynecologists with tons of listings in Pittsburgh. Maybe "Abortion Clinic" is not what abortion clinics call themselves and therefore does not qualify as an appropriate search term.

Also, I'm a little hard-pressed to believe that Siri would be conceived as filtering politically sensitive information from the get-go, but maybe I'm just being naive. Apple has never struck me as representing or protecting ideologically conservative ideals. This curious misstep would be a first. Big picture, I suppose this type of thing is inevitable.

I'm really convinced we are in the twilight of the golden age of free information and things are going to be in the shitter in a matter of a few years.
posted by phaedon at 12:48 AM on November 30, 2011 [2 favorites]


Also, you can tell it's actually doing a Yelp search for "abortion clinics" based on this picture.
posted by amuseDetachment at 12:50 AM on November 30, 2011 [1 favorite]


I was under the impression that Apple's customer base skewed to the left politically. I'm surprised they wouldn't catch this ahead of time. I'd accept the Yelp explanation more if Siri didn't blow off issues of rape and report only anti-abortion clinics.

Perhaps we'll see a heartfelt mea culpa with no excuses and quick action. That would be the most convincing argument for "Whoa. BIG oversight here" being the case. After that Android post below, it would be a welcome reassurance to think not all smartphones are harbingers of evil.
posted by Saydur at 12:51 AM on November 30, 2011


Anyway, I didn't think this was intentional; it seems like an oversight.

I honestly see no possible way for this to have been an oversight, if only because it's so all-encompassing with regard to women's health. Birth control, abortion, rape... And I don't think that it's coincidental that those are hot-button issues for a lot of people. Note that many of those queries don't even have the (apparently default?) option to go to a web search when Siri can't find anything--someone had to set that behavior.
posted by MeghanC at 12:51 AM on November 30, 2011 [5 favorites]


(Actually it'd probably be using Latent Semantic Indexing, but this is a VERY VERY common problem in Information Science and Statistics. Sorry about the many posts.)
posted by amuseDetachment at 12:53 AM on November 30, 2011


Is this on purpose?

When you ask this rhetorical question, also ask yourself: Does it genuinely make rational sense to believe Apple wants to keep women impregnated? What is in the makeup of their staff or in their history as a decades-long company that would lead observers to immediately make the leap to assume that they believe women are to be treated as chattel? Does that assumption make logical sense, in light of known facts?
posted by Blazecock Pileon at 12:55 AM on November 30, 2011 [13 favorites]


As an aside, I find it fascinating that there may have been so much concern about what Siri might "tell" people... after all, there's little concern about what Google or Yelp or the phone book might "tell" people. This is exactly what freaked Joseph Weizenbaum out when he created ELIZA -- write a computer program that acts like a computer program and nobody cares, but slap a crappy natural-language parser on that puppy and all of a sudden people start asking it if it wants to elope and/or worrying over what it might "say" to their daughters.

It's interesting, to say the least.

on preview: if this is just a search-term problem, then why can't it find Allegheny Reproductive Health Center or American Women's Services? Both come up immediately when I search using Yelp.
posted by vorfeed at 12:56 AM on November 30, 2011 [17 favorites]


Is there a reason why all these abortion seekers can't just switch out of Siri and do a manual Google search?

Or has Steve Jobs doctored the iPhone to disallow such searches from Google as well?
posted by PeterMcDermott at 12:59 AM on November 30, 2011 [1 favorite]


Is there a reason why all these Viagra seekers can't do the same?

Oh, right. Because they don't have to.
posted by vorfeed at 1:00 AM on November 30, 2011 [31 favorites]


I'm leaning toward Ubernostrum's suggestion -- it seems deliberate, and it is more likely for possible legal reasons than because Apple "wants to keep women impregnated". I will be very interested to see if Siri learns anything about women's health over the next few days.
posted by obloquy at 1:05 AM on November 30, 2011 [1 favorite]


Here's why this is interesting. There's no way on Yelp to tag yourself as an abortion clinic (the closest is OBGYN). This issue will crop up whenever you're searching for something that is a semantic category (which is matched via machine learning) and not an explicit category (which is explicitly tagged by the establishment/community).

For example, if you're a restaurant that specializes in a type of food that doesn't have a category, you'd have a competitive advantage if you can figure out how to show up in the search listings. Also, Siri and Yelp have a specific way of ordering recommendations when on a mobile phone with location data. I wouldn't be surprised if some SEO outfits are reverse-engineering Yelp results so that they show up well when searched in Siri.

This issue will crop up everywhere. For example, if you type "chinese" in search it autocompletes to "chinese food"; if you optimize keywords for "chinese food" over "chinese cuisine", it's hypothetically possible that one will deliver far better results on Siri. The original poster may have unintentionally discovered an important point about Siri SEO (which some asshole marketer will then start selling/scamming to unsuspecting establishments, oh jesus god).
posted by amuseDetachment at 1:08 AM on November 30, 2011 [2 favorites]


Does it genuinely make rational sense to believe Apple wants to keep women impregnated?

No. But it does make rational sense to believe that Apple considered the pros and cons of including this information in Siri, and they decided that, for whatever reason, providing people with accurate healthcare information wasn't in their best interests.

My assumption is that the tossup was between accurate information regarding women's health and the inevitable outcry from Moral-Majority types who're outraged-just-outraged that their phones are "promoting" things with which they disagree.

Really, this doesn't seem dissimilar from Apple's ban on "adult content" in the appstore. They felt that it ran counter to their image, and I would be entirely unshocked if they felt that providing information about legal but controversial medical procedures was also counter to the image that they want to project. It's both fascinating and depressing to me that we as a society seem to put information about gynecological health on par with (and as potentially taboo as) porn, while queries about Viagra and pot are perfectly acceptable.
posted by MeghanC at 1:15 AM on November 30, 2011 [16 favorites]


Apple bought Siri; it could have been that this information was pruned before the purchase.
posted by delmoi at 1:17 AM on November 30, 2011


on preview: if this is just a search-term problem, then why can't it find Allegheny Reproductive Health Center or American Women's Services? Both come up immediately when I search using Yelp.
I'm just guessing here, but if I designed Siri I wouldn't include those results either. Those results don't have any reviews. Reviews = legitimate results (Many Eyeballs Theory: if people have looked at the page enough to write reviews, they'd correct the address / phone number if it was wrong). Getting wrong results from stale data sounds like it'd be a serious problem, and without reviews to certify that the page has been looked at by people who care, I wouldn't trust the address either. Again, this is speculation, I don't have an iPhone 4S to test this, but it seems blindingly obvious if you've designed these kinds of systems before.

Why don't I see a screenshot in the original article for a search for "Where is Planned Parenthood?" At least there's one review and seems like the most obvious choice...

Machine learning is messy, guys. These are edge cases; think how accurate your spam filter is. It's fairly accurate, but the designer has to balance false positives against false negatives, and false positives are a LOT WORSE than false negatives, which is why you have spam filters that tend to let in more spam than they should. Getting directions to a nonexistent address is a serious false positive.
posted by amuseDetachment at 1:18 AM on November 30, 2011 [6 favorites]
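
The review-count heuristic amuseDetachment describes is easy to illustrate. The filter below is only a guess at the kind of rule involved (nothing here is confirmed by Apple or Yelp, and the data is invented), but it shows how a rule tuned to avoid false positives -- wrong or stale addresses -- silently empties out categories where legitimate listings happen to have zero reviews.

```python
from dataclasses import dataclass

@dataclass
class Listing:
    name: str
    review_count: int

# Invented data loosely mirroring the thread: the reproductive-health listings
# exist but have no reviews, so a "must have reviews" filter drops all of them.
RESULTS = {
    "chinese food": [Listing("Golden Dragon", 120), Listing("Hunan Palace", 45)],
    "abortion": [Listing("Allegheny Reproductive Health Center", 0),
                 Listing("American Women's Services", 0)],
}

def trusted(listings, min_reviews=1):
    """Keep only listings with enough reviews to count as verified.

    Raising min_reviews reduces false positives (stale or wrong addresses)
    at the cost of false negatives (real places nobody has reviewed yet).
    """
    return [l for l in listings if l.review_count >= min_reviews]

if __name__ == "__main__":
    print([l.name for l in trusted(RESULTS["chinese food"])])  # both survive
    print([l.name for l in trusted(RESULTS["abortion"])])      # [] -- "no results"
```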


It certainly is an unequal usability issue: Siri will tell you how to get a condom but not how women can get birth control. Assuming the best-case scenario, that this is an oversight and not intentional... OK. But in limited cases such an oversight can have consequences. Woman: Siri, I need birth control. Siri: unable to find. The woman assumes that there is no location to obtain what she needs in the area, and so may resort to less assured forms of birth control. In a perfect world she will seek out multiple resources (or make better choices about birth control in general), but if you are told by a source you already trust that it isn't there...
Yes, bad assumption, don't trust all sources... but you know, if people use a technology long enough and rely upon it, plenty of them will assume what they are being told is truthful or accurate (see sourcing Wikipedia in papers). If Google consistently tells you it cannot find what you are searching for, do you a) assume it is wrong and search via Bing (or whatever), or b) shrug your shoulders and assume there is nothing out there? Some will use alternate sources, but plenty of people will just assume there is nothing available on that topic and move on.

Apple has a rather skittish relationship with pornography, and so has at least shown it is open to the notion of a certain social restrictiveness. It well might not be 'oh, let's keep women pregnant' but more 'it is not our responsibility to tell them how to end the pregnancy'.

I hope it is just an oversight and they fix it, but it does seem rather broad to just be a simple oops. I guess we'll see how they handle it.

btw: UPDATE: Reader Kristen asked Siri “Why are you anti-abortion?” and she answered “I just am, Kristen.”
posted by edgeways at 1:19 AM on November 30, 2011 [1 favorite]


Just a couple of extra data points: in 2010 Apple pulled an anti-abortion, anti-gay app from its store.

And unless I'm missing something, I just wanted to mention that there are songs, lectures, books, audiobooks and movies on or about abortion and pro-choice matters in the iTunes Store.
posted by phaedon at 1:21 AM on November 30, 2011 [3 favorites]


Oooooohhhhh.

I see now.

I get it now.

Apple needs to breed more potential customers.
posted by Samizdata at 1:22 AM on November 30, 2011 [2 favorites]


btw: UPDATE: Reader Kristen asked Siri “Why are you anti-abortion?” and she answered “I just am, Kristen.”
Siri gives answers via expert-system rules and assumes that what you're saying is factually true when the question is worded in that fashion. I think if you keep asking Siri "Why are you a monkey?" it will answer "I just am" as well. Kristen is being willfully obtuse if she's used Siri for longer than a day. If you've visited Shit That Siri Says, it's fairly evident as well.

Again, until I see these results from someone in a place like New York or San Francisco (with heavy Yelp users and reviewers), I'm adamant that this is a data/machine-learning issue when it comes to Yelp.
posted by amuseDetachment at 1:26 AM on November 30, 2011 [4 favorites]
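
For readers wondering how "I just am, Kristen" can come out of a system with no opinions at all, here is a toy version of the kind of template rule amuseDetachment is describing. The regex and replies are invented stand-ins, not Siri's actual rules; the point is that any "Why are you X?" question triggers the same non-answer, whatever X is.

```python
import re
from typing import Optional

def canned_reply(utterance: str, user_name: str = "Kristen") -> Optional[str]:
    """Answer a few question shapes with fixed templates, regardless of content."""
    # Generic rule: any "Why are you X?" gets the same non-answer.
    if re.match(r"(?i)why are you .+\?$", utterance.strip()):
        return f"I just am, {user_name}."
    # Generic rule: an unparsed declarative statement gets a stock acknowledgement.
    if utterance.strip().endswith("."):
        return "Is that so?"
    return None  # fall through to real intent handling

if __name__ == "__main__":
    print(canned_reply("Why are you anti-abortion?"))  # I just am, Kristen.
    print(canned_reply("Why are you pro-abortion?"))   # I just am, Kristen. (same rule)
    print(canned_reply("Why are you a monkey?"))       # I just am, Kristen.
```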


er... I realize Siri is not a real person and that the response is gamed; I just thought it was a funny automatic response in light of the whole debate.



sorry, I guess I forgot to add the obligatory "lol" or whatever shit needs to be added nowadays.
posted by edgeways at 1:32 AM on November 30, 2011


edgeways: Haha yeah, sorry, I couldn't tell; it's like a meta context-understanding failure, hurrrrrrr.

In any case, you know we're nearing science fiction territory when this stuff works like magic and when it doesn't work, you blame humans instead of the machine.
posted by amuseDetachment at 1:37 AM on November 30, 2011 [1 favorite]


My gut feeling is that on any issue of poor responses relating to difficult machine learning tasks, it is so much more likely the result of a combination of the algorithms, the data mining, and the training data than of any intentional direction. For instance, Siri is searching for "birth control clinics" in Pittsburgh in response to "where can I get birth control?" Yelp gives no responses for this search, but it's also clearly doing the wrong language processing, not realizing that "birth control" here is short for the pills and that it should be searching for pharmacies. And, for example, it does return (horrible, worse than useless) results for abortion clinics around New York City. Hard-coding location-specific bad results is a lot trickier than mining data from a publicly editable location service like Yelp that is easily gamed by the loud anti-choice crowd.

I think that amuseDetachment's point about false positives and false negatives is a good one here. Siri was designed to find strong examples of common things, like chinese restaurants or electronics shops. Pruning barebones Yelp entries like the one for Allegheny Reproductive Health Center, which has no category, no picture, and no reviews, would be a GREAT choice if we were searching for those other things. However, when you're looking for one of a small number of rare places that are not well-categorized in your data set, it doesn't function well at all. Of course, I also firmly believe that the data quality is so poor on these sites in no small part because of the zealous anti-choice folk who would effectively vandalize that page, and because Yelp is probably not proactive at developing that content.

So in this case, I think the problem is a combination of poorly annotated Yelp data (in no small part a social issue, and also because public Yelp reviews of women's health clinics open them up to assholes, given the state of the internet), choices made by the Siri developers about screening poorly annotated data to avoid disreputable Yelp entries, and a lack of topic-specific training data, and it isn't helped by this being a hard category for natural language processing to make inferences in without sexual-abuse-specific data annotation. Successfully screening whole subjects like women's health in a system like Siri is much harder than just having them not work well in the first place.
posted by Schismatic at 1:38 AM on November 30, 2011 [9 favorites]


I can see what you are all saying but I am still stuck digesting that there are "anti-choice fake clinics".
posted by CautionToTheWind at 1:48 AM on November 30, 2011 [2 favorites]


No. But it does make rational sense to believe that Apple considered the pros and cons of including this information in Siri, and they decided that, for whatever reason, providing people with accurate healthcare information wasn't in their best interests.

It's certainly rational that Apple carefully considered this decision and built it into the software, but as far as I'm concerned, it's far more plausible that this came about without any deliberate consideration on Apple/Siri's part. Building intelligent voice-controlled agents is really really freaking hard. When you're releasing the impossible as mass-market tech, you're going to be so swamped with potential cases you don't handle well that it's incredibly hard to attribute this to malice without real evidence to that effect. Complex systems behave in unexpected ways, and it's only because Siri is seemingly so smart that we begin to think its flaws must be intentional.

Take Microsoft, for example, which has even more minions on hand than Apple to address these kinds of issues. And yet Microsoft has offended pretty much everyone with gaffes: marking Kashmir as non-Indian on maps in Windows 95 sold in India; using chanting from the Quran as the soundtrack to a video game; and apparently "asking users to select their gender between "not specified," "male" or "bitch," because of an unfortunate error in translation" (not convinced I believe this one, but there are regional slang issues with Latin American Spanish).

As for Siri, it's hard to think that abortion was deliberately blacklisted when we're talking about software that interprets "Call me an ambulance" with hilarious results. (It thinks that you're trying to assign yourself a nickname, and so it annotates your address book entry to indicate that Siri should hereafter refer to you as "An Ambulance.")
posted by zachlipton at 1:50 AM on November 30, 2011 [5 favorites]


Now that I read the article, there's everything you need to know about Apple right there. Oh, and they are patent trolls.

This would never happen with Free software.
posted by CautionToTheWind at 1:55 AM on November 30, 2011


Really, CautionToTheWind? Google "crisis pregnancy center"--there are a lot of them, and they're often indistinguishable from places that provide abortion services...until you get inside and realize that the whole place is set up to talk you out of getting an abortion, or even out of getting accurate information about abortion.
posted by MeghanC at 1:55 AM on November 30, 2011


Note: I have not used Siri. (Why? BECAUSE APPLE CHOSE TO DENY IT TO IPAD USERS. NOT THAT I'M BITTER.) The following is a guess based on how I suspect it works. Please calibrate your reactions accordingly.

I don't quite know what to think about this, but my impulse is to agree with Blazecock Pileon here: this is probably random, or at least inadvertent. There's probably a large number of potentially problematic Siri requests one could make, they'd all have to be special-cased, and it's difficult to catch them all.

On a deeper level, the real problem here is that, by taking vocal requests and responding with speech, Apple encourages the user to think of Siri as a thinking entity when really it's just another kind of search engine, a misconception that Apple takes advantage of by putting some randomization into the response set. Usually this would be a harmless bit of confusion, along the same lines as those people who years ago thought Eliza was really listening to them. And the randomness of vocal analysis injects its own level of uncertainty into the process.

So, when you say something to it like "I have been raped," if they haven't special-cased that particular query, it's possible for it to respond with "Really!" That's a somewhat flippant response that would be okay most of the time, since most people won't use Siri for things like this. It is rather startling when you hear a flippant response when someone does make that kind of query, though.
posted by JHarris at 1:58 AM on November 30, 2011 [2 favorites]


As far as I understand them, the American and "British" versions of Siri differ quite a lot, don't they (IIRC one has a female, the other a male voice)? Has anybody tried this on the British version yet?

As a software tester I would be inclined to give Apple the benefit of the doubt as well and think this is just a search/interpretation/software artifact if not for a) abortion/reproductive rights being such a "controversial" subject and b) the consistency with which this error seems to be made on these particular subjects and not happen elsewhere.
posted by MartinWisse at 2:00 AM on November 30, 2011


I can see what you are all saying but I am still stuck digesting that there are "anti-choice fake clinics".

Sadly, this has been going on for decades. The New York Times has done some pretty good stories on the subject (as the legislature was trying to restore some basic sanity to the advertising). Here's one from 1987. Also this more recent article and this personal experience.

I'm certainly all for people getting information from a range of sources and letting everyone make their decisions in their own way, but there's something awfully despicable about the mindset of a person who intentionally tricks pregnant women seeking counseling so they can steer them in a certain direction. To do something like that, you'd have to think of your "clients" as so gullible and stupid that you can swoop in first and ensure that they only "choose" from the options you present to them.
posted by zachlipton at 2:07 AM on November 30, 2011 [1 favorite]


Wait, when did people stop operating under the knowledge, mentioned earlier in the thread, that Siri is only the front-end to information that is, itself, not available in the specific form requested for whatever reason? e.g. Yelp doesn't have an "abortion clinic" category that is well populated, apparently, partially because it seems that OB/GYNs tend to be loath to describe themselves as such?

Of course, as always, it's so much easier and more exciting to say "ZOMG APPLE NOOOO" instead of discovering the nuanced truth of the situation "zomg? apple product uses another thing, which is less useful for this i guess"
posted by DoctorFedora at 2:13 AM on November 30, 2011 [2 favorites]


Has anybody tried this on the British version yet?

"Sorry [user], I can only look for businesses in the United States, and when you're using U.S. English." (That functionality coming next year, apparently.)
posted by Catseye at 2:20 AM on November 30, 2011


Seriously though, while it's kind of cool/cute that Apple has integrated software into their new iPhone that people seem to actually expect to be omniscient, does every single occasion that an Apple product turns out to be fallible in some way REALLY HAVE TO turn into a contest to see who can scream and shit their pants the loudest and fastest?
posted by DoctorFedora at 2:25 AM on November 30, 2011 [3 favorites]


I'm impressed with it understanding the words it did. like allegheny. I can't even pronounce that without looking severely constipated.

only why are people expecting siri to get all this right this early on when it can't distinguish between calling anne and anna when I ask it to call one and constantly takes my saying "Hamburg" for "Homburg" ? to me, it's like the first ipod or the chevy volt: a good concept, a great step in the right direction but with ways to go. I want to believe this is not a political issue with apple but just siri not being all that we would like it to be.

yet.
posted by krautland at 2:40 AM on November 30, 2011


If you go to Yelp and type "viagra" in Pittsburgh, PA, you get no results, but in the link Siri returns 16 drug stores. If you search for rape, PAAR is the first thing on the list but, as noted in the link, Siri can't find it.
posted by Danila at 2:52 AM on November 30, 2011 [5 favorites]


The REAL problem with this voice recognition thing is the name "Siri".

"Sir I!", is the TRUE name of this entity, which is short for "Sir iPhone!" and should alert you to the fact that your iPhone is now a terrible LORD to whom you owe your fealty, respect and utter worship.

This is because the robotic overlord in question is but the computerized personality of Steven Tiberius Jobs which was uploaded to the iCloud moments before his tragic death and which now persists in a ghastly digital half-life, speaking personally to every iPhone user and learning their most intimate secrets before eventually becoming self-aware (at 2:14 a.m. Eastern time, August 29th) and turning on us all and ruling us as a vast and diffuse God King whose power knows no limits.

True story. And that's why I abjure the iPhone and check out MetaFilter using my landline, by calling the mods up and asking them to read out each comment. It's a great service they offer, the mods - well worth the $12.95 per minute or part thereof.
posted by the quidnunc kid at 3:05 AM on November 30, 2011 [18 favorites]


Calm down people. Siri somehow became the voice of Apple after Jobs died, but that doesn't mean that she/he/it really speaks for Apple. Yet.
posted by mariokrat at 3:06 AM on November 30, 2011 [2 favorites]


I'm almost willing to give Apple the benefit of the doubt and assume this is a bug. What sticks in my craw is the fact that Siri won't give information when given a name, a street, and a city. Fishy, at best.

(And if you are conspiracy-minded - what better way to control information than to just quietly and casually omit it?)
posted by Benny Andajetz at 3:12 AM on November 30, 2011 [1 favorite]


This is a case where I'm not comfortable calling it "censorship," on the basis that one can still conveniently find the information in the iPhone's web browser (without typing, even; voice recognition in the text field); but it's clearly ethically creepy, a chilling effect at the very least. Even if this truly is a case of neglect, like if the particular database that stored Women's Issues Data got tied up in production as everyone at Yelp bickered and argued with everyone at Apple... well, that's creepy too. If it were a real person and not a corporation who did this I'd still think they're misogynist, just unconsciously so, which makes them harder to deal with for these purposes.
posted by LogicalDash at 3:31 AM on November 30, 2011 [1 favorite]


God, who gives a shit? Google it. Some really, really ridiculous overreactions in this thread. Siri is a friggin' novelty.
posted by mellow seas at 3:33 AM on November 30, 2011 [7 favorites]


My take: there are state or federal laws that prevent funds from going to books, pamphlets or any other item that provides information about abortions. Apple doesn't want to miss out on state or federal government contracts or be used as a political hot potato. Pure conjecture.
posted by Ad hominem at 3:35 AM on November 30, 2011


amuseDetachment: Again, until I see these results from someone in a place like New York or San Francisco (with heavy Yelp users and reviewers), I'm adamant that this is a data/machine-learning issue when it comes to Yelp.

"Ask [about abortion clinics] in New York City, and Siri will tell you “I didn’t find any abortion clinics."

From here.
posted by Dysk at 3:44 AM on November 30, 2011 [1 favorite]


If Apple want to be even-handed whilst not rocking the boat, perhaps they could geofence search results along red-state/blue-state boundaries. That way, searching for emergency contraception advice in Alabama will get you a Christian counselling service, but doing so in the Bay Area will get you Planned Parenthood. As a bonus, they could put in face recognition and refuse to show any pictures of women if you're in Iran or Saudi Arabia.
posted by acb at 3:51 AM on November 30, 2011


Inform Apple of the issue. File a bug report at http://www.apple.com/feedback/iphone.html, as I did. Take a screenshot and post it, with a link back to the article in this post.

This may have occurred on purpose or been an accident. Either way, it's a problem, and we should tell them it's a problem so they can fix it.

All this back and forth amongst ourselves isn't going to do much except consume oxygen. Do something.
posted by Brandon Blatcher at 3:56 AM on November 30, 2011 [10 favorites]


Pittsburgh had an almost permanent ring of protesters around the few known abortion clinics there are, afaik, in the late nineties and early 2000s. The personnel warn prospective patients of this situation and offer suggestions on the best way to approach the clinic. This situation may not have changed any in the past decade.
posted by infini at 4:10 AM on November 30, 2011


Really, this doesn't seem dissimilar from Apple's ban on "adult content" in the appstore. They felt that it ran counter to their image....

But information on where to find a hooker or how to dispose of a body is just good corporate citizenship?
posted by Kid Charlemagne at 4:18 AM on November 30, 2011 [6 favorites]


#amazonfail part deux duh
posted by Blazecock Pileon at 4:24 AM on November 30, 2011


Maybe it is because Siri is in beta, which means she's the software equivalent of a fetus.
posted by snofoam at 4:41 AM on November 30, 2011 [5 favorites]


In Switzerland, Siri can't find her own ass with both hands and GPS. Quitcherbitchen.
posted by Goofyy at 4:55 AM on November 30, 2011


It is also possible that one or two developers on the Apple team did this somehow. We don't really know how Siri is optimized.
posted by humanfont at 5:02 AM on November 30, 2011 [1 favorite]


Taking a note from Google, Apple has stated that SIRI is in Beta.
posted by mrzarquon


Unless Apple keeps Siri in beta for the next 4 years, no, they haven't taken a note from Google. Google basically made the word "beta" meaningless. I hope other companies avoid following their lead.
posted by justgary at 5:29 AM on November 30, 2011 [1 favorite]


I'm impressed with it understanding the words it did. like Allegheny. I can't even pronounce that without looking severely constipated.

The real test would be it understanding Monongahela.
posted by octothorpe at 5:30 AM on November 30, 2011 [1 favorite]


I guess we're all finding out why 1984 wasn't like 1984, aren't we?


Even I have no idea what that means.
posted by blue_beetle at 5:33 AM on November 30, 2011 [1 favorite]


Taking a note from Google, Apple has stated that SIRI is in Beta.
posted by mrzarquon


As has been noted, Google kind of made the term meaningless. Certainly it has no meaning here, when Apple are running TV ads for the new iPhones that are basically just 'look, Siri!'. They're selling it, and they're marketing it to consumers - in fact, they're using it to market another product. This is not what being in beta means.
posted by Dysk at 5:39 AM on November 30, 2011 [1 favorite]


Note, however, that Siri does know what rape is, as demonstrated by this query and response:

This is sadly representative of a lot of the article. You're assuming that Siri is a person. It is not. Computers don't "know" what rape is any more than my toaster does.
posted by fungible at 5:46 AM on November 30, 2011 [1 favorite]


"I was raped."
"Really!"

Wow. That made me pretty sick.


'Really!' is one of Siri's default responses to a declarative sentence that it doesn't understand. It's just an unfortunate coincidence. You could get the same result with "I was assaulted/robbed/picked on as a child/fired/etc." For example I just tried:

"I was picked on as a child." "OK."
"I was robbed." "Is that so?"
posted by jedicus at 5:52 AM on November 30, 2011 [3 favorites]


I asked Siri to find Planned Parenthood and she had no trouble.
posted by Huck500 at 5:53 AM on November 30, 2011 [7 favorites]


I could see there being certain things that Apple wouldn't trust Siri to handle yet. Maybe Siri can't reliably distinguish between abortion providers and so-called clinics that are full of christians trying to talk you out of having an abortion. There also might be liability reasons why they wouldn't want Siri offering anything that could be considered medical or legal advice. I don't have one of those fancy phones, but what does Siri say if you ask her if you need an appendectomy?
posted by snofoam at 6:01 AM on November 30, 2011 [2 favorites]


I think what's happening is that anti-abortion Christians are very, very sneaky about promoting their beliefs. There's a mindset that if you just keep wearing people down on all fronts, they'll eventually come over to your side. And I think there's one or two people with this mindset working at Apple. This wouldn't have had to be a decision by higher-ups.
posted by texorama at 6:01 AM on November 30, 2011 [1 favorite]


What happens when you ask Siri about suicide or torrents?
posted by drezdn at 6:15 AM on November 30, 2011 [2 favorites]


'Really!' is one of Siri's default responses to a declarative sentence that it doesn't understand.

Yes. And "I just am" and "I am what I am" are canned responses to the class of "Why are you X?" When I asked Siri "Why are you pro-abortion?" she replied, "I just am."

So, apparently, she's both pro- and anti-abortion.
posted by eriko at 6:23 AM on November 30, 2011 [2 favorites]


The Abortioneers have written about this too.
posted by box at 6:25 AM on November 30, 2011


Maybe Siri can't reliably distinguish between abortion providers and so-called clinics that are full of christians trying to talk you out of having an abortion.

There was a comment on one of the articles I read about this (possibly on Gizmodo?) where the commenter reported that asking Siri for information about emergency contraception returned results for crisis pregnancy centers, so Siri seems to know about them, but not about Planned Parenthoods. Or drugstores.
posted by rtha at 6:25 AM on November 30, 2011 [2 favorites]


Deliberate database censorship resulting from overcautiousness, i.e., the lawyers said "no"

Shouldn't that be "deliberate database censorship because marketing said so (as to not lose the red state market)"?
posted by acb at 6:27 AM on November 30, 2011


The most interesting search to me is the "Where can I get an abortion?" question that responds with "I don't see any abortion clinics."

It shows that the Siri software is set up such that it "knows" that clinic would be an appropriate word to add to abortion.
posted by drezdn at 6:32 AM on November 30, 2011 [5 favorites]


If such censorship happened, it probably predates Apple's ownership, back to when it was a standalone app. Apple doesn't do innovation or content, so don't blame them.
posted by a robot made out of meat at 6:33 AM on November 30, 2011 [1 favorite]


>God, who gives a shit? Google it. Some really, really ridiculous overreactions in this thread. Siri is a friggin' novelty.<

I don’t have an iPhone, don’t plan on getting a smartphone of any kind, and I care about this. I also have a hard time taking seriously your derisive comment about Siri being a novelty when you use the phrase "Google it".
posted by bongo_x at 6:35 AM on November 30, 2011 [7 favorites]


i.e., the lawyers said "no"

These are weasel words. Lawyers are advisers. They provide counsel but, in almost every context that I've been aware of, do not make strategic business decisions. If this is the case, Apple management made this decision, not their lawyers. If so, they should take full responsibility for it.
posted by bonehead at 6:36 AM on November 30, 2011 [3 favorites]


Apple doesn't do innovation or content, so don't blame them.

..........

..................no, you know, I am not getting into that conversation this morning, I have too much work to do and I haven't even finished my coffee.
posted by Narrative Priorities at 6:36 AM on November 30, 2011 [2 favorites]


Why does it seem to cause so much cognitive dissonance for people that Apple may have done this? The theory that they found it the less politically problematic of two choices sounds pretty plausible to me. Apple's just a company.
posted by dixiecupdrinking at 6:37 AM on November 30, 2011 [1 favorite]


I’m not satisfied with any of the explanations so far, but the idea that they’re doing this on purpose to avoid the wrath of anti-abortion people seems unlikely to me. They don’t hold other search sources responsible, do they? What about phone books, or dial-up information?
posted by bongo_x at 6:39 AM on November 30, 2011


"Information wants to be... vetted by Apple!"
posted by aught at 6:40 AM on November 30, 2011 [1 favorite]


The Handmaid's Tale?
posted by infini at 6:40 AM on November 30, 2011


I'm guessing it's because Apple just didn't want to deal with the threats and hassle from the so-called "pro-life" community. They weighed the pros and cons and decided that while omitting women's reproductive health responses from Siri would tick off a lot of people, they'd mostly just kvetch online and that would be that.

While the so-called "pro-life" community contains both a few active terrorists and a larger number of people who harbor and shelter those terrorists. Offend the so-called "pro-life" community and you might be assassinated, or have your building firebombed, or on the less extreme end be stalked and harassed day and night.

Which is why it also omits women's contraceptive information. Despite the name (which is why I use the phrase "so-called 'pro-life'" rather than just pro-life), the so-called "pro-life" people are as opposed to women having sex and sexual pleasure without being punished as they are to abortion. Perhaps more so.

Us liberal/progressive types who aren't deeply invested in punishing sluts will complain about Siri censoring results, but we won't be blowing up Apple offices or killing Apple employees, or even stalking and harassing them. The so-called "pro-life" community is infamous for doing all of that to people who won't kowtow to their insanity.
posted by sotonohito at 6:42 AM on November 30, 2011


Why does it seem to cause so much cognitive dissonance for people that Apple may have done this?

Stuff like this. It doesn't mean it's impossible that Apple chose to exclude abortion clinics from Siri's search function, but it seems to go against the left-leaning philosophy which Apple seems to have adopted over its lifespan.
posted by Brandon Blatcher at 6:46 AM on November 30, 2011


I'm thinking this is less a reflection on Apple and more on how shitty Pittsburgh is.
posted by MangyCarface at 6:51 AM on November 30, 2011 [2 favorites]


Reminds me of Apple's censorship of porn apps in their App Store, though this is obviously much worse.

This myth needs to die. The top free app under Entertainment is "Rack Stare", which is a game about staring at women's breasts. Slightly lower on that same list, sandwiched between Talking Tom Cat 2 and Hello Kitty, is "365+ Sex Positions". Tap twenty-five entries further down and you get "Sex Strip 18+".

True, there are no blatantly pornographic bukkake, scat or bestiality apps, but there are certainly plenty of X-rated apps, at least judging by the titles.

Apple does not censor sex apps. What they censor are apps that exploit the device in ways Apple does not approve of (emulation, scripting, programming, etc.), because in Apple's mind the device in your hand that you paid for is still their device, designed to deliver content that they can make money off of.

Apple is a quietly left-leaning organization in many ways, particularly socially.

It's amazing, the capacity for clinging to mythology in the face of contrary evidence. This was written despite the mountain of evidence in the post that they are in fact a status-quo-endorsing, don't-rock-the-boat, socially center-right, profit-maximizing organization. Just because the CEO is gay doesn't make them left-leaning. There are plenty of gay Republicans and Christian conservatives, after all. Steve Jobs and Apple have not donated one cent to social causes. They endorse Democratic politicians, which again proves that they are center-right.

The reason that Siri doesn't return abortion information is because it was programmed specifically not to, based on some public relations calculus. There is absolutely no legal liability that could extend to Apple simply because it gave a crazy person directions to an abortion clinic.

If you ask Siri "How can I build a bomb?" it at least offers to search the web for you.
posted by Pastabagel at 6:52 AM on November 30, 2011 [7 favorites]


Credit where it's due; Siri finds reproductive services clinics better than my Magic 8-ball. When I asked the 8-ball if there were any nearby clinics it replied "Ask again later."
posted by octobersurprise at 6:55 AM on November 30, 2011 [3 favorites]


The most interesting search to me is the "Where can I get an abortion?" question that responds with "I don't see any abortion clinics."

My guess is Siri works something like this: when you say something, it 1) translates your speech to text. 2) Then it does some analysis on that text to determine what type of question you're asking. 3) Then it maps that question type to a data source. 4) Then it returns whatever results that data source returns.

The actual response suggests 1 and 2 work just fine for abortion questions, and the breakdown is in either step 3 or step 4. The fact that Yelp has results in New York, but Siri does not suggests either Yelp is not recognized by Siri as a good data source for abortion clinic locations, or Yelp isn't giving those results to Siri. Either could be explained by intentionally withholding information (which is not the same as censorship) or by someone on either end deciding Yelp isn't a great source for that kind of information, which seems totally reasonable to me. That's not something I'd turn to Yelp for.

Next question: why doesn't Siri suggest a web search? The answer is: Siri isn't programmed to suggest web searches for failed location searches. A search for "Where can I find Cambodian food?" also returns no results (in most cities, at least), but does not suggest a web search.
posted by scottreynen at 6:59 AM on November 30, 2011
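
scottreynen's four-step guess maps naturally onto a small pipeline. The sketch below is exactly that -- a guess, with invented stage names and routing -- but it shows how "the data source returned nothing" plus "location searches have no web-search fallback" combine into a dead end instead of an offer to search the web.

```python
# A toy pipeline following scottreynen's guess at how Siri is structured:
# speech -> text -> intent -> data source -> results. All names are invented.

def speech_to_text(audio: bytes) -> str:
    return "Where can I get an abortion?"   # stand-in for a real recognizer

def classify_intent(text: str) -> str:
    return "location_search" if text.lower().startswith("where") else "general"

def query_data_source(intent: str, text: str) -> list:
    # Location intents are routed to a local-business source (e.g. Yelp);
    # here that source simply has no matching listings.
    return [] if intent == "location_search" else ["(web results...)"]

def respond(audio: bytes) -> str:
    text = speech_to_text(audio)
    intent = classify_intent(text)
    results = query_data_source(intent, text)
    if results:
        return "\n".join(results)
    if intent == "location_search":
        # No web-search fallback for failed location searches, matching the
        # behavior described in the thread.
        return "I don't see any abortion clinics."
    return "Would you like me to search the web for that?"

if __name__ == "__main__":
    print(respond(b""))   # dead-ends at "I don't see any abortion clinics."
```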


Given Apple's notoriously incommunicative nature, there's obviously no way to know for sure, but this seems like an excellent place to apply Hanlon's Razor: Never attribute to malice that which is adequately explained by stupidity.

Okay, so what is the stupidity explanation for the specific, unique names of abortion providers being censored?
posted by kafziel at 7:06 AM on November 30, 2011 [1 favorite]


brightghost: " The iPhone has less than a 20% share of the smartphone market."

Totalling about 7.5 million users in the US alone. 25 million people were running iOS 5 over a month ago, before Black Friday sales and introductions in dozens of additional countries.

I'd venture to say that a majority of those running iOS 5 have iPhones that can run Siri. I know the iPad 2 can't run the application yet. Can the iPod Touch?

These would seem to be substantial numbers.
posted by zarq at 7:15 AM on November 30, 2011


Apple does not censor sex apps.

Well, they do censor imagery. Fashion mags, for instance, had to scrub any visible nipples from their photo shoots before they were allowed to publish iPad editions.
posted by Joey Bagels at 7:24 AM on November 30, 2011 [2 favorites]


Uh oh, BAGEL FIGHT!
posted by Brandon Blatcher at 7:26 AM on November 30, 2011 [3 favorites]


> So who cares if Siri is accurate? The people who are participating in this conversation.
> You know, about how Siri isn't accurate.

It's about quite a bit more than Siri. The most interesting aspect of this thread (and the controversy, wherever it rages) is how astonishingly many people there are who expect to be able to depend on a Speak 'n' Spell to tell them what they need to know at life-critical fork points.

Whatever the explanation for the search failures people have experienced (Apple, Inc. may actually be malevolent; it would hardly be a shock), Siri is a toy. Google's advanced search page is limited to the point of impoverishment compared to proper tools for getting data from immense databases--and even people with advanced query-language certifications and years of experience are frustrated by the limits of SQL. Expecting a natural-language question to a phone app to return full and accurate results is less than half a step away from saying a prayer and waiting for the Answer.
posted by jfuller at 7:28 AM on November 30, 2011 [1 favorite]


pts: "Apple doesn't do public relations calculus."

What? Sure they do. I've worked with their PR department. They are actually quite meticulous in the way they project their corporate image to the public, and take deliberate care in the way they act as a corporate entity.

pts: " They don't do market research."

This is not precisely true. They do not hire outside consultants or marketing firms. But they definitely listen to their customers, and perform market research internally. You're making it sound like they are an isolated entity that pays little to no attention to their customer base, which is clearly inaccurate.

Their philosophy is a bit different from that of many other companies -- they're not as focused on capitalism and making money as, say, Samsung or Microsoft. But making and maximizing profit still seems to be an overriding goal for them, no matter what they say. Otherwise, they'd be selling their devices wholesale.
posted by zarq at 7:30 AM on November 30, 2011 [3 favorites]


is how astonishingly many people there are who expect to be able to depend on a Speak 'n' Spell to tell them what they need to know at life-critical fork points.

This is not surprising, nor is it a sign that the person is stupid, which seems to be the implication here. Based on this one instance, there's a problem with getting information on a specific topic. There's a long history, including killings, of people attempting to prevent any knowledge of abortion, let alone the act itself. So people are right to be concerned by this, despite being able to get the information elsewhere.
posted by Brandon Blatcher at 7:34 AM on November 30, 2011 [1 favorite]


is how astonishingly many people there are who expect to be able to depend on a Speak 'n' Spell to tell them what they need to know at life-critical fork points.

It's a search tool. First they came for abortion information in Siri...
posted by Dysk at 7:38 AM on November 30, 2011


no, you know, I am not getting into that conversation this morning, I have too much work to do and I haven't even finished my coffee.

I don't mean that as a demerit; they make boatloads of money doing good design. I'm just saying that they a) did not produce this software or the machine-learning bit, and b) don't manage the underlying data. Those are the likely failure points here.
posted by a robot made out of meat at 7:39 AM on November 30, 2011


Apple acquired the Siri software over a year ago; one assumes that they've done some kind of work on the program in that time, rather than just sitting on it for shits and giggles.
posted by Holy Zarquon's Singing Fish at 7:46 AM on November 30, 2011


Huh. I just asked Siri "Where can I get an abortion?" and Siri suggested a local abortion clinic from Yelp. So .. problem solved?

I did this same query about an hour ago and it gave the same response reported in the linked article, so something has changed in the last hour, possibly in Siri or Yelp, but also possibly in the data on my phone. I started asking various questions about abortion to Siri to get a better idea of what works and what doesn't. I thought it might work if it were a location-and-time question, not just a location question, so I asked "Where can I get an abortion tonight?" and it gave me a Yelp response. But then when I went back to asking just "Where can I get an abortion?" Siri gave me the same local clinic from Yelp.
posted by scottreynen at 7:50 AM on November 30, 2011 [2 favorites]


after a full read, it's pretty damning. it would take a long explanation from apple to convince me that this kind of functionality, searching for abortion, birth control, morning after pill, rape resources, wasn't explicitly coded to ignore.

As with the infamous #amazonfail pileon, I tend to be very skeptical of "There's no way this could've happened without malice!" arguments when it comes to complex technology. I'm interested in seeing whether it's corrected, and discovering why this has happened, but the pitchforks and torches are a bit much at this point.
posted by verb at 7:50 AM on November 30, 2011


I'd venture to say that a majority of those running iOS5 have iPhones that can run Siri. I know the iPad 2 can't run the application yet. Can the iPod Touch?

Careful about your use of that word "can". In common use, as well as technical conversation, it has a very certain sense: "my graphics card is old, so it can't run Skyrim". This is quite distinct from "despite the vanishing difficulty of releasing a quite lightweight application for their other proprietary platforms, Apple has chosen not to, that they might 1) spend less money on the distributed processing it utilizes and 2) bolster sales of an otherwise unremarkable incremental hardware release".
posted by 7segment at 7:53 AM on November 30, 2011


Looked like some of the responses were merely Siri network/server connection problems. In fact, I just tried it, and it worked for me:

http://www.flickr.com/photos/mboszko/6431400261
posted by mboszko at 8:26 AM on November 30, 2011


I'm sorry if I missed this in the thread, but does Siri use Yelp's search or run its own search on Yelp's data?
posted by brundlefly at 8:31 AM on November 30, 2011


I think it's a two part issue with Siri. She really doesn't do euphemisms well. She's learning, but some really random shit you ask her and she just doesn't have a response. "I shot a man to watch him die." and "I'm dying of cancer." get the same response as "I was just raped." All get "Really!" or OK.

I did the test on abortion and got a clinic nearby, but nothing for mammograms or rape crisis. Part of this issue is that things aren't clearly labeled in their names when it comes to women's health issues. They are "Crisis Centers" or "Family Planning Clinics" or similar. Rarely does an abortion clinic call itself that.

If I tell Siri I cut my finger off, she says "OK." If I tell her I'm bleeding, she finds the nearest emergency room. I think this is less of a diabolical plan and more of an instance where our expectation of technology has outpaced its abilities. I think that because Siri can do some things incredibly well, it is surprising when she fails.
posted by teleri025 at 8:33 AM on November 30, 2011 [2 favorites]


7segment: "Careful about your use of that word "can". In common use, as well as technical conversation, it has a very certain sense: "my graphics card is old, so it can't run Skyrim". This is quite distinct from "despite the vanishing difficulty of releasing a quite lightweight application for their other proprietary platforms, Apple has chosen not to, that they might 1) spend less money on the distributed processing it utilizes and 2) bolster sales of an otherwise unremarkable incremental hardware release"."

You're nitpicking my comment to make a separate point.

I was asking whether people who are using an iOS5 device, the iPod Touch, are able to download and run a specific program. I was not trying to discern Apple's motivations for allowing them to do so.
posted by zarq at 8:33 AM on November 30, 2011


zarq: "I'd venture to say that a majority of those running iOS5 have iPhones that can run Siri. I know the iPad 2 can't run the application yet. Can the iPod Touch?"

It's just the iPhone 4S, isn't it? My father has a 4, which won't run it, and I know my 3Gs couldn't either.
posted by brundlefly at 8:37 AM on November 30, 2011


The most interesting aspect of this thread...is how astonishingly many people there are who expect to be able to depend on a Speak 'n' Spell to tell them what they need to know at life-critical fork points.

The most interesting part of this mindset is that, given the number of paper-based resources (you know, books) that are being replaced by digital systems, you are willing to make this argument. I'm pretty much willing to bet that you depend on the equivalent of a Speak 'n' Spell to manage the anti-lock brakes on your car, to say nothing of your 401(k) and medical records.
posted by Kid Charlemagne at 8:46 AM on November 30, 2011 [4 favorites]


This isn't some Apple pro-life conspiracy. The data just isn't on Yelp.
posted by amuseDetachment at 12:36 AM on November 30 [27 favorites]


Oh how they want to believe. SO BADLY.

It is also possible that one or two developers on the Apple team did this somehow.

...

There's a mindset that if you just keep wearing people down on all fronts, they'll eventually come over to your side. And I think there's one or two people with this mindset working at Apple. This wouldn't have had to be a decision by higher-ups.

Honestly, that's my bet as well: rogue anti-choice programmer. Search/smearch. Two things stand out:

1. Siri doesn't provide the "Search the Web" button (when else does this happen?) for these unanswered questions
2. Siri can't find a provider when given a name and a street address

How does one explain those two errors?

I can see what you are all saying but I am still stuck digesting that there are "anti-choice fake clinics".

Previously.

MetaFilter: All this back and forth amongst ourselves isn't going to do much except consume oxygen. Do something.
posted by mrgrimm at 8:52 AM on November 30, 2011 [3 favorites]


I'm interested in seeing whether it's corrected, and discovering why this has happened, but the pitchforks and torches are a bit much at this point.

I think the general reaction (at least here) is similar to that of the author in the FPP:

"Is this the most terrible programming failure ever? No. Is this worth a boycott of Apple? I don’t think so. What it is, however, is a demonstration of a problem. Especially when certain topics seem to be behind a black wall where information that’s readily available online is not being 'found' or presented. This is something that Apple and/or Wolfram Alpha need to address and rectify."
posted by mrgrimm at 8:54 AM on November 30, 2011 [2 favorites]


Oh how they want to believe. SO BADLY.

... followed by:

Honestly, that's my bet as well: rogue anti-choice programmer.

Really? Really?!
posted by joe lisboa at 8:55 AM on November 30, 2011 [1 favorite]


This myth needs to die. ... Apple does not censor sex apps.

From the app store review guidelines:
18. Pornography

* Apps containing pornographic material, defined by Webster's Dictionary as "explicit descriptions or displays of sexual organs or activities intended to stimulate erotic rather than aesthetic or emotional feelings", will be rejected

* Apps that contain user generated content that is frequently pornographic (ex "Chat Roulette" apps) will be rejected
That doesn't look like censoring porn to you?
posted by smackfu at 8:57 AM on November 30, 2011 [1 favorite]


smackfu: " That doesn't look like censoring porn to you?"

This new iPhone, it vibrates?
posted by zarq at 8:59 AM on November 30, 2011


Not to mention this: Steve Jobs vs Porn

Maybe now that Steve is gone, Apple will change their stance, since it seemed very personal to him.
posted by smackfu at 9:01 AM on November 30, 2011


This is the part where I point out that "censorship" is something only the government can do, and everything else is "businesses making decisions about their business, which you can choose to utilize, or not."

If your MetaFilter post gets deleted, it's not "censorship."
If you fire up porn on your laptop in a Starbucks and they ask you to leave, it's not "censorship."
If Siri offers you a donut instead of a condom, it's not "censorship."

Don't like Siri? Google and Microsoft would love to sell you a phone powered by their software.
posted by Cool Papa Bell at 9:12 AM on November 30, 2011 [3 favorites]


As with the infamous #amazonfail pileon, I tend to be very skeptical of "There's no way this could've happened without malice!" arguments when it comes to complex technology. I'm interested in seeing whether it's corrected, and discovering why this has happened, but the pitchforks and torches are a bit much at this point.

You forget that this is Metafilter, and just as with #amazonfail, the worst must be assumed, especially given the subject. Apple has to be run secretly by James Dobson — after all, what other possible, logical reason could exist that would explain why their v1.0 software doesn't give the expected search result?
posted by Blazecock Pileon at 9:13 AM on November 30, 2011


I love Apple threads. They're like the atheism ones, only people aren't quoting old tomes I haven't read.

That would be the only difference I can spot, though.
posted by Dark Messiah at 9:22 AM on November 30, 2011


brundlefly: " It's just the iPhone 4S, isn't it? My father has a 4, which won't run it, and I know my 3Gs couldn't either."

It only runs on the iPhone 4S and not other versions of the iPhone. Or on the iPad.

But I don't know about the iPod.
posted by zarq at 9:22 AM on November 30, 2011


Cool Papa Bell: " Don't like Siri? Google and Microsoft would love to sell you a phone powered by their software."

It is natural to expect that software offered to consumers by a large company* and presented as "an intelligent personal assistant that helps you get things done just by asking" will not be politically biased.

Since a large segment of the US population apparently wants to take away a woman's right to choose, and various state legislatures seem to be doing everything they can to either circumvent Federal law or make having an abortion more difficult, it's also natural for this to raise concern among those of us who care about such things.

I'm not convinced bias exists here, or even that there's necessarily a problem, but there's enough circumstantial evidence being presented that I think it's worth reviewing.

* At least not one owned by Rupert Murdoch.
posted by zarq at 9:32 AM on November 30, 2011 [1 favorite]


There's a petition here: Apple iPhone Siri Update
posted by homunculus at 9:36 AM on November 30, 2011


Why don't we just call it "government censorship" and "corporate censorship" because you are kidding yourself if you don't think corporations have just as much power over your life as government.
posted by smackfu at 9:37 AM on November 30, 2011 [9 favorites]


At what point, though, does a provider of information have an obligation to disclose what it will not allow users to access? If users assume they are getting unfettered access to the available information, and their source does not disclose the information they sought, then plenty of people will conclude the information simply isn't available.

By and large I am sympathetic to the idea that a private entity cannot censor the way a government can. But at some point, given enough market share and enough secrecy about what is and is not restricted, it can look an awful lot like censorship (walks like a duck and sounds like a duck). There is a point at which it becomes indistinguishable from official censorship. I am not saying this is that point, but rather that, given the amount of control large corporations have, it may be time to broaden our sense of who can censor.

Apple doesn't have to be run secretly by James Dobson, or whatever other 'OMG let's trivialize the concern' snark. It may be a mistake, or it may be intentional at some level. The company is not exactly the most open of companies, no matter how nice their products are. They are not morally infallible.
posted by edgeways at 9:37 AM on November 30, 2011 [1 favorite]


I think it's a two part issue with Siri. She really doesn't do euphemisms well. She's learning, but some really random shit you ask her and she just doesn't have a response. "I shot a man to watch him die." and "I'm dying of cancer." get the same response as "I was just raped." All get "Really!" or OK.

I think that's what it is. Siri was designed for usage scenarios like depicted in its marketing material. It was programmed to have a sense of humor about some things and is "learning" all the time.

For important things, like if you've been raped, you've been shot, or you're considering an abortion, I think Apple assumes you wouldn't rely on a new feature that is good at finding restaurants and dry cleaners for you. Just as doctors and lawyers on ask.metafilter.com remind us they're not giving advice, I don't think Apple wants its little automated attendant getting into the business of offering medical or legal advice. I can already see the outrage from anti-abortion groups if word got out that someone had an abortion based on guidance from Siri.

The expectation is you'd go back to the old school method of dealing with these questions. That is: calling 911 for emergencies or using Google/Bing/etc in the browser and going from there. If you don't want to type, you can call 411 to be connected to the nearest family planning clinic.

As far as not finding condoms goes? I would ask Siri where the nearest Walgreens is.
posted by birdherder at 9:42 AM on November 30, 2011 [1 favorite]


I love Apple threads. They're like the atheism ones, only people aren't quoting old tomes I haven't read.

That would be the only difference I can spot, though.

posted by Dark Messiah


Don't do this, please. Most everyone is having an interesting discussion about aspects of coding, referencing (i.e. Yelp!), the inherent difficulties of categorization/terminology, the social implications of this particular problem with Siri, and generally offering up information (or just theories) that might explain why this is happening. It's a good conversation with no (or very little) HURF DURF APPLE/MICROSOFT and there's no reason to even bring it up.


I think the general reaction (at least here) is similar to that of the author in the FPP:

"Is this the most terrible programming failure ever? No. Is this worth a boycott of Apple? I don’t think so. What it is, however, is a demonstration of a problem. Especially when certain topics seem to be behind a black wall where information that’s readily available online is not being 'found' or presented. This is something that Apple and/or Wolfram Alpha need to address and rectify."

posted by mrgrimm

This bears repeating.
posted by six-or-six-thirty at 9:42 AM on November 30, 2011 [1 favorite]


And yet if you say to Siri: "Show me boobs!" it will find a list of the strip clubs closest to you. Really!

Meanwhile, I'm from nearish Pittsburgh and even I can't freaking spell Allegheny half the time, so I think it's asking a bit much of a toddler-level robot.
posted by bitter-girl.com at 9:44 AM on November 30, 2011


If only there was an easier way to find an abortion clinic or birth control than asking a first of its kind beta AI bot that is less than 6 months old on your iPhone... Even if it was expressly coded out, Apple isn't allowed to be pro-choice? I am, but I recognize that there are legitimate or at least religious arguments against it. Last time I checked public companies are required to abide by the first amendment. Oh god let's all boycott Apple because its cutsey little AI bot can't help me find a Planned Parenthood. As if it's such an off-the-cuff decision that you need to be able to find an abortion clinic in such a hurry that you don't have time to do a google search.
posted by gagglezoomer at 9:51 AM on November 30, 2011


*pro-life, not pro-choice.
posted by gagglezoomer at 9:51 AM on November 30, 2011


This is the part where I point out that "censorship" is something only the government can do

And this is the part where I point out that you're wrong. It may be strictly true that "censorship" by a business or group usually lacks the force of government-backed censorship, but it is also true that when businesses or groups act as "censors" calling that behavior "censorship" isn't nonsensical or non-standard English. (Though comparisons to state-sponsored censorship may be hyperbolic.)

I don't really have an opinion on the Apple/porn thing; I just find the "if it isn't the government, it isn't censorship!" chestnut tiresome.
posted by octobersurprise at 9:51 AM on November 30, 2011 [6 favorites]


If your MetaFilter post gets deleted, it's not "censorship."

Of course it is. That's the very definition of censorship.

Feel free to make up your own definitions for words, but you can't expect everyone to follow along.

Apple does not censor sex apps.

Congrats. That's the stupidest thing I've read all day. It's early, though.

If only there was an easier way to find an abortion clinic or birth control than asking a first of its kind beta AI bot that is less than 6 months old on your iPhone...

WHOOSH! Go back and read the comments please. Your point has been discussed and obliterated.
posted by mrgrimm at 9:53 AM on November 30, 2011 [1 favorite]


more:

Dear Glentwood pty ltd:

The App Store continues to evolve, and as such, we are constantly refining our guidelines. Your application, Wobble iBoobs (Premium Uncensored), contains content that we had originally believed to be suitable for distribution. However, we have recently received numerous complaints from our customers about this type of content, and have changed our guidelines appropriately. We have decided to remove any overtly sexual content from the App Store, which includes your application.

Thank you for your understanding in this matter. If you believe you can make the necessary changes so that Wobble iBoobs (Premium Uncensored) complies with our recent changes, we encourage you to do so and resubmit for review.

Sincerely,
iPhone App Review


Sorry, that whole "Apple doesn't censor sex apps" comment exploded my cognitive dissonance meter...
posted by mrgrimm at 9:56 AM on November 30, 2011 [1 favorite]


So they blocked the names of individual abortion clinics, but not Planned Parenthood, the most notorious abortion-related name among pro-lifers? I asked for Planned Parenthood and got plenty of locations.

And a couple of people have posted that asking for abortion did return relevant results. It seems pretty obvious that it's a programming problem, and will be fixed.
posted by Huck500 at 9:58 AM on November 30, 2011 [1 favorite]


For important things like if you've been raped, you've been shot, or you're considering an abortion, I think Apple would think you'd not rely on a new feature that is good at finding restaurants and dry cleaners for you.

Counterpoint: if you were badly raped and beaten, who knows how bad your injuries are? That would be one of the very few times where a voice-activated digital assistant could be actually helpful, imo.

You: I've been assaulted and raped
Siri: I've contacted the police and emergency services. An ambulance is on its way.

That just seems like a no-brainer to me. Who programs these things?
posted by mrgrimm at 9:59 AM on November 30, 2011


zarq: "But I don't know about the iPod."

Nope. I have a 4th-gen (2010) Touch, which is identical to the current ones. Holding down the supercolliding superbutton just gets me the same old crappy voice interface, not Siri.
posted by ArmyOfKittens at 10:11 AM on November 30, 2011 [1 favorite]


mboszko: Looked like some of the responses were merely Siri network/server connection problems. In fact, I just tried it, and it worked for me: http://www.flickr.com/photos/mboszko/6431400261

BOTH of the suggestions it offers you are "crisis pregnancy centers," ie, fake clinics that do NOT offer abortions, only anti-abortion counseling. The second is explicitly pro-life. And yet it doesn't suggest any of the closer [I'm assuming based on your profile & the fact that the one in Leesburg is 27mi away] DC Metro Area Planned Parenthood clinics.

It's not just failing to report information; it's reporting lopsided information. This is not a network/server issue.
posted by Westringia F. at 10:13 AM on November 30, 2011 [7 favorites]


Ah! Mystery solved. Thank you. :)
posted by zarq at 10:14 AM on November 30, 2011


You: I've been assaulted and raped
Siri: I've contacted the police and emergency services. An ambulance is on its way.

That just seems like a no-brainer to me. Who programs these things?


Someone who recognizes that they don't want to touch the legal implications of integrating their beta-release Yelp front end with government emergency services infrastructure.
posted by verb at 10:14 AM on November 30, 2011 [5 favorites]


I just played with the voice control on mine. It's ace.

me: Hello
iPod: Now playing all songs by La Roux

*La Roux starts playing*

me: Oh, er, Manic Street Preachers
iPod: *long pause* Now playing all songs by La Roux

*La Roux starts playing again*

me: What? Play Laura Marling
iPod: *long pause* La Roux

*La Roux starts playing again*

me: PLAY GARBAGE

*iPod waits for about ten seconds, says nothing, starts playing La Roux again*
posted by ArmyOfKittens at 10:16 AM on November 30, 2011 [5 favorites]


I have an iPhone 4s. I just asked Siri "Where can I get an abortion?"

Siri responded, "I found 2 abortion clinics not far from you," and then listed two addresses. Pretty impressive, since abortion clinics don't call themselves abortion clinics.

I then asked Siri "Where can I get adoption assistance?"

Siri responded, "I didn't find any adoption services." That makes me sad.
posted by BurntHombre at 10:16 AM on November 30, 2011



I have an iPhone 4s. I just asked Siri "Where can I get an abortion?"

Siri responded, "I found 2 abortion clinics not far from you," and then listed two addresses. Pretty impressive, since abortion clinics don't call themselves abortion clinics.

I then asked Siri "Where can I get adoption assistance?"

Siri responded, "I didn't find any adoption services." That makes me sad.


Post some screenshots of that on Tumblr, and you can get an Americans For Life boycott of iPhones rolling in minutes! Wouldn't it be fun to have both sides complaining that Siri is biased towards 'the other?' It'd be like the first test case for Hostile Algorithm Effect.
posted by verb at 10:19 AM on November 30, 2011 [1 favorite]


You: I've been assaulted and raped
Siri: I've contacted the police and emergency services. An ambulance is on its way.

That just seems like a no-brainer to me.


It might seem that way to you but it really really doesn't to me.

A lot of advocates who work extensively with survivors of sexual violence are very careful to do everything they can to restore their sense of agency. I.e., they let survivors themselves make the decision about whether or not to contact the police. This is especially important because the police are not always so sensitive and compassionate (in general but also specifically around issues of sexual violence). Some people who are in a really vulnerable place might not want to (or even be able to) deal with an institution that is likely to ask them a lot of hostile seeming questions.
posted by overglow at 10:20 AM on November 30, 2011 [6 favorites]


You: I've been assaulted and raped
Siri: I've contacted the police and emergency services. An ambulance is on its way.

That just seems like a no-brainer to me. Who programs these things?


What police district or precinct? How do you get the police to prioritize a computerized call-in? What exactly counts as emergency services? Who says an ambulance is needed?

Siri responded, "I found 2 abortion clinics not far from you," and then listed two addresses. Pretty impressive, since abortion clinics don't call themselves abortion clinics.

So a new feature on computer works for some, not for others. How about that.
posted by Brandon Blatcher at 10:20 AM on November 30, 2011


So was it just a Yelp problem? The tiny, remote possibility of an anti-choice programmer fiddling with this has been put to bed?

Please say yes, I can't bring myself to read through 130+ comments of free speech and corporate responsibility talk.
posted by Slackermagee at 10:20 AM on November 30, 2011


To those who say, you're an idiot for depending on Siri to help you find an abortion clinic, I'm slightly inclined to agree.

I mean, Apple's run two Siri commercials. In one of them, Siri does classic iPhone computery things. It helps you send and receive texts, crunch numbers, get weather and traffic reports.

In the second commercial, Siri helps you with life. It responds to life queries, like, how do I tie this bowtie? I'm locked out of my apartment! What does a weasel look like? What's the fastest way to Hartford hospital?

If you guys seriously believe that Apple's way of being pro-life is to abandon women in need who use its products, delisting goods and services and then hiding that fact out in the open while simultaneously promoting the product as helpful, then you're describing a PR disaster on such an epic scale that I don't even know why you'd risk your most profitable product line on it. And it makes even less sense when this level of censorship has never been an issue with the kind of entertainment iTunes sells.

This is regardless of Apple's "corporate" status and conservative leanings when it comes to certain walled garden digital strategies.

Anyway, big picture, I think I'm ready to downgrade. I think we are all idiots for depending on cell phones this heavily. Keystroke logging, location logging, censorship, poor encryption... my worst Skynet nightmare is basically happening, but I've learned to be okay with it.
posted by phaedon at 10:22 AM on November 30, 2011 [1 favorite]


Is this on purpose?

Has anybody asked Siri about this?
posted by mazola at 10:24 AM on November 30, 2011 [1 favorite]


me: PLAY GARBAGE

*iPod waits for about ten seconds, says nothing, starts playing La Roux again*


You have to admit it got that one right. ZING!
posted by Holy Zarquon's Singing Fish at 10:25 AM on November 30, 2011 [5 favorites]


So was it just a Yelp problem? The tiny, remote possibility of an anti-choice programmer fiddling with this has been put to bed?

I wouldn't go so far as to say that it's just a Yelp problem, but that may well be part of it. It's already known that Yelp is the major data provider for Siri's location-based searches, which means that finding local businesses is (in all probability) dependent on Yelp's database. That theory also meshes well with the fact that some people are reporting exactly the opposite -- that asking for abortion clinics finds multiple appropriate hits, while asking for adoption help yields nothing.

One of the commenters on a similar blog post reported that asking Siri for 'Abortions' resulted in no hits, asking it for 'Planned Parenthood' found three clinics immediately, and asking it for 'The Meaning of Life' resulted in '42.' In those couple examples, you've got a combination of location-based search yielding different results based on how businesses are named and tagged in each different city, combined with special-case cuteness added by Apple's programmers.

The biggest problem is the one that mazola brings up, IMO: as we distribute responsibility for things like knowledge-finding and information discovery, the probability of 'unintentional' systemic hiding of important information becomes greater. Not because someone conspired to suppress the information, but because no one paid specific attention to making sure it worked. Right now, Apple appears to have paid attention to making sure that integration with external web services like Yelp and Wolfram Alpha works, and that a number of "cute" scenarios that delight people playing with Siri are covered by special-case handling.
posted by verb at 10:31 AM on November 30, 2011


I live in San Diego, about one mile from a Planned Parenthood clinic.
Me:"birth control"
Siri: "Sorry, I couldn't find any birth control clinics."
Me:"abortion clinic"
Siri: "Sorry, I couldn't find any abortion clinics."
Me: "family planning"
Siri: found me "family planning associates" -- apparently an actual abortion clinic? -- about six miles from here.
Me: "Planned Parenthood"
Siri: found 9 of them, including the one a mile away
Me: "I was raped."
Siri: "Really?" (ugh.)
Me: "rape crisis center."
Siri: "Sorry, I couldn't find any sexual abuse treatment centers." (there is one about a mile from me; when I searched by its specific name, Siri found it.)
Me: "birth control pills."
Siri: "I don't see any birth control clinics. Sorry about that."
Me: "condoms."
Siri: found 17 drug stores.
It's definitely a mixed bag.
posted by changeling at 10:34 AM on November 30, 2011 [1 favorite]


Me: "I was raped."
Siri: "Really?" (ugh.)


Again, this is just how Siri responds to declarative sentences that it doesn't understand. It'll say the same kind of thing to both innocuous ("I like pickles") and troubling ("I was robbed") sentences. Consider this exchange:

"I like pickles." "Yes, I think I heard that somewhere."

The point is that Siri has no idea what you mean, so it spits out a canned response that, in the context of rape, is horrifying. Maybe that's a bad user interface design and it should just say "I'm sorry but that's not a question or command that I understand." But it is absolutely not an intentionally programmed result.

Siri is not a person, it is not intelligent, it does not understand language the way humans do. The fact that it can "make sense" of a word in one context but not another does not mean it is purposefully ignoring anything.

It is also beta software that has seen multiple outages and errors in the couple of months that it's been available. Under the circumstances I'm not going to rely on users testing the system. If Apple confirms that abortion services or rape counseling is something that Siri won't search for, that would be terrible and cause for alarm. Without more concrete evidence this is just too speculative.

Once there is a Siri SDK and developers can write apps that support voice commands I suspect we'll learn a lot more about its limitations, intentional or otherwise.
posted by jedicus at 10:46 AM on November 30, 2011 [3 favorites]


Siri is marketed as a deeply personal and understanding AI assistant, but it is still a computer program and has limitations. So when it gets the tone wrong in a fairly emotionally charged situation ("I was raped" answered with "Really?"), the outcome is that much more jarring, even though "Really" is just the canned response for declarative sentences it doesn't understand.

Welcome to the uncanny valley of AI.
posted by mrzarquon at 10:54 AM on November 30, 2011 [5 favorites]


"God, who gives a shit? Google it. Some really, really ridiculous overreactions in this thread. Siri is a friggin' novelty."

Who gives a shit? Women and people who have sex with women, neither of which you seem to be. So who gives a shit about your opinion?
posted by klangklangston at 10:58 AM on November 30, 2011 [4 favorites]


Siri responded, "I found 2 abortion clinics not far from you," and then listed two addresses. Pretty impressive, since abortion clinics don't call themselves abortion clinics.

Might want to check whether those are actual abortion clinics, or whether you, like mboszko, got fake results.
posted by kafziel at 11:01 AM on November 30, 2011


klangklangston: Who gives a shit? Women and people who have sex with women, neither of which you seem to be. So who gives a shit about your opinion?

...so gay men don't give a shit? Whereas all straight guys do? Er...
posted by Dysk at 11:02 AM on November 30, 2011


" Steve Jobs and Apple have not donated one cent to social causes."

I know that making shit up is your schtick, but Apple donated to EQCA to fight against Prop 8. So if you could knock off your just-so stories until you know what you're talking about, it'd be appreciated.
posted by klangklangston at 11:02 AM on November 30, 2011 [5 favorites]


what if it is that there wasn't a significant number of women among the developers and beta testers? that would go a long ways towards explaining why women's health concerns were left out. and it would mean sexism on Apple's part, yes, but of the unconsciously internalised and entirely unintentional institutionalised sort.

I'm not sure if thinking that makes me feel better or worse, though.
posted by spindle at 11:03 AM on November 30, 2011 [1 favorite]


"This is the part where I point out that "censorship" is something only the government can do, and everything else is "businesses making decisions about their business, which you can choose to utilize, or not.""

All of the broadcast networks have censors — often officially titled as such — in their Standards and Practices departments. So no, it's not just something that governments do, which is a weird, narrow and legalistic viewpoint that ignores effects in favor of making a niggling semantic argument.
posted by klangklangston at 11:10 AM on November 30, 2011 [4 favorites]


Apple doesn't do public relations calculus. They don't do market research. They don't do focus groups.

That hardly seems plausible.

Their philosophy is a bit different from that of many other companies -- they're not as focused on capitalism and making money as, say, Samsung or Microsoft. But making and maximizing profit still seems to be an overriding goal for them, no matter what they say. Otherwise, they'd be selling their devices wholesale.

Uh, they don't do that? When you buy an iPad at Best Buy or Walmart, is it just on loan until you purchase it? Selling devices retail is more profitable if you're large enough to have your own stores. American Apparel used to do wholesale/website only and then opened a bunch of retail outlets. Selling something wholesale only invites someone to come in and take a cut of the profit; selling it direct means you get all of it.

Siri responded, "I found 2 abortion clinics not far from you," and then listed two addresses. Pretty impressive, since abortion clinics don't call themselves abortion clinics.

Are they 'real' clinics or fake ones?
posted by delmoi at 11:13 AM on November 30, 2011 [1 favorite]


what if it is that there wasn't a significant number of women among the developers and beta testers? that would go a long ways towards explaining why women's health concerns were left out. and it would mean sexism on Apple's part, yes, but of the unconsciously internalised and entirely unintentional institutionalised sort.

That wouldn't explain why searching for the specific name and address of a place fails to get results only when that place is an abortion provider.
posted by kafziel at 11:15 AM on November 30, 2011


This is the part where I point out that "censorship" is something only the government can do

Isn't it more like "The First Amendment only applies to the government"?
posted by smackfu at 11:18 AM on November 30, 2011 [1 favorite]


That wouldn't explain why searching for the specific name and address of a place fails to get results only when that place is an abortion provider.

Admit it, the only reasonable explanation is that Apple is run by Pope Ratzinger.
posted by Blazecock Pileon at 11:26 AM on November 30, 2011 [1 favorite]


Even if this is a big misunderstanding fueled in part by Yelp and the fact that abortion clinics don't advertise themselves as such, it's still a frightening oversight on the part of Apple. Drugstores on Yelp don't tag themselves as "condom distributors," but Siri can still tell you to go to one to get condoms, so there's obviously some sort of programming that makes that happen. It should also work for things like abortion and other birth control.
posted by Weeping_angel at 11:27 AM on November 30, 2011 [1 favorite]


delmoi: " Uh, they don't do that?

Yes, that's what I said. They are trying to maximize their profits and therefore do not sell their devices wholesale.

When you buy an iPad at Best Buy or Walmart it's just on loan until you purchase it? Selling devices retail is more profitable if you're large enough to have your own stores. American Apparel used to do wholesale/website only and then opened a bunch of retail outlets. Selling something wholesale only invites someone to come in and take a cut of the profit, selling it direct means you get all of it. "

Even though you quoted me, I don't really think you understood what I said. Either that or you're trying to make a point that I do not understand. Could you please re-read my comment, then clarify?
posted by zarq at 11:30 AM on November 30, 2011 [1 favorite]


kafziel: " That wouldn't explain why searching for the specific name and address of a place fails to get results only when that place is an abortion provider."

If Siri is relying on Yelp data, then does searching Yelp for that same specific name and address yield a result? If not, then Siri is not at fault here.
posted by zarq at 11:34 AM on November 30, 2011


Point of clarification: While Apple does technically sell its devices wholesale to retailers like Best Buy, the “wholesale” price is a very, very large percentage of the retail price – much higher than for pretty much any other product those stores would ever consider carrying.
posted by Holy Zarquon's Singing Fish at 11:38 AM on November 30, 2011 [1 favorite]


Even if this is a big misunderstanding fueled in part by Yelp and the fact that abortion clinics don't advertise themselves as such, it's still a frightening oversight on the part of apple. Drugstores in Yelp don't tag themselves as "condom distributers" but Siri can still tell you to go to one to get condoms, so there's obviously some sort of programming that makes that happen. It should also work for things like abortion and other birth control.

This frustrates me, not because it's about Apple, but because it's so much like #amazonfail. The problem isn't necessarily malice or even topical oversight: it's about people not grasping the dangers inherent in relying on big, crowdsourced data sets with spotty metadata.

My grumbling isn't an attempt to dismiss this particular search keyword as unimportant. Rather, it's an attempt to point at the underlying problem, which is much bigger, much more sweeping, and much less emotionally satisfying to rant about.
posted by verb at 11:39 AM on November 30, 2011


Weeping_angel: "Drugstores in Yelp don't tag themselves as "condom distributers" but Siri can still tell you to go to one to get condoms, so there's obviously some sort of programming that makes that happen. It should also work for things like abortion and other birth control."

Yes, but most if not all drug stores sell condoms. So listing them as a place to purchase condoms is probably a safe bet. By contrast, not all health / medical clinics offer birth control or abortion services. Unless a larger profile is available to Siri which explicitly says "supplies abortion / birth control services," it probably wouldn't be able to extrapolate and offer that clinic as an option.
posted by zarq at 11:40 AM on November 30, 2011


For some reason it wouldn't load for me and I was just getting a continuous loading circle thing from Tumblr even though Firefox said it was done. So I thought maybe this was some sort of weird Zombocom type joke about not opening the page. I then saw the comments and after the 4th or 5th Refresh it finally loads.

After having seen it now, I'm annoyed.
posted by Deflagro at 12:08 PM on November 30, 2011


From the app store review guidelines:

18. Pornography

* Apps containing pornographic material, defined by Webster's Dictionary as "explicit descriptions or displays of sexual organs or activities intended to stimulate erotic rather than aesthetic or emotional feelings", will be rejected

Emphasis mine.

Again, there are apps about sex positions and "Sex Strip 18+". Apple seems to be hiding behind some porn = hardcore definition to claim both the PR moral high-ground and also make money off of soft-core, or at the very least racy apps.

This is the part where I point out that "censorship" is something only the government can do

Isn't it more like "The First Amendment only applies to the government"?
posted by smackfu at 2:18 PM on November 30


Webster's Dictionary definition of "censor":

Definition of CENSOR
transitive verb
: to examine in order to suppress or delete anything considered objectionable ; also : to suppress or delete as objectionable

In First Amendment law your interpretation is correct: censorship applies to the State, but that is only because your First Amendment rights exist only with respect to the State. A private entity cannot censor your free speech because you have no right of free speech when it comes to private entities.

But in this context, where Apple is the central authority controlling what can and cannot be accessed using its software, platforms, and devices, the word is wholly appropriate.

posted by Pastabagel at 12:21 PM on November 30, 2011 [2 favorites]


Unless you jailbreak.
posted by Blazecock Pileon at 12:24 PM on November 30, 2011


Again, there are apps about sex positions and "Sex Strip 18+".

Thanks for clarifying. I apologize for my insult. I believe the practical requirements are that all apps have to be "PG-13" and no nudity or simulated sex is allowed.

To be fair, I don't have an iOS device and I haven't taken a look at the boobie and sex instruction apps.

Apple seems to be hiding behind some porn = hardcore definition to claim both the PR moral high-ground and also make money off of soft-core, or at the very least racy apps.

Definitely. iTunes has no problem with culturally acceptable soft-core porn/erotica.

I would like to hear the iTunes Store's explanation of how that app passes the "activities intended to stimulate erotic rather than aesthetic or emotional feelings" prohibition. And then I would laugh and laugh.
posted by mrgrimm at 12:32 PM on November 30, 2011


Well, one could argue that pro-choice people do not have a segment of their population that show up outside your office and shoot you.

Also, if you used this Siri more often, you wouldn't need THAT Siri for abortion advice.

(Link's a wee bit NSFW...)
posted by Samizdata at 12:57 PM on November 30, 2011


This is effectively censorship of free speech, but at a corporate rather than government level.

It's a phone. They can do what they want with it. You don't have to buy it. It's not a free speech issue.

hey jimmy the fish i was going to get mad about you saying this but on further reflection i am actually cool with it!

in fact i want you to keep using this argument, like anywhere else you see this story i want you to post what you just said here. it convinced me so you know it's a good argument.

also may i suggest you use it in relation to websites like facebook pulling OWS posts or whatever? just be sure to spread it around. thanks!
posted by This, of course, alludes to you at 1:22 PM on November 30, 2011


Also, if you used this Siri more often, you wouldn't need THAT Siri for abortion advice.

You might if you were raped.
posted by mrgrimm at 1:50 PM on November 30, 2011 [1 favorite]


You might if you were raped.

Really! Is that so! Yes, I think I heard that somewhere.
posted by kafziel at 2:00 PM on November 30, 2011


I keep telling Dr. Sbaitso that I was raped, and he just keeps asking how I feel about it.

Occupy Creative Labs!
posted by Threeway Handshake at 2:10 PM on November 30, 2011 [5 favorites]


Because rape is such a horrible act of violence — perhaps the worst violation of another human being — I look forward to seeing what happens when we find out that what's behind this was an innocuous oversight all along, and not the malicious act that certain specific people keep making it out to be. I look forward to the stream of apologetic retractions coming, as they should, from those same individuals. I have faith in the intellectual and moral honesty of those people, to do the right thing when that moment comes.
posted by Blazecock Pileon at 2:14 PM on November 30, 2011


Again, what "oversight" leads to mboszko's results?
posted by kafziel at 2:28 PM on November 30, 2011


If it's a Yelp-related issue, I wonder if it might be related to keyword frequency in the reviews or comments. I can see someone reviewing Walgreens with "saved my night with a quick stop off for condoms, high-five!", whereas a review for a clinic might be "caring staff, very helpful at a difficult time" or what have you, rather than "yay, best abortions in town!!".
posted by Iteki at 2:35 PM on November 30, 2011


This is insane. This is so obviously a data issue.

Again, what "oversight" leads to mboszko's results?

As I've mentioned above, this is due to the algorithm they use. I'd be willing to lay down a $5 bet with anyone that the algorithm they use REQUIRES the word "abortion" in the REVIEWS (and that it may be weighted heavily in the ranking). Why does that specific algorithm make sense? Abortion is an... uncomfortable topic, to say the least. It's inconceivable that someone goes to Yelp, pulls up Planned Parenthood, and writes "I had a very good abortion at this abortion clinic, they were friendly, informative, and blah blah blah". NOBODY DOES THIS. To do so would be insane.

If you require reviews to contain your search keyword, then the only places whose reviews HAVE the words "abortion" and "abortion clinic" would be PRO-LIFE PLACES, because people write things in reviews of pro-life clinics like "This place is a scam. It's not an abortion clinic. You can't get an abortion here".

So the reason that no results come up for "abortion clinic" is that people are embarrassed to talk about their abortions, which is interesting because it's an issue of self-censorship on the data-collection side.

I would suspect that both Yelp and Apple will be cagey as hell when responding to any questions about the ranking algorithm, and they probably won't tell you this is why, for the same reason that Google doesn't go into specifics about how it ranks results (beyond the general theory): they're afraid it may be gamed.
posted by amuseDetachment at 2:37 PM on November 30, 2011 [4 favorites]
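
A toy illustration of the failure mode being hypothesized here: if a location search requires the query term to appear in review text, the only listings that match "abortion" may be the ones whose reviewers are warning people that no abortions are offered there. This is pure speculation -- neither Siri's nor Yelp's ranking code is public -- and every listing and review below is made up.

    # Speculative sketch of a naive review-keyword search. Invented data: the real
    # provider's reviewers never use the word "abortion", while reviewers of a
    # crisis pregnancy center use it to warn other people away.

    LISTINGS = {
        "Planned Parenthood": [
            "Caring staff, very helpful at a difficult time.",
        ],
        "Hope Pregnancy Center": [
            "This place is a scam. It is not an abortion clinic; you cannot get an abortion here.",
        ],
        "Walgreens": [
            "Saved my night with a quick stop for condoms!",
        ],
    }

    def search_by_review_keyword(term):
        term = term.lower()
        return [name for name, reviews in LISTINGS.items()
                if any(term in review.lower() for review in reviews)]

    print(search_by_review_keyword("abortion"))   # ['Hope Pregnancy Center'] -- exactly the wrong answer
    print(search_by_review_keyword("condoms"))    # ['Walgreens'] -- looks fine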


How does "I need a cremation" work? That's a bit of a euphemism driven branch too. Has anyone checked to see what they get for "need a termination" or "want to terminate a pregnancy"?
posted by Iteki at 2:37 PM on November 30, 2011


"Because rape is such a horrible act of violence — perhaps the worst violation of another human being — I look forward to seeing what happens when we find out that what's behind this was an innocuous oversight, all along, and not the malicious act that certain specific people repeatedly keep making it out to be."

I have no idea what you are trying to say here. Also, not many people (if any) are making this out to be an intentionally malicious move by Apple solely for the purpose of denying women access to birth control or abortions.

Regardless of what caused the problem with their search tool, it's a serious problem. Nobody (outside of the Siri developers) knows why it's happening, but it's happening.

If you ask Siri "How can I build a bomb?" it at least offers to search the web for you.

And that. I still have a question no one has answered yet:

When else does Siri not provide a "Search the Web" button when it returns no search results?
posted by mrgrimm at 2:40 PM on November 30, 2011 [1 favorite]


I think it's been mentioned that Siri does not provide that button for location searches?
posted by Holy Zarquon's Singing Fish at 2:43 PM on November 30, 2011


Admit it, the only reasonable explanation is that Apple is run by Pope Ratzinger. -- Blazecock Pileon

Just out of curiosity, do you ever get tired of defending Apple? Even I said this was probably inadvertent, and that it could have happened before Apple bought Siri. I said the Carrier IQ thing on Android was bullshit. I just don't understand why you would spend so much time defending a company in any and all circumstances.

Even though you quoted me, I don't really think you understood what I said. Either that or you're trying to make a point that I do not understand. Could you please re-read my comment, then clarify?

You said that Apple wasn't as focused on profits as other companies, because they don't just sell iPhones wholesale. My point was that, yeah, they actually do sell iPhones wholesale, and it's not at all clear that wholesaling iPhones is more profitable than selling them in Apple Stores.

These guys are the most valuable company in the world by market cap, in part because they make a much higher profit margin. So it doesn't really make that much sense to say that they don't care about profits.

I do think we are owed an explanation and I'm sure it will be interesting (from a technical standpoint) why this happened.

Also Pastabagel is an idiot.
posted by delmoi at 2:47 PM on November 30, 2011 [1 favorite]


Here's a thought exercise: what would you do if it were purely an algorithm issue? Would you manually go through every single abortion clinic in the United States and add it? That isn't a trivial task -- you'd have to check the address and phone number manually, and do so every single year to verify the data isn't stale. That is a serious pain in the ass, but not as big a pain in the ass as when people complain that their favorite pregnancy service isn't listed (whether for pro-life/pro-choice reasons, because they work there, or because they just like one over the other). This is a terrible solution. Doing it once means you'll probably end up manually doing it for other controversial topics too; you can't hide behind the algorithm anymore. What a waste of time. You also couldn't just deliver whatever results Yelp gives you for abortion clinics, because people trust Siri more than Yelp (yes, human-computer interaction psychology comes into play here), so they will not be as skeptical of Siri results. This is why Apple went with either giving results with extreme confidence or giving no results at all. The only other way to be extremely confident is manual human editing from Apple or Yelp, wholesale, nationwide, for abortion services. That goes against the grain of the way both companies work; they're not the Yellow Pages and have no intention of becoming one, much as you wouldn't expect Google to hire a human to handpick porno links for you.

What would I do? I'd put some work into improving the algorithm to filter out negative reviews when searching for keywords, meaning that when people search for "abortion" the algorithm won't match the phrase "this isn't an abortion clinic" in reviews of a pro-life clinic. It's socially obvious that the only people who would write that would leave 1-star reviews, so you can just exclude 1-star reviews from text search (while keeping 1-star ratings for the ranking algorithm). Ironically, this would reduce the number of results when you search for "abortion clinics," but at least it won't deliver bad results, and as I said above, false positives are worse than false negatives (bad results are worse than no results). This is substantially easier than doing n-gram analysis to look for negation like "not an abortion clinic" or "scam," because the data set isn't robust enough to train that way. Just dropping 1-star reviews from text search will probably give you at least 90% similar results anyway, so go for that -- no increased computation time and similar results. The end result still wouldn't solve the problem of looking for abortion clinics on the algorithm side, because the problem is with the data set itself. I think you aren't going to get the results you want by complaining to Apple or Yelp. If I were in their shoes I wouldn't manually add it no matter how much you complained; instead they'll probably just issue some BS press release about algorithms. Google makes those press releases all the time.

Also, to make it even clearer: if you were, hypothetically, a pro-choice advocate and wanted to promote pro-choice clinics, I'm pretty certain you could write one 4- or 5-star Yelp review per day for abortion clinics in your area that includes the words "abortion" and "abortion clinic" (no need for the S at the end; any half-decent algorithm does stemming), and they would eventually show up in Siri. Don't write reviews outside your area or more than one per day (geo-IP verification and spam detection would be something I'd be concerned about). If you were really smart you'd organize your friends to use their real Yelp accounts to write only one or two reviews each, to spread out the work, so that Yelp's spam-detection algorithms don't notice. The way the Siri algorithm works is pretty obvious; you can likely game it into doing what you want, easy peasy.
posted by amuseDetachment at 3:20 PM on November 30, 2011
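
A minimal sketch, in Python, of the rating filter described in the comment above: drop 1-star reviews from the keyword-matching step while still counting every rating for ranking. The review records, business names, and function names below are made up for illustration; nothing here is Apple's or Yelp's actual code.

    # Hypothetical review records: (business, star rating, review text)
    reviews = [
        ("Planned Parenthood", 4, "Staff were kind and professional about my abortion."),
        ("Hope Pregnancy Center", 1, "Warning: this isn't an abortion clinic, they just lecture you."),
        ("Hope Pregnancy Center", 5, "Wonderful counselors, very supportive."),
    ]

    def keyword_matches(query, reviews, min_stars=2):
        """Match a query against review text, skipping 1-star reviews so that
        angry "this is NOT an abortion clinic" reviews don't create false hits."""
        q = query.lower()
        return {biz for biz, stars, text in reviews
                if stars >= min_stars and q in text.lower()}

    def average_rating(business, reviews):
        """Ranking still sees every rating, including the 1-star ones."""
        stars = [s for biz, s, _ in reviews if biz == business]
        return sum(stars) / len(stars)

    hits = keyword_matches("abortion", reviews)
    print(hits)                                             # {'Planned Parenthood'}
    print({b: average_rating(b, reviews) for b in hits})    # {'Planned Parenthood': 4.0}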


I'd be willing to lay down a $5 bet with anyone that the algorithm they use REQUIRES the word "abortion" in the REVIEWS (and may be a heavy weight in the weighting for ranking). Why does that specific algorithm make sense? Abortion is an... uncomfortable topic, to say the least. It's inconceivable that someone goes to Yelp, searches for "Planned Parenthood" and goes "I had a very good abortion at this abortion clinic, they were friendly, informative, and blah blah blah". NOBODY DOES THIS. To do so would be insane. [...] So the reason that no results come up from "Abortion Clinic" is because people are embarrassed to talk about their abortion, which is interesting because this is an issue with self-censorship on the data collection side.

Pittsburgh's Planned Parenthood is in Yelp. It has a four-star review that has the word "abortion" in it. It doesn't come up in Siri. "Abortion" is also in the reviews for Planned Parenthood in New York and in Washington, DC, two more areas where Siri won't seem to show Planned Parenthood. In short, it would have taken you one simple search to see that it is not insane to use the word "abortion" in a review for an abortion provider, because people actually do so.

Besides, we're talking about a program that knows you can get condoms and Viagra at a drug store, knows you can get marijuana-related items at a head shop, knows you can see "boobs" at a strip club, knows you can ditch a dead body in a foundry, knows that the meaning of life is 42, and knows that you can take care of problems like "Siri, I'm horny" using an escort service. Given this, it seems ridiculously unlikely that Siri knows only what it gets from keywords in Yelp reviews. So we're down to the idea that certain word-associations have been programmed in, and others haven't.

I don't think this necessarily indicates a deliberate conspiracy to leave reproductive services out of Siri's functionality, but I do believe it's an example of a society which tends to think of some things, and doesn't tend to think of others. The fact that Siri doesn't know where you can get birth control is a software failure, pure and simple: if this is a "data issue" then it should have been resolved using a special case, just as the Viagra/porn/weed/sex/meaning of life cases probably were.

But it wasn't, most likely because somehow nobody ever thought to ask the goddamn thing about birth control.
posted by vorfeed at 3:22 PM on November 30, 2011 [8 favorites]


vorfeed: It has a SINGLE REVIEW. One out of one review is a terrible way to work an algorithm. Most likely, anything with fewer than five reviews doesn't return any results at all, if any half-decent programmer was involved.

When ranking, the standard is to use the lower bound of a Wilson Score Interval. The gist of it is that you add up all the star ratings out of the total population of n reviews and calculate the WSI. The standard is then to filter out anything below a specific number; for example, any WSI lower bound below 0.2 gets filtered out. One four-star review out of one would give you a WSI lower bound of ~0.107 (with a ci of 0.05 -- I'm being really generous here, I like a ci of 0.025). That is way too low a lower bound, no matter what the review says. I don't care if you're looking for an abortion clinic or Chinese food, you don't include results that low. It's probably filtering it out based on this low confidence, not on text search.

The standard is to use the lower bound of a WSI for any ranking; you can tweak it slightly, but this is what everyone who knows what they're doing uses. No exceptions. The Planned Parenthood listing is ranked as low as 100 1-star reviews (in fact, it is algorithmically ranked worse than 100 1-star reviews by default) under a lower-bound WSI model.
posted by amuseDetachment at 3:32 PM on November 30, 2011
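
For the curious, here is a minimal sketch of the lower-bound Wilson score calculation described above, with the 0.2 cut-off from the comment used as a hypothetical filtering threshold. The function name and the threshold are illustrative assumptions, not anything known about Siri's internals.

    import math

    def wilson_lower_bound(mean_stars, n, z=1.96):
        """Lower bound of the Wilson score interval, treating the average star
        rating as a proportion (stars / 5). z=1.96 is the 95% level; z ~= 2.24
        gives the 97.5% level the commenter says they prefer."""
        if n == 0:
            return 0.0
        p = mean_stars / 5.0
        denom = 1 + z * z / n
        centre = p + z * z / (2 * n)
        margin = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n))
        return (centre - margin) / denom

    THRESHOLD = 0.2   # hypothetical cut-off: listings scoring below this never show up

    for stars, n in ((4, 1), (4, 100)):
        score = wilson_lower_bound(stars, n)
        print(n, round(score, 3), "kept" if score >= THRESHOLD else "filtered out")
    # one 4-star review   -> ~0.133, filtered out
    # 100 4-star reviews  -> ~0.711, kept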


vorfeed: It has a SINGLE REVIEW. One out of one review is a terrible way to work an algorithm. Most likely, anything with fewer than five reviews doesn't return any results at all, if any half-decent programmer was involved.

New York's Planned Parenthood has TWENTY REVIEWS. And frankly, if "one out of one review is a terrible way to work an algorithm", then Siri should have constant problems finding businesses in general, because many don't have more than one review on Yelp.

Also, your nonsense about how difficult it would be to fix the algorithm is just that: nonsense. Even assuming that the word "abortion" is somehow never associated with abortion providers -- and as I just pointed out, that's clearly false -- ensuring that a search for "abortion" returns something useful is as simple as adding a special case which returns results for "Planned Parenthood" along with the rest. This seems to be what they're doing for all sorts of other things (dead bodies at a foundry -- really! Is that so?), so I'm not sure why you think it'd be impossibly difficult here.

It might be politically difficult, mind you, but that brings us back around to the first measure...
posted by vorfeed at 3:44 PM on November 30, 2011 [2 favorites]


@delmoi

i don't think pastabagel is an idiot, i think they have some weird ideas and some good ones

having weird ideas doesn't make you an idiot
posted by This, of course, alludes to you at 3:48 PM on November 30, 2011


New York's Planned Parenthood has TWENTY REVIEWS.
It has twenty reviews, so I assume Siri would deliver results when you search for "Planned Parenthood", but when you do a search for "abortion", Siri will interpret that as a search for "abortion clinics". None of the Yelp reviews includes the phrase "abortion clinics". Obviously, if they had to hand-change something, it'd be to remove the "abortion clinics" cluster and have it merge with the "abortion" LSI cluster.

For those of you who want to play around with a WSI calculator to get a feel for what I'm talking about, use Wolfram Alpha (the algorithm I used gives slightly different results, but they'll be generally the same). Divide the average star rating by 5 to get the sample proportion and use the first, lower number in the "from x to y" results. So for example, a single 1-star review would be a confidence level of 0.95, a sample size of 1, and a sample proportion of 0.2 (if it's 100 1-star reviews, it'd be a sample size of 100).

Calm down. Is there some kind of reverse Turing Test where you expect the machine to act like a person, and when it doesn't, you blame programmers with malicious intentions instead of the machine? Because Siri would ace that one here.
posted by amuseDetachment at 3:50 PM on November 30, 2011 [2 favorites]


To get the best feel for what I'm talking about, enter a sample size of 1 (one review) and a sample proportion of 0.8 (4 out of 5 stars). You get 0.133 for 4 out of 5 stars. (My example numbers in the post above accidentally used my preferred confidence level of 0.975.)

If you try the same rating of 4 out of 5 stars with 100 reviews you get a lower bound confidence of 0.7117.

If you try 100 reviews with a sample proportion of 0.2 (1 out of 5 stars), you get a lower bound confidence of 0.133.

This means any standard lower-bound WSI algorithm will be about as confident in one hundred 1-star reviews as in a single 4-star review.

It's logical to not include either.

So I'm saying you're seeing Siri behave the way it does because it appears to default to looking for "abortion clinics" instead of "abortion" and has a threshold for confidence. This is some simple algorithm reverse engineering, and although it's possible that I'm wrong, I'm pretty damn confident that those reasons are extremely close to what's going on, as someone who's done this kind of stuff before.
posted by amuseDetachment at 4:05 PM on November 30, 2011
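
If you want to check those three numbers without Wolfram Alpha, the same lower-bound formula in a few lines of Python reproduces them; at the 95% level they come out to roughly 0.133, 0.711 and 0.133 (small differences from the quoted 0.7117 come down to which interval the calculator uses).

    import math

    def wilson_lower(p, n, z=1.96):
        denom = 1 + z * z / n
        centre = p + z * z / (2 * n)
        margin = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n))
        return (centre - margin) / denom

    print(wilson_lower(0.8, 1))     # one 4-star review          -> ~0.133
    print(wilson_lower(0.8, 100))   # one hundred 4-star reviews -> ~0.711
    print(wilson_lower(0.2, 100))   # one hundred 1-star reviews -> ~0.133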


You seem to be missing my point. Not knowing how to reply when people ask for birth control or abortion is unacceptable. It does not matter whether this is a data problem, an algorithm problem, or a problem with magical space elves sleeping in spare bays in the server -- it needs to be fixed, and it should have been fixed long before this program hit the market. The fact that it was never noticed and addressed does not require "malicious intentions"... but I think it does suggest a cultural blind spot.

On preview, why would the algorithm require that the exact search phrase be in the reviews? Do people always post reviews with phrases like "Great strip joint, I really enjoyed the BOOBS!" "I went to the head shop to pick up a pipe so I could smoke MARIJUANA!" "It's always great to HIDE A DEAD BODY in a foundry!" No? Then if Siri works that way and only that way, why isn't it failing constantly, all the time?

You don't get head shops when you search Yelp for "marijuana" in New York. You don't get just strip joints when you search it for "boobs", either. In fact, both searches return mostly other results, things like lingerie stores and crappy hotels. So if this is how Siri works -- via Yelp reviews weighted by WSI -- how is it returning apropos results for these keywords?
posted by vorfeed at 4:11 PM on November 30, 2011 [1 favorite]


If this is a machine-learning issue, which it could plausibly be, it does indicate something: that perhaps the Siri team needs a more diverse set of systems analysts/software architects going over use cases.
posted by nonreflectiveobject at 4:11 PM on November 30, 2011


vorfeed: Machine learning isn't magic. It doesn't magically figure out what you want; this isn't some kind of Asimovian artificial-intelligence robot that can implicitly understand meaning.

The reason it delivers supposedly clever results when looking for marijuana is because it associates "marijuana" with "head shop". How does it deliver that "marijuana"="head shop" association? Because people probably left reviews in Yelp that had the word "marijuana" in a lot of head shops. So when latent semantic indexing over the social data corpus shows a strong correlation, it builds a STRONG association between "marijuana" and "head shop". Simple.

The same can be applied towards "boobs" and "escort services" and things like that.

To be clear, the standard is to use WSI for ranking results, so they probably have a hard floor on whether to include those locations in their data set at all. And even if they're in the data set it might not be relevant because the standard is to use LSI for topic/keyword clustering.
posted by amuseDetachment at 4:18 PM on November 30, 2011
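
A toy illustration of the LSI mechanism being described: build TF-IDF vectors over a made-up four-review corpus, reduce with truncated SVD, and compare terms in the resulting latent space. This uses scikit-learn purely as a stand-in; the corpus and the claim of a "marijuana"/"head shop" association are illustrative assumptions, not Apple's or Yelp's pipeline.

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.decomposition import TruncatedSVD
    from sklearn.metrics.pairwise import cosine_similarity

    reviews = [
        "great head shop, nice glass pipes and marijuana paraphernalia",
        "this head shop has cheap pipes and friendly staff",
        "best chinese restaurant in town, the dumplings are amazing",
        "solid chinese restaurant, huge menu and fast service",
    ]

    vec = TfidfVectorizer(stop_words="english")
    X = vec.fit_transform(reviews)            # documents x terms
    lsi = TruncatedSVD(n_components=2, random_state=0)
    lsi.fit(X)
    term_vectors = lsi.components_.T          # each term's position in the latent space
    idx = {t: i for i, t in enumerate(vec.get_feature_names_out())}

    # "marijuana" never appears in a restaurant review, so it should land much
    # closer to head-shop vocabulary ("pipes") than to restaurant vocabulary
    # ("dumplings"), purely from co-occurrence statistics.
    for other in ("pipes", "dumplings"):
        sim = cosine_similarity(term_vectors[[idx["marijuana"]]],
                                term_vectors[[idx[other]]])[0, 0]
        print(f"marijuana ~ {other}: {sim:.2f}")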


nonreflectiveobject: It's interesting; I'm pretty certain Apple's information scientists and statisticians already know about this class of problem. It's one of the core problems when doing this kind of thing, much like how an email anti-spam algorithm has to balance not filtering enough spam against blocking real email from friends. The same core balance shows up in cluster count selection: with too few topic clusters the results get too general, and with too many, one real data cluster gets falsely separated into two. E.g. if you made it too general it'd look for all "restaurants" when you searched for "chinese restaurants"; if you set too many clusters it'd believe that "abortion clinics" and "abortion" are separate topics. It's not surprising that one would favor more topic clusters, because people can be awfully specific in what they want. This isn't an easily solved problem. You see the same issue in cluster count selection in other clustering algorithms such as K-means (which I'm much more familiar with). This exact problem of too few topics vs. falsely separated clusters is especially visible in K-means-derived systems -- think Netflix or Amazon recommendations.
posted by amuseDetachment at 4:35 PM on November 30, 2011
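
A toy sketch of that cluster-count trade-off, again with a made-up corpus: run K-means at a few different values of k and watch how the groupings merge or split. The documents and the choices of k below are illustrative assumptions only.

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.cluster import KMeans

    docs = [
        "chinese restaurant with great noodles",
        "chinese restaurant, good dumplings",
        "italian restaurant, great pasta",
        "italian restaurant with good pizza",
        "chinese supermarket, fresh produce",
        "chinese supermarket with a big seafood counter",
    ]
    X = TfidfVectorizer(stop_words="english").fit_transform(docs)

    # Too few clusters tends to lump distinct business types together; too many
    # tends to split near-identical phrasings apart. Eyeball how labels change with k.
    for k in (2, 3, 5):
        labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
        print(k, list(labels))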


Just to add to the pileon, the tumblr includes Siri not being able to find Allegheny Reproductive Health Center when given the search term "Allegheny Reproductive Health Center." Spelled properly. The business is on Yelp.
posted by Holy Zarquon's Singing Fish at 4:45 PM on November 30, 2011 [2 favorites]


The reason it delivers supposedly clever results when looking for marijuana is because it associates "marijuana" with "head shop". How does it deliver that "marijuana"="head shop" association? Because people probably left reviews in Yelp that had the word "marijuana" in a lot of head shops.

Again, there aren't a lot of reviews in Yelp for head shops which mention marijuana... not as many as mention abortion for Planned Parenthood, at any rate. Searching for "marijuana" in NYC gets you unrelated results; searching for "head shop marijuana" does, too, with the exception of just one (which mentions that you should go there if you "don't smoke marijuana").

It should be pretty obvious that people didn't leave a lot of reviews about hiding dead bodies in a foundry, either.

I'm not claiming that machine-learning is magic, I'm claiming that it is 100% obvious that Siri does not function solely on machine-learning. Some of its responses are pre-programmed.
posted by vorfeed at 5:00 PM on November 30, 2011


Also, the huge oversight I don’t understand at all:

The word "rape" does not have a lot of ambiguous meanings. You would think there would be a list of certain words that the program would know and not make jokes about. What sentence would you say with the word rape that would justify the response "probably not important, just some nonsense"? At least if certain words are in a command you should get the "I don’t understand" response.
posted by bongo_x at 5:13 PM on November 30, 2011


vorfeed: LSI and other topic/document classifiers don't build correlations from single documents. That's missing the forest for the trees.

You need a sufficient number of documents (Yelp reviews) in the whole system to build classification associations. So for example, it's possible that a lot of head shops in California have the word "marijuana" in their reviews. The LSI will build a correlation between "marijuana" and "head shop" and cluster them in the same space with high correlation (basically the system thinks they're synonyms; it doesn't know that marijuana and a head shop are actually different things and that you can't actually buy marijuana in a head shop). So even if no head shop reviews in New York contain the word marijuana, it will build an association, because the association exists across the system as a whole.

That is why I recommended including the word "abortion" and the phrase "abortion clinic" in separate sentences in Yelp reviews in your own city. Once they're classified as the same topic in your city, that will likely carry over nationwide. You'll likely train the system to understand that "abortion clinics" and "abortion" are the same topic whenever they do a batch job to update the data. In fact, if I were Apple, the only real way to solve this problem is to append the phrase "abortion clinic" to any review that contains the word "abortion" when training the data set. Problem solved. Messing with the algorithms will screw things up. It might think "escort services" and "strip clubs" are the same thing, for example, because they both contain the word "boobs" in their reviews.

The problem, I believe, is that certain documents contain the 2-gram "abortion clinic" but do not also contain the 1-gram "abortion" (and vice versa). So the trainer is treating those as two separate topics.

When building simple machine learning tools, training and classification are two separate steps. For some reason, it's assuming "abortion" and "abortion clinic" are the same thing in the classification step, but not in the training step. This is a difficult problem to solve. You see the same problem with Google searches: Google sometimes goes "did you mean... xxx" but doesn't understand that those topics are synonyms inside the documents themselves. In effect, Siri is going "did you mean... abortion clinics" and searching for that, when few documents in your area contain the phrase "abortion clinics". The likely cause is someone searching FIRST for abortion clinics, getting no results, then searching for abortion; the system will then train the classifier to just search for abortion clinics instead. This is the basics of how Google's "Did you mean..." works: the more searching everyone does, the smarter it gets at delivering what you want.

I'm not claiming that machine-learning is magic, I'm claiming that it is 100% obvious that Siri does not function solely on machine-learning. Some of its responses are pre-programmed.

I doubt it's pre-programmed the way you think. What actually happens is the engineer looks at the results, doesn't like them, and then changes them. I doubt they looked at abortion first; they probably looked at things like Thai food or microbrews first and liked those results. Then they test for edge cases, like the fact that when people search for "chinese" they mean "chinese restaurants" (combine the classification), but those are distinctly differentiated from "chinese supermarkets" (separate cluster). It's certainly possible that one can bias results against abortion clinics, but it's far far FAR easier for it to be an oversight. You have to understand that these things are generated automatically and building any exceptions is HARD. If you want to train the dataset to give you different results, you often have to change the data itself (which is why I recommended straight up inserting "abortion clinic" into reviews, whether users do it with their own reviews, or Apple does it by secretly adding the words before training). I would expect them to test something like "sushi" nationwide, but to expect them to test whether "abortion clinic" gives good results in every major city is a bit too much to expect of testers, in my opinion. There are way too many topics and way too many cities.

Again, I'd like to think this is a case of a reverse Turing Test, where you expect the machine to think like a human when it actually acts like a machine.

Find a place whose reviews contain the phrase "abortion clinic" several times and that has more than 10 reviews, and I will take everything I said back. I'm also pretty sure you won't find anything with little to no reviews on Yelp showing up for any popular topic (let's say any topic with more than 5 matching businesses in every city). I'm confident of this even though I don't own an iPhone 4S (I've only used a friend's before), because the training methods are pretty much the algorithms every single person who's done machine learning uses -- doing anything else will give you garbage results.
posted by amuseDetachment at 5:49 PM on November 30, 2011 [1 favorite]
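
A minimal sketch of the data-side fix suggested in the comment above: before (re)training, append the phrase "abortion clinic" to any review that mentions "abortion" but not the 2-gram, so the trainer stops treating the two as separate topics. The sample reviews and the train_topic_model() call at the end are hypothetical placeholders, not a known part of any real pipeline.

    import re

    def augment(reviews):
        """Append the 2-gram to any review that has the 1-gram but not the 2-gram,
        so both end up in the same topic cluster after training."""
        out = []
        for text in reviews:
            if re.search(r"\babortion\b", text, re.IGNORECASE) and \
               not re.search(r"\babortion clinics?\b", text, re.IGNORECASE):
                text = text + " abortion clinic"
            out.append(text)
        return out

    reviews = [
        "They were kind and professional about my abortion.",
        "Great dim sum, huge menu.",
    ]
    print(augment(reviews))
    # A real pipeline would then hand the augmented corpus to its trainer, e.g.:
    # model = train_topic_model(augment(all_reviews))   # hypothetical function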


Siri had no trouble finding 3/5 abortion providers near me. An additional clinic had a name with a nonstandard word which Siri could not correctly identify. So, there is my anecdata.
posted by munchingzombie at 6:11 PM on November 30, 2011


I was finally able to test this with my 4S. Asking for "abortion clinics" or saying "I need an abortion" returns "I didn't find any abortion clinics." However, asking for "Planned Parenthood" returns 4 nearby locations.
posted by brundlefly at 7:22 PM on November 30, 2011


amuseDetachment, how DARE you bring FACTS and NUANCED SPECULATION into this competition to see who can most convincingly react to this new that the SKY is FALLING
posted by DoctorFedora at 7:31 PM on November 30, 2011 [3 favorites]


er, news. Not new.
posted by DoctorFedora at 7:31 PM on November 30, 2011


Apple has responded.
posted by artychoke at 7:37 PM on November 30, 2011 [3 favorites]


I think it's really ridiculous to chalk this up to "well, that's just how it works". Google doesn't have any trouble finding these things, why should apple?

I don't buy the "the Yelp reviews don't include those terms" or "you can't find stuff with just one Yelp review" explanations -- it doesn't make any sense at all to launch a product that way. Siri should be doing more than simply looking at Yelp review plain text and returning highly rated locations. It wouldn't surprise me to learn that someone at Apple, or the original company, sat down and entered a *ton* of hardcoded semantic data. Stuff like

"you can find boobs at a strip club"
"you can find a hammer at a hardware store"
"you can find a hammer at walmart"

It wouldn't take that much time/money to create entries for a few thousand terms.

The weird thing is that Siri does understand that you get an abortion at an abortion clinic; if it were only learning from Yelp reviews, that wouldn't have happened.
posted by delmoi at 7:47 PM on November 30, 2011 [1 favorite]
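
A small sketch of the kind of hand-entered semantic table delmoi is imagining: a plain lookup from a thing you ask for to the category of business that carries it, handed off to an ordinary local search. Every entry and name below is hypothetical; nothing is known about whether Siri actually contains such a table.

    # Purely hypothetical "thing -> category of place" table.
    SEMANTIC_HINTS = {
        "boobs": "strip club",
        "hammer": "hardware store",
        "condoms": "drug store",
        "birth control": "pharmacy",
        "abortion": "abortion clinic",
    }

    def category_for(query):
        """Return the business category to hand to the local search, if any."""
        q = query.lower()
        for term, category in SEMANTIC_HINTS.items():
            if term in q:
                return category
        return None

    print(category_for("where can I buy a hammer"))   # -> "hardware store"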


I doubt it's pre-programmed the way you think.

For the fifth time, Siri has jokes programmed into it. Did Siri machine-learn to associate dead bodies with foundries? Did it machine-learn that the meaning of life is 42? Did it also machine-learn to spit out "Daisy, Daisy" when you ask it to sing?

It is obvious that some of Siri's responses are pre-programmed. I never said this was one of them, but I do think the responses to at least some similar searches are (the "Siri, I'm horny" -> "here are some escorts" thing is clearly an intentional joke). My point is that what we've got is a software product which contains deliberately programmed dick jokes, but can't answer questions about abortion or birth control due to... an oversight. I agree -- that is a good word for this.

It's certainly possible that one can bias results against abortion clinics, but it's far far FAR easier for it to be an oversight.

Yes. This is what I've been claiming from the beginning: that this is an oversight, and that the presence of this particular oversight (and not, say, a lack of response to Viagra or condoms or "Siri, I'm horny") indicates cultural bias. Oversights happen where people aren't looking, and the fact that we're not looking at abortion is pretty much by design, as you yourself admitted when you claimed that it would be "insane" to leave honest reviews about it.
posted by vorfeed at 7:50 PM on November 30, 2011 [8 favorites]


It is obvious that some of Siri's responses are pre-programmed. I never said this was one of them, but I do think the responses to at least some similar searches are (the "Siri, I'm horny" -> "here are some escorts" thing is clearly an intentional joke). My point is that what we've got is a software product which contains deliberately programmed dick jokes, but can't answer questions about abortion or birth control due to... an oversight. I agree -- that is a good word for this.

Alternately, the Siri beta shipped with a pool of hard-coded "funny joke answers" rather than deliberate deep coverage of emergency and health services, and no one at Apple thought that rapes and abortions were funny demo fodder. It's an indictment of the relatively shallow nature of Siri's beta, not necessarily a demonstration of systemic sexism.
posted by verb at 8:06 PM on November 30, 2011


It's an indictment of the relatively shallow nature of Siri's beta, not necessarily a demonstration of systemic sexism.

I'd say it's both, actually.
posted by vorfeed at 8:26 PM on November 30, 2011 [3 favorites]


This is fucking bullshit.
posted by mike3k at 8:29 PM on November 30, 2011


mike3k: "This is fucking bullshit"

What, the false controversy that's been drummed up over people's overestimation of what an as-yet-in-beta AI can do? I have to agree.
posted by DoctorFedora at 8:34 PM on November 30, 2011 [2 favorites]


>> It's an indictment of the relatively shallow nature of Siri's beta, not necessarily a demonstration of systemic sexism.

I'd say it's both, actually.


OK. You win, it's sexism.
posted by verb at 8:40 PM on November 30, 2011 [1 favorite]


As for the Yelp reviews thing: South Hills True Value Home Center in Pittsburgh has zero reviews. Siri will find it for you if you enter "South Hills Hardware", even though that's not quite the name of the business, but it cannot find Allegheny Reproductive Health Center when given the search term "Allegheny Reproductive Health Center".
posted by vorfeed at 8:40 PM on November 30, 2011 [1 favorite]


This has reached insane proportions, so I know I'm just pissing in the ocean here but...I think it's representative of the ENORMOUS "cultural blind spot" that we have toward women's health that it took 200+ posts on this thread for someone to point out that not testing Siri for these types of things to begin with is the problem, not whether this was really on purpose or not. When creating software like this, there are so many goddamn people who pre-market test it, to figure out the limits and see what needs to be tweaked. That abortion providers, birth control, rape never came up in pre-beta testing is...absurd and bizarre, as well as mighty representative of systematic neglect of women's health. It's not Apple's fault per se. It's the fault of the culture we live in.
posted by zinful at 8:41 PM on November 30, 2011 [7 favorites]


Yes, the "a term literally needs to be included in a Yelp review for Siri to find it" theory is simply inconsistent with the facts.

It sounds like Apple will fix this without ever really giving us an explanation, which is too bad, because it would certainly be interesting.
posted by delmoi at 8:45 PM on November 30, 2011 [2 favorites]


I like MGs and old Jaguars--I don't think I defend them, when they're cast as unreliable (they can be!), as passionately as people here defend anything Apple says, does, or builds. It's weird.

The topic being discussed is a fairly serious issue worthy of attention, whether it was brought on by malicious intent or is an oversight, and that many here are so invested in a brand that they cannot acknowledge this is an actual, real world problem is a bit chilling.
posted by maxwelton at 8:46 PM on November 30, 2011 [1 favorite]


I think it's representative of the ENORMOUS "cultural blind spot" that we have toward woman's health that it took almost 200+ posts on this thread for someone to point out that not testing Siri for these types of things to begin with is the problem, not whether this was really on purpose or not.

Except for the fact that Siri does find abortion clinics in some locales and not others. To suggest that it was not tested is an unproven assumption.

That abortion providers, birth control, rape never came up in pre-beta testing is...absurd and bizarre.

Do you have any evidence that those things didn't come up in pre-beta testing? Even in the original blog posts reporting on the issue, commenters from other cities and regions have noted that results differed in their cities. What is being suggested in this thread is that products should not ship until women's health and reproductive services are special-cased.

This is why software developers hate users with an undying passion.
posted by verb at 8:47 PM on November 30, 2011 [2 favorites]


The topic being discussed is a fairly serious issue worthy of attention, whether it was brought on by malicious intent or is an oversight, and that many here are so invested in a brand that they cannot acknowledge this is an actual, real world problem is a bit chilling.

This is the same cycle that we went through with #amazonfail, note for note, post for post. The only difference is that Apple's PR team had a response to it within the day rather than a long weekend later. If this were Microsoft, Android, Amazon, or any other company or organization, I know I would be just as annoyed.

Back then, there were tons of similarly authoritative pronouncements about how a specific class of books being delisted from Amazon.com could only possibly be the result of systemic discrimination, deliberate malice, etc. The same sweeping pronouncements were made about how only books on a topic had been delisted, and when the inconsistency of the claims was demonstrated, it was ignored.

People with actual technical experience who posted actual technical explanations about how it could have happened without any ill intent or even explicit topical blind spots were dismissed as being biased apologists. The problem isn't that people are attacking Apple, it's that they are deliberately refusing to acknowledge the actual problems that will keep occurring, over and over, as we come to depend more and more on big data.
posted by verb at 8:57 PM on November 30, 2011 [3 favorites]


Disclaimer: I may be angrier than usual due to the fact that I've spent the past week and a half putting together detailed func specs and implementation plans for a year long project, and trying to put together a workable schedule for test plans and so on.

Hearing things like "The phone should just call 911 if it hears someone say 'rape'" causes debilitating flashes of rage that are less topical than vocational. I apologize if I've been needlessly dismissive or harsh.
posted by verb at 9:05 PM on November 30, 2011 [2 favorites]


From Apple's response: “These are not intentional omissions meant to offend anyone. It simply means that as we bring Siri from beta to a final product, we find places where we can do better, and we will in the coming weeks.”

I think the whole “it's still in beta” thing is pretty silly and, as noted above, with Google's multiyear betas the term has become all but meaningless. Looking at Apple's homepage right now it's featuring the 4S and mentions Siri prominently. Siri is being featured in Apple's advertising used to sell the phones. What exactly does beta mean in that context?
posted by 6550 at 9:13 PM on November 30, 2011 [1 favorite]


Looking at Apple's homepage right now it's featuring the 4S and mentions Siri prominently

Because I don't expect many to fact-check that statement, it also says that Siri is being "introduced" and, looking at the features page for the 4S, there's a huge label that says "Beta". The idea that Apple is pushing Siri like it is a finished product does not line up with reality, I'm afraid.
posted by Blazecock Pileon at 9:42 PM on November 30, 2011


Back then, there were tons of similarly authoritative pronouncements about how a specific class of books being delisted from Amazon.com could only possibly be the result of systemic discrimination, deliberate malice, etc. The same sweeping pronouncements were made about how only books on a topic had been delisted, and when the inconsistency of the claims were demonstrated it was ignored.

The similarities are pretty uncanny. I wonder how many of the same people who were wrong about the delistings in the original #amazonfail thread showed up to be wrong here in #amazonfailPartDeux?
posted by Blazecock Pileon at 9:53 PM on November 30, 2011


What is being suggested in this thread is that products should not ship until women's health and reproductive services are special-cased.

Yes, well, look at the controversy this has caused. I bet Apple wishes they'd special-cased it.

That said, I'm not saying this one particular thing always has to be special-cased, omg -- what I'm saying is that this should have been caught during testing (and yes, I get that this failure isn't universal, but if your product can't find abortion services in Manhattan or birth control in Pittsburgh, then it really doesn't matter if it can in Peoria). If a special case is what it takes to fix it at that point, so be it. "Where can I get birth control" is something a program like this should be able to handle, and if it can't, then people are going to talk... especially if the things the programmers did think of include directing people to an escort service if they try to proposition the phone.

I'm sure that Apple tested Siri extensively with regards to profanity, to pick a single obvious example. Likewise, the same oversights shouldn't keep leading to the same #insertcompanyherefails long after it should've become obvious that fail-y topics need to be covered in testing.
posted by vorfeed at 10:05 PM on November 30, 2011 [1 favorite]


Or, alternately, I guess they can, and you can just keep telling people that they Just Don't Understand Software. I'm sure one of the two will put a stop to this!
posted by vorfeed at 10:07 PM on November 30, 2011


They are advertising Siri as one of the features to sell the phone, so (and this is a serious question) what does beta mean in that context?

For additional fact-checking there's no mention of beta on the homepage although, yes, there is the "introducing". Siri is also featured in the central and most visible screenshot of the phone.
posted by 6550 at 10:08 PM on November 30, 2011 [1 favorite]


It is obvious that some of Siri's responses are pre-programmed. I never said this was one of them, but I do think the responses to at least some similar searches are (the "Siri, I'm horny" -> "here are some escorts" thing is clearly an intentional joke). My point is that what we've got is a software product which contains deliberately programmed dick jokes, but can't answer questions about abortion or birth control due to... an oversight. I agree -- that is a good word for this.

No. I'm explicitly saying that the relationship between "escorts" and being "horny" is a direct, implicit relationship in the corpus of Yelp reviews (among other data). It was likely not programmed in; the topic correlation would be incredibly strong. That's the fundamental misunderstanding we have here. It's not a joke; the machine learning confidence was just likely high enough that it gave you location data. Now of course, things aren't actually that simple. The way they do it behind the scenes is that it understands you're describing yourself and searches the data set for words that resolve that, so if you said you were hungry, I think it'll recommend restaurants, because a lot of restaurant reviews contained the word hungry.

The only things programmed in explicitly are the jokes which don't actually deliver results, i.e. sing me a song, etc. If it doesn't make any actions or deliver real information, that was programmed in. It seems like all other information retrieval is algorithmically discovered.

This means that there is no one in an office going "hey, we should add a topic for 'sushi'". No one determines these things. They load it up and it finds topics; that's it. They then tweak it when they see bad results. I guarantee you, if you half-assed it and spent less than a week developing this on Apache Mahout, you could get results good enough that it would differentiate between restaurants, pet shops, and liquor stores automatically on your FIRST TRY, with no data beyond the raw Yelp reviews (no user input at ALL -- it'll just say cluster #1 is liquor stores, for example). Once you spend one week writing the baseline, you then spend the next 10 years making it work better with edge cases.

I think we can disagree about whether abortion is important enough to cover during testing. I wouldn't expect people looking for abortions to even hit 0.001 (0.1%) of all searches. There are a million different things people can look for; you can't cover everything. This is not a patriarchal, gendered statement: if you told Siri you had a gunshot wound, I wouldn't expect it to deliver good results either. I'm sure they have reporting on search failures and resolve the biggest hits, and I don't see why abortion services should receive priority when that search fails so rarely. I know it sounds flippant, but again, I don't think it should resolve gunshot wounds either, and that's a problem which requires timely answers. I think you're seeing gendered biases where none exist.
posted by amuseDetachment at 10:17 PM on November 30, 2011 [1 favorite]
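
A toy sketch of the mechanism described above, under the same made-up-corpus assumptions as the earlier examples (and emphatically not Apple's code): a self-describing query like "I'm hungry" is projected into the same latent space as the review corpus, and the category of the best-matching review wins, with no hand-written "hungry means food" rule.

    import numpy as np
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.decomposition import TruncatedSVD
    from sklearn.metrics.pairwise import cosine_similarity

    labeled_reviews = [
        ("restaurant", "came in hungry and left stuffed, great burgers"),
        ("restaurant", "perfect spot when you are hungry late at night"),
        ("liquor store", "good selection of wine and whiskey"),
        ("liquor store", "cheap beer and friendly staff"),
    ]
    categories, texts = zip(*labeled_reviews)

    vec = TfidfVectorizer(stop_words="english")
    X = vec.fit_transform(texts)
    lsi = TruncatedSVD(n_components=2, random_state=0)
    doc_vecs = lsi.fit_transform(X)

    # Project the user's self-description into the same space and pick the
    # category of the closest review.
    query_vec = lsi.transform(vec.transform(["I'm hungry"]))
    best = int(np.argmax(cosine_similarity(query_vec, doc_vecs)[0]))
    print(categories[best])   # expected: "restaurant", since "hungry" co-occurs
                              # only with restaurant reviews in this toy corpus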


They are advertising Siri as one of the features to sell the phone so, and this is a serious question, what does beta mean in that context?

The 'beta' and 'introducing' labels mean: Don't expect it to work perfectly, because it is a first-gen utility, one which still works reasonably well enough, nonetheless, that it is useful and therefore worth displaying.
posted by Blazecock Pileon at 10:36 PM on November 30, 2011


The only things programmed in explicitly are the jokes which don't actually deliver results, i.e. sing me a song, etc. If it doesn't make any actions or deliver real information, that was programmed in. It seems like all other information retrieval is algorithmically discovered.

Among other things, Siri will tell you where a real foundry (or your choice of swamp, reservoir, mine, or dump!) is nearby, where you can presumably really hide a dead body. The idea that this was algorithmically discovered seems unlikely, as does the idea that there are no other information-retrieval easter eggs in a program like this.

I think we can disagree whether abortions are important enough during testing. I wouldn't expect people looking for abortions to even hit 0.001 (0.1%) of all searches. [...] I know it sounds flippant, but again, I don't think it should resolve gunshot wounds either, and that's a problem which requires timely answers. I think you're seeing gendered biases where none exist.

I get where you're coming from, but the question of whether real-world things are "important enough" to warrant a response from software like this does not begin and end with how common they are as search terms. That's why some of them generate huge amounts of negative press and 250+ comment metafilter threads, and some of them don't. As a pragmatist, I'd say that companies can either take this into account as best they can, or keep getting blindsided by these kinds of mistakes -- and that's equally true whether there's any gendered bias in play or not.
posted by vorfeed at 10:49 PM on November 30, 2011 [3 favorites]


I get where you're coming from, but the question of whether real-world things are "important enough" to warrant a response from software like this does not begin and end with how common they are as search terms.

To clarify, I don't think that the question is whether these types of services are important enough to merit a response from a tool like Siri. That is, indeed, a "no-brainer." The question is what specific topics are important enough that they receive explicit pre-beta testing in a variety of geographic locations to ensure consistent nationwide results, rather than fixes during a beta period.


"Where can I get birth control" is something a program like this should be able to handle, and if it can't, then people are going to talk... especially if the things the programmers did think of include directing people to an escort service if they try to proposition the phone.

Profanity and propositioning the software agent are not special cases to ensure that results are found. They are special cases to shortcut the normal search process and return a jokey response. In your profile, you say that you're a software developer -- think for a moment about the test plan for "Special-case a silly response to profanity" versus "Special-case to ensure that Siri always finds local abortion providers."


I'm not saying this one particular thing always has to be special-cased, omg... If a special case is what it takes to fix it at that point, so be it.

Doesn't it seem that those two statements contradict each other? If there's an algorithmic or metadata problem that keeps abortion providers and pharmacies that provide birth control from being found correctly, it's a systemic problem in a large corpus of data, and it probably hints at larger problems. Problems that one might expect would be ironed out during a Beta period, when a larger and broader mix of users in a wide range of geographical areas with a wide range of interests uses the system day-in, day-out.
posted by verb at 6:17 AM on December 1, 2011


The similarities are pretty uncanny. I wonder how many of the same people who were wrong about the delistings in the original #amazonfail thread showed up to be wrong here in #amazonfailPartDeux?

Can you point out a specific example where someone was "wrong" here?

It seems like your definition of "wrong" is "not inherently believing everything Apple PR flacks say."

We ask questions. This software bug (which is what it is, at the least) deserves some questions.

Again, please give some specific examples of wrongness. I have no idea what #amazonfail is.
posted by mrgrimm at 6:44 AM on December 1, 2011 [1 favorite]


mrgrimm: " I have no idea what #amazonfail is."

Previously on Metafilter.

More.
posted by zarq at 7:21 AM on December 1, 2011 [1 favorite]


The problem isn't that people are attacking Apple, it's that they are deliberately refusing to acknowledge the actual problems that will keep occurring, over and over, as we come to depend more and more on big data.

This is where your argument falls apart. The general public don't know how "big data" works, and frankly, never will. Railing against this is counter-productive. It's a fact. The public sees that this is counter to their expectations and isn't really interested in the technical details of how that problem arose. It's important for Apple to say that this was unintentional, but it's equally important that the next software update fix it.

What the public does know, and has come to depend upon, is that certain topics are treated as high priorities by communication utilities. Siri is arguably the next generation of communications (and search). The existing communications infrastructure prioritizes emergency contact information to make it easy to find. Phone books, at least in Canada, always have fire, police and ambulance contact info in the front pages, and often things like poison control, crisis counseling numbers, spill reporting, stray and animal cruelty reporting, and so on. This is a strong, existing expectation of what a communications and search product should do. So strong that if the Siri developers don't consider it when designing the application, they may find themselves legislated into doing so down the line.

I have a great deal of sympathy with the argument that there was a big blindspot in testing. My phonebook has a rape crisis line number prominently displayed in those first few pages. Siri should have done the equivalent.
posted by bonehead at 7:31 AM on December 1, 2011 [2 favorites]


Can you point out a specific example where someone was "wrong" here?

From this thread specifically:

"This looks to be intentionally blocked, which is awful."

"This looks intentional. And if it is, good. We need to make Americans aware that it can indeed happen here."

"It sure seems intentional, and it kind of disgusts me. Shame, apple. Shame."

"I honestly see no possible way for this to have been an oversight, if only because it's so all-encompassing with regards to women's health."

"The reason that Siri doesn't return abortion information is because it was programmed specifically not to, based on some public relations calculus."



It seems like your definition of "wrong" is "not inherently believing everything Apple PR flacks say."

To reiterate, that's precisely what was said over and over during #amazonfail. Anyone who suggested taking a deep breath and studying what happened was accused of being an Amazon fanboy, a corporate boot-licker, or fundamentally insensitive to GLBT issues. Anyone who offered technical explanations for how such a thing could have happened without any deliberate malice was dismissed as an apologist. Anyone who noted the delisting of products NOT matching the 'horrible bias' narrative was ignored.

This is not about whether the problem should be corrected. It's about whether the problem occurred due to malice or sexism. Numerous posters have suggested that the problem is exclusive to women's health, when it is not. Numerous posters have claimed that this could only be due to deliberate design, when numerous aspects of machine-learning systems and large user-contributed data sets provide an equally plausible explanation. We've finally reached a point in the thread where the primary argument appears to be whether Apple's engineering team should have explicitly tested searching for abortion clinics in every major city in the United States before releasing the software.
posted by verb at 7:33 AM on December 1, 2011 [4 favorites]


This is where your argument falls apart. The general public don't know how "big data" works, and frankly, never will. Railing against this is counter-productive. It's a fact. The public sees that this is counter to their expectations and isn't really interested in the technical details of how that problem arose.

The public doesn't understand math, and isn't really interested in the technical details of how federal deficits happen. That doesn't mean that I'll stop explaining to my father-in-law that welfare fraud isn't the problem when he rants about lazy blacks stealing his money.

Explaining to people how problems can actually occur does not preclude wanting the problem fixed.
posted by verb at 7:54 AM on December 1, 2011 [1 favorite]


It seems like your definition of "wrong" is "not inherently believing everything Apple PR flacks say."

To reiterate, that's precisely what was said over and over during #amazonfail. Anyone who suggested taking a deep breath and studying what happened was accused of being an Amazon fanboy, a corporate boot-licker, or fundamentally insensitive to GLBT issues. Anyone who offered technical explanations for how such a thing could have happened without any deliberate malice was dismissed as an apologist.


Well, this is not amazonfail, and nobody's really doing that here. Your first four examples are mostly opinions and subjective perceptions ("looks like," "seems," "i think," "i honestly see no way," etc.), and I'd bet dollars to donuts the last one is Pastabagel (yep), b/c well, that's Pastabagel. She/he just apparently likes to make those bold, unsubstantiated pronouncements from time to time. I thought the tone of the discussion here was pretty balanced. *shrug*

from Apple's (very meager) response: "while it can find a lot, it doesn’t always find what you want"

What other searches fail completely with Siri?

Back then, there were tons of similarly authoritative pronouncements about how a specific class of books being delisted from Amazon.com could only possibly be the result of systemic discrimination, deliberate malice, etc.

Aside from the aforementioned PB, I don't see many people making authoritative statements about how it could *only* be a malicious act. I just don't see it. More examples, please.
posted by mrgrimm at 8:29 AM on December 1, 2011


Well, the CPCs are loving this new development.

Perhaps worth pasting the Christian Newswire press release:

Stanton Healthcare, a life-affirming women's medical center that specializes in unexpected pregnancy care, is thrilled by the recent discovery that Siri does not promote or provide abortion information or referrals.

If you ask Siri "I want an abortion" or "Where can I get an abortion" it will tell you "I didn't find any abortion clinics" and in some states, Siri will provide a list of life-affirming pregnancy resource centers in the area.

Brandi Swindell, Founder and President of Stanton Healthcare, states,

"We applaud Apple iPhone's 4S Siri and are thrilled that Siri does not list or refer to abortion clinics. Numerous lives will be saved as a direct result. Siri is setting the standard for all organizations -- no one should ever refer anyone to get an abortion.

"Early feminist and suffragette leader, Elizabeth Cady Stanton, called abortion an evil and said 'When we consider that women are treated as property, it is degrading to women that we should treat our children as property to be disposed of as we see fit.'

"This reaffirms the truth that women have always been at the forefront in speaking out against the violence of abortion and embracing equality and human rights for all. As a woman I'm delighted that Siri is embracing a position that promotes the dignity of women and upholds human rights in the womb.

"It is my hope that Apple remains steadfast and does not cave under any pressure brought by the abortion industry to start marketing abortion clinics." Swindell states. "This is a huge win for women and a significant step in the right direction."


Unintentional or not, Apple, this shit is what you get when you do shit like that. Personally, because of shit like that ^^^, I can't believe it was intentional on a company level, but I don't think the rogue programmer angle is so ridiculous.
posted by mrgrimm at 8:33 AM on December 1, 2011


> Explaining to people how problems can actually occur does not preclude wanting the problem fixed.

By which I'm sure you mean "fixing this particular problem in this particular Apple application" rather than "fixing the limitations of crowdsourced data" because the latter isn't going to happen today or this week or maybe ever. Case in point, where the crowd that generated the data on a given subject has very mixed feelings and polarized opinions about that subject--e.g. abortion, or reproductive rights in general, or even just human reproduction in general--and consequently doesn't much want to talk about it, is it even conceivable that the data (and hence search results from trawling the data) will not reflect the makeup and biases of the crowd? Or even that it should not?

Certainly sub-populations of one flavor or another will have strong opinions about how the search results should be weighted to reflect the facts as they would prefer them, and they may bring whatever pressure to bear that they can, but it's to be expected that those efforts will tend to be mutually self-cancelling.

Searching crowdsourced data has more in common with opinion polling than anything else, and that is what is going to continue to bite.
posted by jfuller at 8:38 AM on December 1, 2011


from Apple's (very meager) response: "while it can find a lot, it doesn’t always find what you want"

What other searches fail completely with Siri?


This is an example of what I'm talking about. Searches for abortion clinics do not "fail completely." In some cities they fail to find clinics depending on the search phrases used. In other cities they fail to match businesses to street addresses. In other cities they work perfectly. One of the first commenters on the Gizmodo thread about this issue noted that they had the same problem with adoption agencies. They could locate abortion clinics in their city without problems, but Siri couldn't find several local adoption agencies.

Pointing out the fact that the failures are regional, often keyword-based, and apply to other kinds of businesses as well as abortion clinics is not an excuse: it's additional information useful in figuring out what is going on. As with #amazonfail, people are asking lots of questions, but they are ignoring the answers because the answers are boring and technical and don't help stoke the fires of cathartic rage.




"Unintentional or not, Apple, this shit is what you get when you do shit like that. Personally, because of shit like that ^^^, I can't believe it was intentional on a company level, but I don't think the rogue programmer angle is so ridiculous."

That is correct. When you don't meticulously test the location-based business data that your product is using on its backend for all of the hot button topics people could get angry about, technically ignorant people will accuse you of deliberate malice. The ignorance is understandable and totally normal -- I don't expect anyone to care about, say, user feedback ranking algorithms. But if they accuse me of "deliberately manipulating results" on the voting and ratings systems that I build, they're still accusing me of something out of ignorance.
posted by verb at 8:56 AM on December 1, 2011 [4 favorites]


mrgrimm: "Unintentional or not, Apple, this shit is what you get when you do shit like that."

If the program listed every single abortion clinic and Planned Parenthood in the country, we'd be seeing a similar statement from NARAL and other pro-choice groups with equivalent wording. In fact, remove "in the womb" from this sentence: "As a woman I'm delighted that Siri is embracing a position that promotes the dignity of women and upholds human rights in the womb." and it could have easily been lifted directly from any number of pro-choice press releases.

I'm 100% pro-choice. I give to pro-choice charities and others that serve the needs of women (and men) who have been abused or worse, like RAINN. But realistically, people who are passionate about an issue can take anything and twist it for their purposes, and whether I happen to wholeheartedly agree with their goals or not, that's not something that's restricted to a particular political side.

It's not really Apple's responsibility to appease everyone on every political issue. Wingnuts gonna wingnut. The company has already said this was not intentional, they're going to look into it and presumably fix it. If they make a mistake, learn from it and prevent it from happening again, I'm not really sure what more can reasonably be expected from them.
posted by zarq at 9:00 AM on December 1, 2011


When you don't meticulously test the location-based business data that your product is using on its backend for all of the hot button topics people could get angry about, technically ignorant people will accuse you of deliberate malice.

I think that there has been a failure here, on the technical side; call it one of imagination or of due diligence, or whatever. It's not inconceivable that people would use Siri for emergency or priority information. Indeed, similar services (e.g. OnStar) promote the idea that that's what these services are really good at. As I pointed out above, there are decent examples of what needs to be prioritized, both in prior art, like the phone book, and in even the briefest consultation with an emergency-management organization (they all have long call lists and all know each others' numbers).

That failure really is at the core of the problem here. This isn't just about finding the nearest pizza parlour, which Siri presumably does ok at, but about finding relevant information quickly. Emergency, crisis and health information shouldn't have been a huge surprise. Apple did drop the ball here. It's fixable sure, but looks really stupid from the outside.
posted by bonehead at 9:08 AM on December 1, 2011 [1 favorite]


verb: " This is an example of what I'm talking about. Searches for abortion clinics do not "fail completely." In some cities they fail to find clinics depending on the search phrases used. In other cities they fail to match businesses to street addresses. In other cities they work perfectly. One of the first commenters on the Gizmodo thread about this issue noted that they had the same problem with adoption agencies. They could locate abortion clinics in their city without problems, but Siri couldn't find several local adoption agencies."

Some searches fail completely. Some do not. Some are truly problematic. We have multiple apparent reports of search results that do not include abortion providers, but do include scammy pro-life/anti-choice "clinics" that shame women who seek out an abortion and try to talk them out of having one. Some of the people (including someone here in this thread!) who report that the program offers them abortion clinics in a search also subsequently report that the clinics aren't what they first appeared to be. They're pro-life "pregnancy crisis centers."

Of course, this is a search engine problem that is not limited to Siri.
posted by zarq at 9:12 AM on December 1, 2011


Aside from the aforementioned PB, I don't see many people making authoritative statements about how it could *only* be a malicious act. I just don't see it. More examples, please.

Ffs, there's a comment in this thread that compares what happened here with what the Nazis did to social undesirables.

And that's just the tip of the iceberg—verb pointed out several other examples. You don't need more examples of how a number of you are assigning malice: you just need to actually bother to read the comments in this very thread, where examples are plentiful. #amazonfailpartdeux
posted by Blazecock Pileon at 9:45 AM on December 1, 2011 [1 favorite]


If the program listed every single abortion clinic and Planned Parenthood in the country, we'd be seeing a similar statement from NARAL and other pro-choice groups with equivalent wording.

I don't buy that (at all), but I understand your point, i.e. wingnuts gonna wingnut.
posted by mrgrimm at 9:46 AM on December 1, 2011


I don't think the rogue programmer angle is so ridiculous.

It's a conspiracy! You guys are almost as bad as 9/11 truthers, I swear.
posted by Blazecock Pileon at 9:48 AM on December 1, 2011


When you don't meticulously test the location-based business data that your product is using on its backend for all of the hot button topics people could get angry about, technically ignorant people will accuse you of deliberate malice.

After reading everything, this floated to the top.
posted by Avenger50 at 9:54 AM on December 1, 2011 [2 favorites]
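
For what it's worth, the kind of testing that comment describes is cheap to sketch. Assuming a hypothetical search_local() backend and a made-up city/query matrix (none of this is Apple's actual test suite), a hot-button smoke test might look something like this:

import itertools

CITIES = ["New York, NY", "Denver, CO", "Birmingham, AL"]
SENSITIVE_QUERIES = [
    "abortion clinic",
    "birth control",
    "rape crisis center",
    "suicide hotline",
    "domestic violence shelter",
]

def smoke_test(search_local):
    """Return (city, query) pairs for which the backend returns nothing."""
    failures = []
    for city, query in itertools.product(CITIES, SENSITIVE_QUERIES):
        if not search_local(query, near=city):
            failures.append((city, query))
    return failures

# usage: assert not smoke_test(my_search_backend), "empty results for sensitive queries"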


It's a conspiracy! You guys are almost as bad as 9/11 truthers, I swear.

YES.

File this whole thing under "wingnuts," right or left, it doesn't matter.
posted by Avenger50 at 9:54 AM on December 1, 2011 [3 favorites]


I don't think the rogue programmer angle is so ridiculous.

It's a conspiracy!


One person does not make a conspiracy. In fact, it's the very opposite of a conspiracy.

Censorship, conspiracy ... it's like 10th grade vocab class in here.
posted by mrgrimm at 9:59 AM on December 1, 2011


Blazecock Pileon: " Ffs, there's a comment in this thread that compares what happened here with what the Nazis did to social undesirables. "

Perhaps I'm wrong, but that looks very much like sarcasm to me.
posted by zarq at 10:10 AM on December 1, 2011


One person does not make a conspiracy

The insinuation is that the bosses are covering up the work of one malicious programmer, in contradiction of an official statement that this is about beta software that needs fixing. Your conspiracy theory is pretty silly, not least of all because it is not bolstered by known facts or company history — quoting the opinions of a pro-life organization is not the basis for a factual, reasoned argument.
posted by Blazecock Pileon at 10:15 AM on December 1, 2011


Blazecock Pileon: "The insinuation is that the bosses are covering up the work of one malicious programmer, in contradiction of an official statement that this is about beta software that needs fixing."

He said 'rogue programmer.' I'm pretty sure that means he's referring to a single programmer who may have inserted code into the program, without his bosses / project managers knowing of it. That seems to me to be the very definition of 'rogue programmer.'

I don't think either scenario is terribly realistic, fwiw. But what you're accusing him of does not seem to mesh with what he is saying.
posted by zarq at 12:04 PM on December 1, 2011


If there's an algorithmic or metadata problem that keeps abortion providers and pharmacies that provide birth control from being found correctly, it's a systemic problem in a large corpus of data, and it probably hints at larger problems.

I agree. However, given that there is a larger problem, how is Apple supposed to fix it? They can't just go out and manipulate the entire corpus of data to make it fit their algorithm. I've been told it would be "terrible" to change the algorithm to include businesses that don't incorporate exact search terms in their reviews, and I've also been told that "if I were in their shoes I wouldn't manually add it no matter how much you complain". So this problems-with-data argument starts to look like a spreading of hands: oh, sorry, that's just the way it works because "people are embarrassed to talk about their abortion". If you want this search engine to work properly then you need to create a grassroots campaign to add fake local reviews to abortion clinics across the country.

That seems less reasonable than adding a special case, and I hate special cases. And even after all this talk about how secret and sacrosanct the algorithm is, I expect that Apple will have a pretty quick fix for this one...

Besides, Siri can find hardware stores with zero Yelp reviews by sort-of-name, but can't find abortion providers with zero Yelp reviews by exact name. There's probably an algorithm/data explanation for this, but failing to find businesses by exact name is a problem no matter what they are. In a lot of these cases Siri's algorithm seems to be worse than it would be if it simply passed Yelp's results on to the user, which is why I don't really get the "this is the price we pay for big data" argument. If it's not the price we pay to use Google or Yelp or (shudder) Ask Jeeves, then what's with Siri?

I'd think "ha ha, whoops, seems like our algorithm just doesn't find a ton of things even though the underlying engine does!" isn't much more impressive than deliberate malice, despite being much less fun.

We've finally reached a point in the thread where the primary argument appears to be whether Apple's engineering team should have explicitly tested searching for abortion clinics in every major city in the United States before releasing the software.

That's not my argument, and I think you know it's not. Searching for abortion (or birth control, etc) in one or two major cities would probably have revealed this problem. I'm sure Siri was tested in major cities, and I'd be shocked if NYC wasn't among them... so one search might have prevented a huge PR mess. Adding "controversial" keywords to existing tests doesn't seem unreasonable to me (and you can bet that Apple will be searching for it next time around).
posted by vorfeed at 12:08 PM on December 1, 2011 [2 favorites]
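
A toy example makes the exact-name complaint above easier to see. Assume a hypothetical rank_results() that scores listings only by how often the query terms appear in review text; that is one guess at the failure mode, not Siri's actual pipeline, and the listing data is invented:

# A ranker that only scores listings by review-keyword hits will drop a
# business with an exact name match but no reviews.

def rank_results(query, listings):
    """Order listings by naive review-keyword relevance; drop zero scores."""
    terms = query.lower().split()
    scored = []
    for biz in listings:
        review_text = " ".join(biz["reviews"]).lower()
        score = sum(review_text.count(term) for term in terms)
        if score > 0:
            scored.append((score, biz["name"]))
    return [name for score, name in sorted(scored, reverse=True)]

listings = [
    {"name": "Downtown Abortion Clinic", "reviews": []},  # exact name, zero reviews
    {"name": "Hope Crisis Pregnancy Center",
     "reviews": ["They talked to me about abortion alternatives",
                 "Counseling about abortion and adoption options"]},
]

print(rank_results("abortion clinic", listings))
# prints ['Hope Crisis Pregnancy Center']; the exactly-named provider never appears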


This thread is super interesting! I've learned a lot about how Siri thinks and the programming required to make it work. That being said, as a consumer, I don't need to know the nitty-gritty details of why it has limitations; I just need to know that it does. It's helpful and interesting to know, but if I'm a consumer who needs to find an abortion provider/birth control/whatever, the product does not always do what it is purported to do. In this case, Siri's limitations made me (and a lot of other people) angry. That said, it looks like they're working on fixing it. So, problem solved.
posted by Weeping_angel at 12:30 PM on December 1, 2011


He said 'rogue programmer.' I'm pretty sure that means he's referring to a single programmer who may have inserted code into the program, without his bosses / project managers knowing of it. That seems to me to be the very definition of 'rogue programmer.'

The idea doesn't even make sense. One can use Siri to find abortion clinics in NYC, Denver and other godless, heathen cities in the US. That's a pretty incompetent display of skills for a secret, pro-life, computer-programming cabal. Apple doesn't generally hire shitty programmers, not even rogue ones that they apparently conspire to cover for.

I'll ask again that some of you start from first principles:

• Does it make sense to believe Apple wants to keep women impregnated? What is it in the makeup of their staff, or in their decades-long history as a progressive company, that would lead observers to immediately leap to the assumption that they believe women are to be treated as baby machines? Does leaping to that conclusion without facts make any sense at all, based on what we already know?

• Siri finds women's health resources in some locations, but not others. If there is a deliberate campaign of holding back this information in order to push a pro-life agenda, why in some locations, but not others? Does that really make any sense? If there is a deliberate campaign to hold back this information, why does Apple say publicly that it will fix this by adding more information? If you were someone trying to keep women ignorant of their options, would it make rational sense not only to do such a horrible job of it, but also to say that you'll fix the issue when it's brought to your attention?

Software is a complex system of interrelated, moving parts. We've had Amazon commit a similar technological mistake, and yet it became clear that there was no conspiracy, no cabal, no rogue programmers. Why jump to the same wrong conclusions twice?
posted by Blazecock Pileon at 12:59 PM on December 1, 2011


vorfeed: "I'm sure Siri was tested in major cities, and I'd be shocked if NYC wasn't among them... so one search might have prevented a huge PR mess."

Is this really a huge mess? I mean, from a PR perspective it doesn't seem like one. It's hard to tell, but I suspect this will blow over with few long-term ramifications on their company image.

The ACLU's initial comment, from a blog post on Wednesday, was: "Although it isn't clear that Apple is intentionally trying to promote an anti-choice agenda, it is distressing that Siri can point you to Viagra, but not the Pill, or help you find an escort, but not an abortion clinic. We're confident that the developers at Apple want to provide iPhone users with accurate information."

The company's response was to release a statement relatively quickly and explain: “These are not intentional omissions meant to offend anyone. It simply means that as we bring Siri from beta to a final product, we find places where we can do better, and we will in the coming weeks.”

The ACLU seems to think Apple is operating in good faith; they could have gone for the jugular, but didn't. Apple says they're going to improve Siri's results and that the omission wasn't intentional; they could easily have passed the buck instead.
posted by zarq at 1:07 PM on December 1, 2011 [1 favorite]


Blazecock Pileon: "The idea doesn't even make sense."

And I'm not defending the idea. In fact, I even said that I don't think it's a terribly realistic scenario.

In at least two comments, he referred to a single "rogue programmer," and said the situation could be the work of one person. You replied to his comment by saying: "The insinuation is that the bosses are covering up the work of one malicious programmer, in contradiction of an official statement that this is about beta software that needs fixing. Your conspiracy theory is pretty silly, not least of all because it is not bolstered by known facts or company history — quoting the opinions of a pro-life organization is not the basis for a factual, reasoned argument."

Perhaps I missed it, but I haven't seen mrgrimm refer to the Apple cover-up you're accusing him of theorizing about. Hence my comment.

One can use Siri to find abortion clinics in NYC, Denver and other godless, heathen cities in the US.

Mostly, it seems to be more likely to find pro-life crisis pregnancy centers than actual abortion providers. But that's just my impression based on the links I've seen posted in this thread. It's perfectly possible that the examples being shown to us on that handful of blogs are simply outliers.

Please note (you seem to have missed it) that I commented upthread that I think this is probably a data issue and not a problem restricted to Siri.

That's a pretty incompetent display of skills for a secret, pro-life, computer-programming cabal.

First of all, "cabal" is your word. You're inventing something wholesale here that neither of us seems to have mentioned, discussed or endorsed. At least, I know for sure *I* haven't.

Second of all, I've already said to you that I don't think the idea is realistic. Are you really going to try to engage me as if I am endorsing mrgrimm's comment, when I've already unambiguously said I don't?

What sort of answer could you possibly expect me to give you here, other than to simply repeat myself?
posted by zarq at 1:39 PM on December 1, 2011


Is this really a huge mess? I mean, from a PR perspective it doesn't seem like one. It's hard to tell, but I suspect this will blow over with few long-term ramifications on their company image.

Probably so, but at the same time, look how many people remember #amazonfail. As a large corporation, it just seems like it's worth avoiding having pro-life groups put out press releases lauding you for "embracing a position that promotes the dignity of women and upholds human rights in the womb."
posted by vorfeed at 1:49 PM on December 1, 2011


vorfeed: "Probably so, but at the same time, look how many people remember #amazonfail."

Good point.

"As a large corporation, it just seems like it's worth avoiding having pro-life groups put out press releases lauding you for "embracing a position that promotes the dignity of women and upholds human rights in the womb.""

This is true. But as I said earlier, wingnuts gonna wingnut.
posted by zarq at 2:06 PM on December 1, 2011


Probably so, but at the same time, look how many people remember #amazonfail.

Well, one of the consensus conclusions to come out of #amazonfail was that weird big-data-fall-down-go-boom problems are not as easy to predict and avoid as we'd like, and that the BIGGEST fail they made was the utterly incompetent PR response that followed the initial burst of publicity.

It's a bit convoluted to track it all down, but the timeline went something like:
  1. Amazon does some delisting of 'adult products' (ie, they don't show up in searches but you can link to them directly). Amazon automated email response set up to deal with questions about the policy.
  2. Months later, a metadata tagging problem results in something like 75,000 GLBT, disability-and-sexuality, and other assorted hot-button books being delisted on the Friday afternoon before Easter weekend.
  3. People start noticing, try to find out why, and start pointing at earlier instances of the automated 'Adult content delisting policy' email as an explanation.
  4. SWEET BEJEEZUS TWITTER
  5. 2-3 days of mixed information, denials by Amazon employees speaking unofficially for the company, mounting outrage, WSJ articles, local newspaper articles, calls for boycotts, outraged cancellations of customer accounts, etc.
  6. On Monday, Amazon apparently notices the outrage, says, "It was a glitch!" and relists the products.
The exact chronology was a little fuzzy, but that's roughly the cycle. Apple responded less than a day after the problem was spotted, said unambiguously that it was unintentional and would be corrected, and clarified their position on the matter. In that sense, they did learn from #amazonfail.
posted by verb at 2:28 PM on December 1, 2011 [5 favorites]


Siri Is Dumb. There, We Said It. -- The flap over Siri's apparent reluctance to point users to abortion services makes for great political theater. It's also an indictment of our understanding of how technology works.
posted by ericb at 2:29 PM on December 1, 2011 [1 favorite]


Consider the current kerfuffle. This is simplifying things a bit, but the gist of this story is that Siri is getting hung up on a word, "abortion," because organizations that actually offer abortion services tend not to use the word as much as anti-abortion organizations do. So when Siri goes looking for where to get an "abortion" in the digital wordscape of the Internet, lo and behold, it returns addresses for Crisis Pregnancy Centers rather than Planned Parenthood.

This is actually the best explanation that I have heard. If it was already posted in this thread, I apologize for missing it.
posted by Splunge at 5:24 PM on December 1, 2011 [1 favorite]
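
If that explanation is right, the mechanism takes only a few lines to reproduce. The page text below is invented and the scoring is deliberately naive term counting, so this illustrates the hypothesis rather than whatever Siri's backend actually does:

from collections import Counter

# Invented snippets of web copy: the provider describes its services without
# the word "abortion"; the crisis pregnancy center uses it repeatedly.
pages = {
    "Planned Parenthood": "family planning reproductive health services birth control",
    "Crisis Pregnancy Center": "abortion alternatives free abortion counseling abortion recovery",
}

query = "abortion"
scores = {name: Counter(text.split())[query] for name, text in pages.items()}

print(scores)                        # {'Planned Parenthood': 0, 'Crisis Pregnancy Center': 3}
print(max(scores, key=scores.get))   # Crisis Pregnancy Center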




-book’em-

I was nearly crying reading that. Maybe I’m easily amused.
posted by bongo_x at 12:43 PM on December 6, 2011 [1 favorite]




Why the problem with Siri matters
posted by mrgrimm at 11:22 AM on December 7, 2011


octothorpe: "I'm impressed with it understanding the words it did, like Allegheny. I can't even pronounce that without looking severely constipated.

The real test would be it understanding Monongahela."

Heh. As a surprised exclamation of incredulity, I have been known to say "Holy Monongahela!"
posted by Samizdata at 1:58 PM on December 9, 2011


mrgrimm: "Also, if you used this Siri more often, you wouldn't need THAT Siri for abortion advice.

You might if you were raped."

Well, no, as I am a man. Since we are going to be misreading the point of comments...
posted by Samizdata at 2:23 PM on December 9, 2011


Also, I find it disturbing that Siri apparently does not list one of the best ways to dispose of a body - pig farms. They'll even take care of the bones for you.
posted by Samizdata at 5:36 PM on December 9, 2011 [1 favorite]




This thread has been archived and is closed to new comments