Google detects child porn images in user's gmail, leading to arrest
August 4, 2014 10:08 AM

Google's updated Terms of Service state explicitly that the company automatically analyzes all email content to create targeted advertising. This case, in which Google identified child porn images in a user's email message, leading to his arrest, seems to be one of the first known instances of Google monitoring personal gmail accounts for illegal activity. The arrest raises questions over the privacy of personal email and Google's role in policing the web.

According to local TV station KHOU,
Police say Google detected explicit images of a young girl in an email that John Henry Skillern was sending to a friend; the company then alerted authorities.

"He was trying to get around getting caught, he was trying to keep it inside his email," said Detective David Nettles of the Houston Metro Internet Crimes Against Children Taskforce. "I can't see that information, I can't see that photo, but Google can."

Skillern is a registered sex offender who was convicted of sexually assaulting an 8 year old boy in 1994.

... After obtaining a search warrant, investigators said they found child porn on Skillern's phone and tablet device. They also found text messages and e-mails where he talked about his interest in children."
As the New York Times reported in April 2014:
Google updated its terms of service on Monday, informing users that their incoming and outgoing emails are automatically analyzed by software to create targeted ads. The revisions more explicitly detail the manner in which Google software scans users’ emails, an unpopular practice that has been at the heart of litigation. Users of Google’s Gmail email service have accused the company of violating federal and state privacy and wiretapping laws by scanning their messages. Google has argued that users implicitly consented to its activity. A Google spokesman said that the changes "will give people even greater clarity and are based on feedback we’ve received over the last few months." Google’s updated terms of service added a paragraph stating that "our automated systems analyze your content," including emails, to provide personally relevant product features, such as customized search results, tailored advertising, and spam and malware detection. "This analysis occurs as the content is sent, received, and when it is stored," the terms state.
Two weeks later, Google announced that it would stop scanning the accounts of Google Apps customers (Business, Government, and legacy free-edition accounts) as well as students' Apps for Education accounts, with no change for personal email messages. (previously)

"Online service providers like Google are required under federal and many states' laws to report child pornography when they find it," attorney Chris Jay Hoofnagle, director of information privacy programs at the Berkeley Center for Law & Technology tells Business Insider. However they are under no obligation to go out and look for it, Hoofnagle says."

Via CNET, "For Google, it seems that the public good in attempting to eradicate child pornography takes precedence over what some might consider private communication."
posted by argonauta (75 comments total) 12 users marked this as a favorite
 
Oof. Classic case of "whoever wins, we lose."

Gonna have to side with the child pornographers here. I can't really see an argument they're [Google] using that would be any different if we were talking about drugs or terrorism.

pre-emptively, Metafilter: gonna have to side with the child pornographers.
posted by Lemurrhea at 10:19 AM on August 4, 2014 [19 favorites]


Someone I know who worked for Google once half-chuckled and said he couldn't believe how many people are convinced their data within the Google cloud is actually private. "Don't trust Google with anything you wouldn't trust any other company with." This was back in like 2008, so I imagine things have gotten worse since. You don't actually have any safe secure spaces on the internet, period. Probably even with Tor-like software. How you proceed is up to you.
posted by naju at 10:21 AM on August 4, 2014 [1 favorite]


that, of course, doesn't mean they're wrong. They would, IMO, be wrong to apply that argument to drugs or terrorism, because those things aren't problems really. But I have a hard time being up in arms about this when the outcome so far has been positive.
posted by MisantropicPainforest at 10:21 AM on August 4, 2014


I honestly have trouble seeing how Google is in the wrong here - even for drugs or terrorism. They make it extremely clear exactly what they'll be doing with your email. If you don't like it, you have every right to use a different email provider, or just encrypt your email text before you send it (which is probably good practice anyway). If Gmail were the only email provider in the world it would be a different story, but that's far from the case.
posted by Itaxpica at 10:28 AM on August 4, 2014 [8 favorites]


Isn't the general level of privacy for all unencrypted email, regardless of who your service provider is, the same as that of a postcard? Pretty much anybody in the chain of transmission, which would normally include third parties unknown to sender or recipient for mail not being sent within one domain, can scan the whole thing if they're inclined to.
posted by strangely stunted trees at 10:30 AM on August 4, 2014 [4 favorites]


Gonna have to side with the child pornographers here.

At the most trivial level, you can take a SHA-1 hash of an image (or any other piece of data) and compare it against a list of known SHA-1 hashes to see if there are hits. This is how you can compare chunks of data in a completely automated way, without actually looking at them, and with a nearly nonexistent collision rate. Law enforcement has a list, pictures that aren't on The List get ignored, and pictures that are get a big honking red flag.

(That's the most trivial approach possible, incidentally, and it presents a reasonable balance between privacy you don't actually have and law enforcement access that could be a lot more overbearing. For what it's worth, I know that a bunch of really, really smart people have been working on this for a long time.)
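
(For concreteness, here's a minimal Python sketch of that kind of hash-list lookup. The digest below is a made-up placeholder, not an entry from any real database, and this is only the naive approach described above, not a claim about what Google actually runs:)

    import hashlib

    # Hypothetical set of known-bad SHA-1 digests supplied by law enforcement.
    # The value here is a placeholder, not a real entry from any actual list.
    KNOWN_BAD_SHA1 = {
        "5f2b97a6cf3f04f1a31a1c9b9e0f7d8c6a5b4e3d",
    }

    def sha1_of_file(path):
        """Return the hex SHA-1 digest of a file, read in chunks."""
        h = hashlib.sha1()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(8192), b""):
                h.update(chunk)
        return h.hexdigest()

    def is_flagged(path):
        """True if the file's digest appears on the known-bad list."""
        return sha1_of_file(path) in KNOWN_BAD_SHA1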
posted by mhoye at 10:30 AM on August 4, 2014 [3 favorites]


I think child abuse and child pornography are in a very narrow category (along with human trafficking) where the crimes are so out of bounds of anything remotely acceptable in our society that this kind of intrusion (by a private company I might add, not the government) is completely acceptable to stop them. That's where slippery slope arguments miss the point a little I think. Yeah, this isn't getting into the messy morally grey world of drugs or terrorism. Child pornography is an unambiguous evil, and that matters.
posted by naju at 10:31 AM on August 4, 2014 [1 favorite]


My only reservation is families taking completely harmless photos of their children, and that setting off alarms within this monitoring system and causing trouble. But that hasn't happened yet.
posted by naju at 10:32 AM on August 4, 2014 [1 favorite]


Oh yes it has, naju.

On preview: some people think that drugs or terrorism is unambiguous evil.
posted by Melismata at 10:40 AM on August 4, 2014 [9 favorites]


My only reservation is families taking completely harmless photos of their children, and that setting off alarms within this monitoring system and causing trouble. But that hasn't happened yet.
If the system works by consulting a list of cryptographic checksums of known child pornography images, it won't be triggered by family photos (unless the FBI or some other agency somehow puts them on the list.) However, it also makes the recognition very fragile and almost trivial to defeat (open the image in a program, change a few pixels or save at a different compression level, redistribute without fear of recognition) so I wouldn't expect this to be the last word in countermeasures.
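
(A rough Python illustration of that fragility, assuming matching is done on exact file checksums: flip a single bit anywhere in the file and the SHA-1 digest bears no relation to the original, so a recompressed or lightly edited copy sails right past a plain hash list. The filename is just a stand-in.)

    import hashlib

    original = open("photo.jpg", "rb").read()   # any image file
    altered = bytearray(original)
    altered[0] ^= 0x01                           # change one bit in one byte

    # The two digests share nothing recognizable, even though the
    # images are visually identical.
    print(hashlib.sha1(original).hexdigest())
    print(hashlib.sha1(bytes(altered)).hexdigest())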
posted by Nerd of the North at 10:42 AM on August 4, 2014 [1 favorite]


And some people think that drugs or terrorism is unambiguous evil.

This is a silly argument. Some people think X is an unambiguous evil. However, no reasonable person thinks child pornography is not an unambiguous evil.
posted by MisantropicPainforest at 10:43 AM on August 4, 2014 [1 favorite]


Melismata:
But that hasn't happened yet.
Oh yes it has, naju.
I presume naju meant that "it hasn't happened yet with this automated reporting system." naju's concern about the possibility strongly implies familiarity with existing incidents that have occurred based on reports from photo processors, etc.
posted by Nerd of the North at 10:44 AM on August 4, 2014 [1 favorite]


Email is terrible. Please everyone stop using this crappy thing.
posted by humanfont at 10:45 AM on August 4, 2014 [1 favorite]


It's disappointing journalism that apparently no reporter thought to ask what happened to the email recipient, and whether that person isn't also guilty of possessing child pornography, and what Google might have to say about that.

But I have a hard time being up in arms about this when the outcome so far has been positive.

That is exactly why test cases are cherry-picked.
posted by cribcage at 10:47 AM on August 4, 2014 [14 favorites]


It is unequivocally bad that Google is looking at content of private email messages and reporting illegal content to the police.

They will soon move on to reporting things aside from child pornography, in part because the early cases involving child pornography garnered no serious opposition because, well, child pornography.
posted by mediareport at 10:52 AM on August 4, 2014 [33 favorites]


That is a classic slippery slope argument. And if that happens, then I'll have an issue with it. Until then, no.
posted by MisantropicPainforest at 10:55 AM on August 4, 2014 [6 favorites]


OK, hang on. "No reasonable person thinks child pornography is not an unambiguous evil". But it's somehow not reasonable to point out that "some people think that drugs or terrorism is unambiguous evil".

So passing around photographs is an unambiguous evil, but intentionally blowing up random people without regard for their personal beliefs or actions is not an unambiguous evil? And there's no reasonable position that could say that helping somebody become addicted to any drug is worse for that person than putting them in child pornography? Not to mention all the other amazingly depraved things people do to one another?

Seriously, if you think child pornography is the most evil thing in the world, that it's different in kind from the other evil that's out there, then I have to suspect you haven't thought about what the world has to offer.

If it were OK to go after child pornography with this sort of thing, then it would have to be OK to go after terrorism the same way. That's assuming the tactic would work, which of course it would not; it's just not technically possible.

By the way, I'm surprised if this is really the first Gmail user Google has turned in. Child porn hash lists have been available to large service providers for years.
posted by Hizonner at 10:56 AM on August 4, 2014 [4 favorites]


It is unequivocally bad that Google is looking at content of private email messages and reporting illegal content to the police.

Why? Or to put it another way, who ever said email (especially unencrypted email), passed through the servers of a for-profit corporation, is private?
posted by Itaxpica at 10:58 AM on August 4, 2014 [3 favorites]


It used to be, but now it's not. That's unequivocally bad.
posted by mediareport at 10:59 AM on August 4, 2014 [1 favorite]


They state in the article "We have built technology that trawls other platforms for known images of child sex abuse. We can then quickly remove them and report their existence to the authorities."

This implies, to me, that they are using the same image-matching algorithm they use in their reverse image search, which would mean you would get some rare false positives, but simply re-compressing the image so it has a new hash would not defeat it.
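
(To make the distinction concrete, here's a toy Python sketch of a perceptual "average hash," the simplest of the fingerprints that survive recompression and small edits. It's only an illustration of the general idea, not Google's actual algorithm, and it assumes Pillow is installed; the filename and threshold are made up.)

    from PIL import Image

    def average_hash(path, size=8):
        """Shrink to 8x8 grayscale and threshold each pixel at the mean.
        Recompression or minor edits barely change the resulting bits."""
        img = Image.open(path).convert("L").resize((size, size))
        pixels = list(img.getdata())
        mean = sum(pixels) / len(pixels)
        return [1 if p > mean else 0 for p in pixels]

    def hamming(a, b):
        """Count differing bits; a small distance means 'probably the same picture'."""
        return sum(x != y for x, y in zip(a, b))

    # Hypothetical usage: flag an attachment if it lands within a few bits
    # of a fingerprint already on a known-image list.
    # if hamming(average_hash("attachment.jpg"), known_fingerprint) <= 5: ...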
posted by idiopath at 10:59 AM on August 4, 2014 [1 favorite]


Courts, including the Supreme Court, have consistently placed child pornography within a special category. For First Amendment purposes, investigation purposes and the like. I'm not saying this as my own opinion. Our society and legal standards have spoken and there's been no pushback.
posted by naju at 11:01 AM on August 4, 2014


I'm very surprised that Google did this. A little skeptical, too; I sure wish they'd comment themselves so we had more to go on than a Houston cop's statement. Given the amount of press attention to this story I imagine they will have to comment, at least in broad terms.

It just seems like bad business for Google. They're already hit left and right with complaints about privacy, about the content scanning in Gmail, about being Big Brother. I'm surprised they'd actually go so far as to provide this example of actual snooping.

I also wonder if Google doesn't feel it has a legal obligation to report a case. Google is on the record saying they can identify child porn. So if they're doing image hashing and content scanning of email, there's an argument that they actually have direct evidence of a child pornography crime. Are they then legally obligated to report?
posted by Nelson at 11:04 AM on August 4, 2014 [1 favorite]


>It used to be, but now it's not. That's unequivocally bad.

It never really was; people just didn't think about it. I remember being taught in school that "anything in an email is like a post card, visible by anyone with a copy of it."

The paranoid have been encrypting their email for years, just like you and I should probably get around to doing.
posted by unknownmosquito at 11:08 AM on August 4, 2014 [3 favorites]


That is a classic slippery slope argument.

Yes, it is. I'm not sure what to tell you here. The slippery slope abides as a form of argument because, notwithstanding that it can be misused like anything else, it's valid. One of the most harmful things the Internet ever did was to compile lists of argument types so that people who don't understand much about rhetoric can identify arguments by name and thereby believe they understand the arguments. It is exactly identical to a small child memorizing the names of dinosaurs.

Our society and legal standards have spoken and there's been no pushback.

Those comments you're responding to? That's pushback.
posted by cribcage at 11:09 AM on August 4, 2014 [28 favorites]


If they have actual knowledge of specific child porn, yes, they are legally obligated to report in many places, including the US.

Furthermore, child porn laws tend to be draconian about strict liability. Google could be argued to be guilty of possessing child porn even if Google doesn't know about it, because it's on Google's servers. And, even if they don't know about specific instances, they definitely do know there's some on their server somewhere, which you might be able to turn into a "knowing" possession charge with the right judge in the right jurisdiction.

As a large corporation that's obviously not directly encouraging child porn, Google and similar outfits tend to get a pass. But they have an incentive to work with law enforcement, just to make sure that nobody decides to go out and claim they're actually in criminal possession themselves.
posted by Hizonner at 11:10 AM on August 4, 2014 [2 favorites]


However, no reasonable person thinks child pornography is not an unambiguous evil.

I'm reasonable. Every evil is ambiguous, if in no other way, then at its edges. If you don't believe me, consider your own assertion that there's something ambiguous about terrorism as an evil.
posted by ftm at 11:21 AM on August 4, 2014 [4 favorites]


I've just been reading elsewhere on Metafilter about how email is exactly as private as a postcard. Anybody who has access to any of the computers through which your email is routed can see anything you send.

If something like this was sent through the US postal service on a postcard, and the guy sending it unsurprisingly got caught, nobody would see a slippery slope with government surveillance impinging on our personal freedoms. People would just be relieved, and wish that everybody doing this kind of shit could be stopped so easily.
posted by Sing Or Swim at 11:24 AM on August 4, 2014 [7 favorites]


Email is private?

Apparently pedophiles are unable to encrypt properly.
posted by Mental Wimp at 11:25 AM on August 4, 2014 [1 favorite]


If Google would simply say, "The only illegal activity we are currently searching for and reporting is child abuse images appearing anywhere on our network, and we will announce in advance any other activity we start searching for and reporting in the future," I'd be happy.
posted by mediareport at 11:26 AM on August 4, 2014 [9 favorites]


From a technical point of view, email is really a lot more like an envelope than like a postcard. You have to take active steps to see the content, even if you're a sysadmin. But I think both analogies are more misleading than enlightening.

And, yeah, people in general are bad at cryptography. Not just pedophiles. For that matter, more people should be running their own mail servers, and you don't see that brought up.
posted by Hizonner at 11:27 AM on August 4, 2014


Searching for and reporting *known images of child pornography* in email attachments? I'm cool with that.
posted by edheil at 11:30 AM on August 4, 2014 [3 favorites]


Those comments you're responding to? That's pushback.

"It's been long established and agreed upon in all levels of society and our judicial system that child pornography is especially harmful, and there's a strong correlation with child sexual abuse, so it should have less constitutional protection than other forms of expression. Besides, this is a private company so it's not a constitutional issue to begin with."

"Wait, did you read this Metafilter thread..."

"Woah. Well in that case, let's reevaluate everything."
posted by naju at 11:32 AM on August 4, 2014 [1 favorite]


This is going to make Google cloud services an even tougher sell for businesses and government agencies. Come use our great services, where we probably won't share your internal documents with the police. Probably.
posted by miyabo at 11:35 AM on August 4, 2014 [1 favorite]


Searching for and reporting *known images of child pornography* in email attachments? I'm cool with that.

Who keeps the database? Someone inviolable and trustworthy, no doubt?
posted by ftm at 11:36 AM on August 4, 2014 [1 favorite]


Searching for and reporting *known images of child pornography* in email attachments?

What about when they accidentally add an innocuous image to the list, and suddenly you're defending, in public, emails that you didn't want to share in the first place?
posted by miyabo at 11:37 AM on August 4, 2014 [2 favorites]


Why should Google be doing this and not ISPs? Why not have some firmware running in every CPU that scans for this stuff? These aren't slippery-slope arguments if people are arguing "well, when it comes to child pornography, all inspections are fine." And if there is some principled distinction between Comcast and Google, or Google and Apple, what is it? When the technology allows inspection, where can't the "child pornography" banner take you?
posted by chortly at 11:46 AM on August 4, 2014 [2 favorites]


The first time Google auto-detects Aunt Rose sending Gram and Gramps pictures of her toddlers in the bath is the first time this system will come under fire and I hope there's enough Internet outrage then to-- aw, hell, I'm filing it in the "Surely This" cabinet.
posted by Spatch at 11:47 AM on August 4, 2014 [1 favorite]


I don't like this because I like my search and seizure that leads to prosecution to be conducted under a warrant. Even if the search was contracted out to a private company.

I get that the reasonable expectation of privacy is reduced for email relative to, say, phone conversations, but suspicion-less searches should still be confined to information that's routinely collected for normal business reasons, like billing. I.e., more like a pen register than a phone conversation. Since google has to actively do stuff to perform this search that they would not normally do for regular business purposes, that pretty much fails the no-reasonable-expectation-of-privacy test to me. To me, this is just a police wiretap that hasn't been authorized by a judge and that isn't supervised by anyone outside law enforcement and their de facto employees.
posted by ROU_Xenophobe at 11:49 AM on August 4, 2014 [18 favorites]


We, meaning our western governments, ban child porn images. And we're mostly good with that. God knows I'm happy enough to see someone with a sexual interest in 8-year-olds taken off the street.

But this same tech works to ID, say, images of Tiananmen Square, too, right? And the same logic, that it's a serious crime, works for that government, right? And Google's argument is basically that they can do what they like, and it applies equally to these as to kiddie porn.

So why would they stop at any point along the slope? Why stop at terrorism and Tiananmen? Maybe snatches of copyright songs can automatically generate lawsuits. Hey, it's a crime.
posted by tyllwin at 11:59 AM on August 4, 2014


There's nothing like a child porn bust to legitimize your new "we look at everything" terms of use. Why, it's almost as if an executive order came down to...I dunno...dig deep and find something to use to show Google as white-hat-wearing citizens.

Of course, one would have to be deeply cynical and/or paranoid to think this...
posted by Thorzdad at 11:59 AM on August 4, 2014 [1 favorite]


If it's free, you are the product.
posted by Pudhoho at 12:08 PM on August 4, 2014 [1 favorite]


From an economic point of view, it makes tons of sense for gmail to be scanning attachments to identify known files. Let's say I send out my non-porny vacation photos to 45 gmail addresses. Google very much wants to recognize this and avoid storing 45 different copies of the same files, which may be quite large.

Duplicate detection is kind of similar to indexing, which is something Google loves doing to things.
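
(A toy Python sketch of that kind of deduplication, purely to illustrate the economics: identical attachment bytes hash to the same key, so 45 recipients cost one stored copy. This is not a description of Gmail's real storage system; the names are made up.)

    import hashlib

    blob_store = {}      # digest -> attachment bytes, stored once
    mailbox_refs = {}    # (user, message_id) -> list of digests

    def store_attachment(user, message_id, data):
        """Content-addressed storage: duplicate attachments collapse onto one blob."""
        digest = hashlib.sha256(data).hexdigest()
        blob_store.setdefault(digest, data)
        mailbox_refs.setdefault((user, message_id), []).append(digest)
        return digest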

(It does not make sense for them to do fuzzy reverse-image-search type analysis though. If I send two people two subtly different versions of the same image (e.g., they're watermarked), it would absolutely be inappropriate for gmail to merge them.)

Anyway, due to image search, and safe search, and compliance with various laws, I'd imagine they have a database of which images from the web are contraband in which ways. Since it's common that email attachments are also present on the web, it may be the case that "detecting" this is just a question of cross-referencing two lists. If so, it could reasonably be said that they "knew about" this, and the question shifts from "are they actively looking for it" to "are they actively ignoring it".

I would not be surprised if Google doesn't consider this to be voluntary.
posted by aubilenon at 12:50 PM on August 4, 2014


The first time Google auto-detects Aunt Rose sending Gram and Gramps pictures of her toddlers in the bath is the first time this system will come under fire and I hope there's enough Internet outrage then to-- aw, hell, I'm filing it in the "Surely This" cabinet.

This was my exact thought. I send photos of my son to his grandparents all the time. What if one of them gets flagged because he's sitting on the toilet or playing in the tub, and I suddenly get a knock on my door?

I'm having little difficulty imagining that scenario.
posted by Fleebnork at 12:50 PM on August 4, 2014


...the crimes are so out of bounds of anything remotely acceptable in our society that this kind of intrusion (by a private company I might add, not the government) is completely acceptable to stop them. That's where slippery slope arguments miss the point a little I think

naju:

Here in the UK, the entire Internet is filtered. Many health sites are filtered, sites on suicide prevention, sites on sex and sex education. They should not be. Heck, even Wikipedia has been blocked.

So, the slippery slope has already happened over here. Whenever this filtering was debated, the proponents always brought out the same argument. You know what it is. You're making it now yourself in this thread.
posted by vacapinta at 12:51 PM on August 4, 2014 [17 favorites]


The email-as-postcard analogy that's often bandied about is good in some situations, but it's not an exact equivalence, and in some situations it doesn't really make much sense. For example, it's true that someone at a postal facility could see what you wrote, and it's true that Google could see what you wrote. But Google has to intentionally attempt to do it.

Preemptive: I know, I know, they have to read the bytes from the sender in order to write them to the receiver. But they have to choose to analyze them.
posted by Flunkie at 12:55 PM on August 4, 2014


I mean, if some postal worker happened to notice that a certain postcard detailed a credible plot to blow up Bora Bora, sure, it should be reported to the authorities. But does the fact that that's possible imply that the USPS should actively and intentionally be reading all postcards, just in case one of them has a credible plot to blow up Bora Bora?
posted by Flunkie at 1:00 PM on August 4, 2014 [2 favorites]


I send photos of my son to his grandparents all the time. What if one of them gets flagged...

To clarify, it's my understanding from reading (although I'm not involved with this case at all) that this was not a newly created image that Google identified as child pornography. It was a file that matched a preexisting list of images that had been identified as child pornography. That's not uncommon, and it's one of the major issues in this field of law enforcement (and tort law). Once images get into the wild, they can be traded back and forth for years.
posted by cribcage at 1:00 PM on August 4, 2014 [1 favorite]


In a previous work life, I wrote software to do this and so much more with the information users were putting into our chat systems. In the course of a few years, we flagged numerous users for law enforcement attention. Everything from groomers and pedos actively engaging children to bomb threats against the President (a few of those). Most of the time, we engaged in gently correcting behavior, but we also sent out welfare checks on people who indicated that they were being abused or self-harming. Once we got local law enforcement involved in time to save a teenage girl who'd taken a bunch of pills in an attempt to kill herself.

I'm still proud of that work and I'm an ardent believer in privacy rights. Very few corporations are going to want to have bad actors and bad traffic associated with their brand - so if you want privacy, you'd better take it into your own hands, because there's no upside for a public company in perfectly transmitting everything you send without some automated screening for bad-guy stuff.
posted by drewbage1847 at 1:17 PM on August 4, 2014 [2 favorites]


Anyone dumb enough to use Google and expect privacy deserves what comes to them. With the possible exception of the NSA, no one is worse than Google when it comes to wringing every bit of personal information from data.
posted by five fresh fish at 1:37 PM on August 4, 2014 [1 favorite]


Gmail accounts get compromised all the time. So this opens up a really convenient vector for script kiddies to send known child porn images from hacked accounts for the purpose of putting the accountholder through legal hell...
posted by qxntpqbbbqxl at 1:54 PM on August 4, 2014 [6 favorites]


I'm very surprised that Google did this. [..] It just seems like bad business for Google. They're already hit left and right with complaints about privacy, about the content scanning in Gmail, about being Big Brother. I'm surprised they'd actually go so far as to provide this example of actual snooping.

This case is a really great way to go public with their automated snooping without triggering a big backlash. Then when it becomes public that they are reporting a bunch of other stuff to authorities, people will go "meh, everyone already knows they do that."

Google is being very smart and very sneaky.
posted by ryanrs at 2:08 PM on August 4, 2014 [6 favorites]


Yes, it is. I'm not sure what to tell you here. The slippery slope abides as a form of argument because, notwithstanding that it can be misused like anything else, it's valid. One of the most harmful things the Internet ever did was to compile lists of argument types so that people who don't understand much about rhetoric can identify arguments by name and thereby believe they understand the arguments. It is exactly identical to a small child memorizing the names of dinosaurs.

You're more right than you know, except where you pretend that a slippery slope argument can be convincing without demonstrating why the presumed outcome will inevitably occur. And do read the rest of my comment next time, ok?
posted by MisantropicPainforest at 2:19 PM on August 4, 2014 [1 favorite]


(and you seem to be confusing rhetoric, which is the art of discourse, with logic/critical thinking. but you knew that)
posted by MisantropicPainforest at 2:24 PM on August 4, 2014


This is going to make Google cloud services an even tougher sell for businesses and government agencies. Come use our great services, where we probably won't share your internal documents with the police. Probably.

Except that Google cloud services for business and government are covered by completely different terms of service, and Google has explicitly stated that they don't scan content for those accounts (as described right in the initial post).
posted by me & my monkey at 2:25 PM on August 4, 2014


At the most trivial level, you can take a SHA-1 hash of an image (or any other piece of data) and compare it against a list of known SHA-1 hashes to see if there are hits. This is how you can compare chunks of data in a completely automated way, without actually looking at them, and with a nearly nonexistent collision rate. Law enforcement has a list, pictures that aren't on The List get ignored, and pictures that are get a big honking red flag.

This is precisely the implementation that a friend of mine who used to work for a UK hosting provider used.

The set of checksums of absolutely vile images was given to them by the police, and there were procedures in place for doing a takedown and alerting the police.
posted by Jerub at 2:27 PM on August 4, 2014


That is a classic slippery slope argument.

"We don't need you to type at all. We know where you are. We know where you've been. We can more or less know what you're thinking about... We can look at bad behavior and modify it." -- Eric Schmidt
posted by Mr. Six at 3:50 PM on August 4, 2014 [1 favorite]


[SHA1] is precisely the implementation that a friend of mine who used to work for a UK hosting provider used.

I really hope that the police are using pHash or its ilk, rather than just checking file hashes.
posted by ambrosen at 3:55 PM on August 4, 2014


Or to put it another way, who ever said email (especially unencrypted email), passed through the servers of a for-profit corporation, is private?

Now apply this to phone calls.
posted by dirigibleman at 4:14 PM on August 4, 2014 [4 favorites]


> If it's free, you are the product.

You're the product whether it's free or not. I have a free Yahoo mail account that I use as a spamcatcher, and I read the terms of service and saw where they said right up front "We read your mail to figure out what sort of shite you might want to buy so we can advertise it to you." Well, I pay money to my ISP (was Bellsouth.net, which morphed into AT&T) and as part of that deal they provide me with an email account. Now AT&T has made a deal with Yahoo to provide that paid-for service. When I check my mail now I have to log into Yahoo with my AT&T ID and password. I checked the terms of service for this paid-for account, and it's exactly the same as the free one. "We read your mail to figure out what sort of shite you might want to buy so we can advertise it to you."

No, I never actually see Yahoo's ads (AdBlock+ FTW) and yes I could set up a POP connection and download all my mail to my own machine without ever even seeing the Yahoo logo. But they're still reading it.
posted by jfuller at 4:31 PM on August 4, 2014 [1 favorite]


If you want to defend basic rights, you're going to have to defend some real scoundrels.
posted by anemone of the state at 4:47 PM on August 4, 2014 [5 favorites]


I'm having little difficulty imagining that scenario.

Indeed, a couple is suing Walmart over that scenario -- though in this case it was over pictures taken in for development. Another mother had her son taken away after developing photos which attempted to document abuse by the father.

Of course mistakes will happen in any law enforcement activity, but wow, you can ruin someone's life without even leaving your desk. And, as qxntpqbbbqxl points out, without any trace of evidence of who set you up. What a great way to recruit informants, no?
posted by RobotVoodooPower at 4:56 PM on August 4, 2014 [3 favorites]


> email is really a lot more like an envelope than like a postcard. You have to take active steps to see the content

Since it's code, you only have to take the steps once. It's not like you have to steam the envelope open again every time you want to snoop. Write once, and thereafter it's all plaintext.

In the age of Usenet, if you happened to have access and proper permissions to a big complete newsspool, it was really no trick at all to write a little routine that could scan the whole content for mentions of l'il ole you. It wasn't just magical kibo; lots of folks had something scanning the netnoise for other folks taking their names in vain. Write once, use forever, or at least until Usenet is no longer a thing.
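
(Something along these lines, sketched in Python; the path and the name are stand-ins for whatever archive and vanity search you have in mind.)

    import os

    def mentions(root, name):
        """Walk a directory of plain-text messages and yield the files
        that mention a given name. Case-insensitive, nothing fancy."""
        for dirpath, _, filenames in os.walk(root):
            for fn in filenames:
                path = os.path.join(dirpath, fn)
                with open(path, errors="ignore") as f:
                    if name.lower() in f.read().lower():
                        yield path

    # for hit in mentions("/path/to/newsspool", "your name"): print(hit)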
posted by jfuller at 5:07 PM on August 4, 2014 [1 favorite]


An email provider that *already* tells you that they're scanning your email to decide whether to show ads for viagra, Christian Mingle, or homeopathic morgellons remedies, is also going to check to see whether your attachments' hash values match those of known child porn images? I'm OK with that.

I'd be less OK with some kind of algorithm that tried to detect child porn images by analyzing the picture; I can't believe that would ever work. But checking against a list of hashes like mhoye said? It's way the hell *less* invasive of my privacy to check whether I'm sending a file with the exact digital fingerprint of a child porn image than it is to analyze my keywords to decide what ads to show me. Seriously. WAY less invasive than the stuff they've been doing openly for like a decade or whatever. And way more beneficial to society.
posted by edheil at 5:11 PM on August 4, 2014 [2 favorites]


That is a classic slippery slope argument. And if that happens, then I'll have an issue with it. Until then, no.

I would argue that you would actually have to go out of your way and try really hard to construct anything closer to the platonic ideal of a slippery slope. This is like, that giant slip n slide in utah or something.

The comments above are spot on about this being a deft way to admit they're analyzing stuff people upload not just for ads, but for general content.

This isn't a matter of if, but when. It's like leaving a gun out on the coffee table in front of a little kid, and then saying it's a slippery slope argument to say they might pick it up and play with it. We're standing here staring at one of the closest things i've ever seen to a literal slippery slope that didn't just have a big sign on it.

So yea, i don't know, i think this is a poor and lazy use of just going "oh, that's a slippery slope argument". Especially as someone who feels that gets trotted out a bit too often before much critical thought is applied to the individual circumstances and situation, quite a bit like tone argument.
posted by emptythought at 6:06 PM on August 4, 2014 [7 favorites]


I think it's rather terrifying that the bulk of written, personal communication now is no longer considered "private."

Oh, and if you are one of the few who do bother encrypting your email, you'd better bet you'll be singled out for heightened scrutiny.
posted by Zalzidrax at 6:55 PM on August 4, 2014 [2 favorites]


You're the product whether it's free or not.

There's a difference between services you actually pay for and services provided in a big bundle as sort of an afterthought by companies that are hostile to you anyway. Compare Fastmail:
Data mining and profiling
We do not sell or give information about our users to any third parties. Payments are securely handled via Pin or PayPal; your credit card details are never transmitted to our servers. Pin store your credit card details and address for the purpose of future payments with FastMail, unless you have requested your payment details not to be stored. Pin's privacy policy is available at https://pin.net.au/privacy. PayPal's privacy policy varies depending on your country of residence; you can select your country to find the relevant privacy policy at https://www.paypal.com/webapps/mpp/ua/legalhub-full.

Incoming messages are scanned for the purpose of spam detection unless you disable spam protection for your account. We may also scan some outgoing messages with the same software to prevent people using our service to send spam. Emails you report as spam are automatically analysed to help train our spam filter. Also, if enabled, emails reported as spam are forwarded on to some external email reporting services. These services aim to help monitor and reduce overall spam on the Internet. Currently the services we report to are Return Path and LashBack. These may change in the future. If you don't want this, you can disable the reporting in the FastMail advanced settings.

To make message searching fast, we build an index of your messages (this is a table, just like you would find at the back of a reference book, in which you can look up a word to quickly find the emails in which it appears).

No information from any of these activities is used for any other purpose, or to compile any kind of profile on our users.
posted by vibratory manner of working at 7:01 PM on August 4, 2014 [3 favorites]


[SHA1] is precisely the implementation that a friend of mine who used to work for a UK hosting provider used.
...
I really hope that the police are using pHash or its ilk, rather than just checking file hashes.


Just to add to this: There's been plenty of research into fingerprinting algorithms for images that will survive all kinds of transformations: rotation, scaling, re-compressing, messing with colors, limited cropping, etc. Google is almost certainly not using SHA1.
posted by qxntpqbbbqxl at 10:52 PM on August 4, 2014


So, I saw thumbnails of child porn when using google image search--looking for a car a friend was raving about (I had no idea what the car looked like, 1980 being the cut-off for me being able to identify anything with wheels on, reliably). I have no idea what the connection might have been between my search terms and the images, (and never discovered, since I wasn't going to click on them), but there they were.

Was I guilty of something? Was google? And if it was google, what's the penalty for them? Do they get a detective assigned and Eric Schmidt gets thrown in jail? If they can detect them in their user's email, how come they cannot detect them in their own search engine? If I was guilty of something, that's insane--you could jail anyone at any time simply by injecting an image into a web page they were viewing or send them an email.

Why do corporations get a free pass on things regular humans don't?
posted by maxwelton at 11:03 PM on August 4, 2014


The mundane answer is that if they did nothing about it, they probably wouldn't get a free pass once a significant story hit the headlines.

It's not a perfect system so it doesn't stand to reason that a single known bust involving email should imply 100% success with filtering it out on GIS, and GIS provides a simple mechanism to report images to help them filter out and identify badness before it gets out of control.

Similar to takedown notices on YouTube -- they were getting a free pass for quite some time, but eventually had to start following takedown notices, which escalated after their acquisition by Google, to ensure that Google would be seen as taking legitimate efforts to reasonably curb copyright infringement.
posted by aydeejones at 2:01 AM on August 5, 2014


So, I saw thumbnails of child porn when using google image search--looking for a car a friend was raving about [....] Was I guilty of something? Was google? And if it was google, what's the penalty for them?

How long ago was this? I bet Google uses image matching to improve search results (the search was for tourism in Paris, and here's a page with lots of pictures of the Eiffel Tower). I wouldn't be surprised if Google started matching images to a hashed list of child porn just so that it could avoid serving the results in its Image Search. And they may very well use the same matching software for all their searches, including email (to provide contextual advertising) which means that they will inevitably identify email with child pornography. At that point they have to do something - either deliberately ignore the crime, or report it. So the privacy breach actually took place back when they decided to scan emails; this is just the inevitable conclusion.
posted by Joe in Australia at 3:03 AM on August 5, 2014


I know of one major western country where a major ISP and the local child-pornography-fighting foundation had big reservations about deploying a list of hashes of URLs (mostly pointing to servers in countries that could not care less about it), for the simple reason that the local version of the RIAA had stated that the moment providers started using that list to block sites, they would go to court to demand that the same ISPs block a list of URLs to their liking.

It's not so much a slippery slope as a slippery cliff.
posted by DreamerFi at 6:37 AM on August 5, 2014 [2 favorites]


if you are one of the few who do bother encrypting your email,

One of the greatest failures of Internet development is the lack of a usable end-to-end encryption and authentication mechanism for email. We've had the pieces of it for 23 years now, with PGP, but no one has ever succeeded in building a conveniently usable version of it. Email could be way more secure than a letter in an envelope, not to mention a postcard, and it's a failure of product design that we don't have it.

Hosted mail like Hotmail or Gmail was the final nail in the coffin for encrypted email. There's way too much value to the consumer in letting a company like Google access the plaintext of mail now; end-to-end encryption foils that.
posted by Nelson at 7:57 AM on August 5, 2014 [1 favorite]


The article suggests the way the picture was identified as child porn was that the attached picture was checked against a database of known child porn pictures, so Google isn't really scanning content just to create targeted advertising; they're specifically checking to see if you're involved in criminal activities. I'm floored by the amount of surveillance we live with today.
posted by xammerboy at 11:08 AM on August 5, 2014


Just to add to this: There's been plenty of research into fingerprinting algorithms for images that will survive all kinds of transformations: rotation, scaling, re-compressing, messing with colors, limited cropping, etc. Google is almost certainly not using SHA1.

Especially since google images search itself has a quite advanced "Search for instances of this image here and similar images" function in which you can upload or link to a file and have it compare.

It isn't a simple hash or identical-match search. It's very, very smart. I would imagine that their non-public search and comparison tools are even more powerful than what they're willing to show their hand on.

This is not a fight i would want to go up against google on. Google is essentially the US armed forces of information processing. They have pretty much unmatched algorithms, know how, minds, and even just raw computing power they can throw at things.

Just the tech they designed to make their self driving car not run over a cat is probably beyond what you could imagine, as far as figuring out how to identify a modified photo against others. And they seem to be very strong at moving tech in to parallel uses.

Hate them if you want, there's plenty of valid reasons, but they're like mike tyson in the 80s right now at this.
posted by emptythought at 4:09 PM on August 5, 2014


How long ago was this?

Hm. At least two years ago, maybe three?
posted by maxwelton at 7:28 PM on August 5, 2014

