Sure, that might lead to a dystopian future or something, but
January 18, 2020 8:43 AM   Subscribe

SLNYT: The Secretive Company That Might End Privacy As We Know It By Kashmir Hill
The system — whose backbone is a database of more than three billion images that Clearview claims to have scraped from Facebook, YouTube, Venmo and millions of other websites — goes far beyond anything ever constructed by the United States government or Silicon Valley giants.
Until now, technology that readily identifies everyone based on his or her face has been taboo because of its radical erosion of privacy.
“The weaponization possibilities of this are endless,” said Eric Goldman, co-director of the High Tech Law Institute at Santa Clara University. “Imagine a rogue law enforcement officer who wants to stalk potential romantic partners, or a foreign government using this to dig up secrets about people to blackmail them or throw them in jail.”
posted by Monochrome (66 comments total) 36 users marked this as a favorite
 
must. crush. capitalism.
posted by entropicamericana at 9:03 AM on January 18, 2020 [23 favorites]


Rogues? Who’s worried about rogues?!
posted by chavenet at 9:05 AM on January 18, 2020 [18 favorites]


rogue law enforcement officer

i.e. all of them.
posted by Greg_Ace at 9:09 AM on January 18, 2020 [23 favorites]


ACAR
posted by Rust Moranis at 9:20 AM on January 18, 2020 [17 favorites]


While the company was dodging me, it was also monitoring me. At my request, a number of police officers had run my photo through the Clearview app. They soon received phone calls from company representatives asking if they were talking to the media — a sign that Clearview has the ability and, in this case, the appetite to monitor whom law enforcement is searching for.
posted by stevil at 9:26 AM on January 18, 2020 [30 favorites]


I'm thinking that this company didn't clear all those photos they scraped. Wonder how well they'd weather a few thousand copyright lawsuits.
posted by NoxAeternum at 9:28 AM on January 18, 2020 [10 favorites]


One reason that Clearview is catching on is that its service is unique. That’s because Facebook and other social media sites prohibit people from scraping users’ images — Clearview is violating the sites’ terms of service.
posted by little onion at 9:36 AM on January 18, 2020 [3 favorites]


$10 says they will accidentally leave their photo archive on an unsecured S3 bucket, enterprising pirates will steal it, and it will be available to torrent for everyone
posted by BungaDunga at 9:39 AM on January 18, 2020 [37 favorites]


To be clear, I am not saying this would be a good thing, but once a crappy startup with 5 employees builds a database like this, a full leak of the archive just seems inevitable
posted by BungaDunga at 9:46 AM on January 18, 2020 [9 favorites]


The young entrepreneur himself states the core challenge: “Laws have to determine what’s legal, but you can’t ban technology."

Many science fiction authors and others have argued that we are sort of returning to 'small town living', where everyone knows everything about everyone else. Maybe the best achievable approach is to demand these tools are available to everyone, not just government agencies?
posted by PhineasGage at 9:46 AM on January 18, 2020 [1 favorite]


The idea you can't ban technology is ridiculous. Of course you can ban technology!

For example, if you set up a pirate radio station, you will get a knock on the door from some very forceful FCC agents...
posted by BungaDunga at 9:50 AM on January 18, 2020 [13 favorites]


You could almost label the guy a cartoon villain-level CEO (a failed male model working with Republican operatives to build a dystopian all-seeing face tracking software!) except for the fact that he hasn't really thought through any of this. He clearly doesn't have some kind of master plan except for building up his fat stacks as fast as possible.

But some portion of this does echo for me what happened with Adam Neumann (another wacko cartoon villain CEO archetype, though this fascinating opinion piece might lead you to re-consider) and WeWork and tech writ large (hopefully) in that what is going on is so egregious that there MUST be some kind of backlash both public and regulatory (right? RIGHT?). Basically, our capitalist system has encouraged these enterprising young lads to push it to the absolute edge of legality and now the system has to course correct (right????).
posted by jng at 9:58 AM on January 18, 2020 [7 favorites]


I'm thinking that this company didn't clear all those photos they scraped. Wonder how well they'd weather a few thousand copyright lawsuits.

You might want to check the fine print you agree to on all those sites if you believe this to be a possibility.

Hint: the copyright owner is no longer you.
posted by sideshow at 9:59 AM on January 18, 2020 [5 favorites]


Closing down pirate radio stations is not banning am/fm technology.
posted by PhineasGage at 10:07 AM on January 18, 2020 [3 favorites]


CV Dazzle
Camouflage from face detection.

CV Dazzle explores how fashion can be used as camouflage from face-detection technology, the first step in automated face recognition.

The name is derived from a type of World War I naval camouflage called Dazzle, which used cubist-inspired designs to break apart the visual continuity of a battleship and conceal its orientation and size. Likewise, CV Dazzle uses avant-garde hairstyling and makeup designs to break apart the continuity of a face. Since facial-recognition algorithms rely on the identification and spatial relationship of key facial features, like symmetry and tonal contours, one can block detection by creating an “anti-face”.
posted by gucci mane at 10:08 AM on January 18, 2020 [14 favorites]
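The "tonal contours" point above can be made concrete with the Haar-like features that classic (Viola-Jones style) face detectors use. The sketch below is a toy illustration using only the standard library, not a real detector: it shows how a two-rectangle feature responds to the expected dark-eye/bright-cheek contrast, and how flattening that contrast (the "anti-face" idea) zeroes out the response.

```python
# Toy illustration of why classic face detection keys on tonal contrast:
# a two-rectangle "Haar-like" feature compares the mean brightness of
# adjacent regions. Makeup that removes or inverts the expected
# dark/light pattern drives the feature response toward zero, so the
# detector cascade never fires. This is a sketch, not a real detector.

def integral_image(img):
    """Summed-area table: ii[y][x] = sum of img[0..y][0..x]."""
    h, w = len(img), len(img[0])
    ii = [[0] * w for _ in range(h)]
    for y in range(h):
        row_sum = 0
        for x in range(w):
            row_sum += img[y][x]
            ii[y][x] = row_sum + (ii[y - 1][x] if y > 0 else 0)
    return ii

def rect_sum(ii, x0, y0, x1, y1):
    """Sum of pixels in the inclusive rectangle, via 4 table lookups."""
    total = ii[y1][x1]
    if x0 > 0:
        total -= ii[y1][x0 - 1]
    if y0 > 0:
        total -= ii[y0 - 1][x1]
    if x0 > 0 and y0 > 0:
        total += ii[y0 - 1][x0 - 1]
    return total

def eye_cheek_feature(img):
    """Haar-like feature: bright lower band minus dark upper band."""
    ii = integral_image(img)
    h, w = len(img), len(img[0])
    upper = rect_sum(ii, 0, 0, w - 1, h // 2 - 1)   # "eye" band
    lower = rect_sum(ii, 0, h // 2, w - 1, h - 1)   # "cheek" band
    return lower - upper

# A 4x4 patch with a dark eye band over a bright cheek band...
face_like = [[20, 20, 20, 20],
             [20, 20, 20, 20],
             [200, 200, 200, 200],
             [200, 200, 200, 200]]
# ...versus the same patch with the contrast "dazzled" away.
dazzled = [[110, 110, 110, 110],
           [110, 110, 110, 110],
           [110, 110, 110, 110],
           [110, 110, 110, 110]]

print(eye_cheek_feature(face_like))  # strong positive response: 1440
print(eye_cheek_feature(dazzled))    # flat response: 0
```

Real detectors cascade thousands of such features, but the principle is the same: no expected contrast pattern, no detection.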


Usually the fine print just grants copying rights to Facebook for the purpose of displaying your photos on the site, making backups, and sometimes advertising with it. You're still the copyright owner and could still, for instance, upload your photo to Twitter and give them the same rights to the photo as you gave Facebook.

If you had actually handed over all your rights to Facebook, you would be violating Facebook's copyright by uploading your photo to Twitter or printing it for a photo album.
posted by BungaDunga at 10:08 AM on January 18, 2020 [13 favorites]


Closing down pirate radio stations is not banning am/fm technology.

If the government decided to revoke all broadcast licenses tomorrow, legal broadcasting would stop happening. Same thing if you banned police from querying bootleg facial recognition databases. The practical effects are pretty important even if the technology per se still exists.
posted by BungaDunga at 10:13 AM on January 18, 2020 [2 favorites]


The young entrepreneur himself states the core challenge: “Laws have to determine what’s legal, but you can’t ban technology."

The proper response to this is to tell him that technology does not supersede the law - while smacking him with a legal sledgehammer so hard that everyone gets the message.

It is past time that we told the tech industry that running ahead of the law means leaving its protection.
posted by NoxAeternum at 10:14 AM on January 18, 2020 [38 favorites]


Maybe the best achievable approach is to demand these tools are available to everyone, not just government agencies?

As someone who moved four times in one year to make myself unfindable by someone who threatened to kill me or harm my family if I ever left him, I do not want this.

Even now, I have to have the awkward conversation with photographers at work events as to why I do not want to be in group photos if there is any possibility that the photo will ever be posted on our work site or social media. Maybe I don’t have to tell them that I was on the receiving end of domestic violence, but I worry that if I don’t they will not understand what the stakes are for me.

I have profoundly strong feelings on this subject.
posted by Silvery Fish at 10:29 AM on January 18, 2020 [81 favorites]


There's also the right to control the use of one's likeness for commercial purposes -- which is distinct from copyright -- and statutory privacy regimes (like California's, which has just gone into effect).

It was probably inevitable that this would be developed and marketed by someone, but how society responds to it is a choice.
posted by snuffleupagus at 10:32 AM on January 18, 2020 [2 favorites]


all i'm going to say about pirate radio is so far, it's not been that easy for the FCC to shut it down

and this is technology that is being used quite publicly, so that anyone with a radio can hear it

a database of people's faces along with the programs to sort through it and identify individuals? - anyone could have that and use it quite secretly - any member of metafilter might have it and we would never know

the only way to ban that is to ban computers

no, you can't ban technology - busting individual people for using it isn't banning it
posted by pyramid termite at 10:33 AM on January 18, 2020 [2 favorites]


a database of people's faces along with the programs to sort through it and identify individuals? - anyone could have that and use it quite secretly

Anyone could be paying for access to it, if allowed by the gatekeepers, but this isn't a ZIP of rainbow tables. It requires significant infrastructure, data storage and computing power. It has been driven by investment and its purpose is to make money. It could be shut down, as a commercial enterprise, were there sufficient consensus and will to do so.
posted by snuffleupagus at 10:35 AM on January 18, 2020 [10 favorites]
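The "significant infrastructure" point is easy to put rough numbers on. Every figure below is an assumption for illustration (the only number from the article is the three-billion-image count):

```python
# Back-of-envelope arithmetic on what a 3-billion-image face database
# implies; all per-item sizes are assumed round numbers, not measurements.

images = 3_000_000_000

# Raw photos at ~100 KB apiece:
photo_bytes = images * 100 * 1024
print(f"photo storage: ~{photo_bytes / 1e12:.0f} TB")        # ~307 TB

# Face embeddings (say 512 float32 values per face) are far smaller:
embedding_bytes = images * 512 * 4
print(f"embedding index: ~{embedding_bytes / 1e12:.1f} TB")  # ~6.1 TB

# A naive brute-force nearest-neighbor scan of the embeddings at,
# say, 1e9 vector comparisons per second per machine:
seconds_per_query = images / 1e9
print(f"naive scan: ~{seconds_per_query:.0f} s per query per machine")
```

Hundreds of terabytes of photos is datacenter territory, but the searchable index itself fits on a handful of disks, which is part of why a leaked copy would be so dangerous.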


snuffleupagus: "statutory privacy regimes"

See also Europe's GDPR, which California used as a model.
posted by chavenet at 10:46 AM on January 18, 2020 [1 favorite]


no, you can't ban technology - busting individual people for using it isn't banning it

This is "true" in the same sense that "DRM doesn't work (because it doesn't stop piracy forever)" or "forcing 8chan/kun/whateverthefucktheyaretoday to the deep web doesn't make them go away" are "true". Yes, it's not explicitly "banning" it, but making it clear that openly scraping websites for photos is opening the door to legal issues will cause the money to dry up, because nobody is going to fund a company whose main product is lawsuits against it.
posted by NoxAeternum at 10:50 AM on January 18, 2020 [19 favorites]


Well they've cleverly courted connected people and (mostly GOP) politicians, so I wouldn't bank on laws limiting this sort of thing happening soon.
posted by sjswitzer at 10:53 AM on January 18, 2020 [2 favorites]


The system — whose backbone is a database of more than three billion images that Clearview claims to have scraped from Facebook, YouTube, Venmo and millions of other websites
Those social media things just keep on giving, don't they?
posted by Kirth Gerson at 10:59 AM on January 18, 2020 [6 favorites]


all i'm going to say about pirate radio is so far, it's not been that easy for the FCC to shut it down

They know where the hardware is. The only thing stopping them seizing it through something like an in rem case seems to be the legal regime: "If they could catch someone red-handed, they could obtain a signature on the infamous FCC “notice of apparent liability,” the agency’s equivalent of serving someone with a legal document, a way to nab pirates." Just because the current laws are maybe not strong enough doesn't mean the practical ability to do it doesn't exist: send in the FBI, seize everything, and tie it up in a court case until the operator either admits the gear is theirs or the government wins by default.
posted by BungaDunga at 10:59 AM on January 18, 2020


I was thinking it might be time to make niqabs extremely fashionable, since dazzle makeup is much more of a pain to put on and take off. Bonus: bothering bigots.
posted by bile and syntax at 12:02 PM on January 18, 2020 [4 favorites]


On the one hand this article reads like an exposé, but I wonder if it isn't also a sort of dog whistle form of advertising. How many new customers will Clearview get because some people see "might lead to a dystopian future" not as a bug but as a feature?
posted by swr at 12:14 PM on January 18, 2020 [9 favorites]


SWR - of course. It is literally impossible to buy publicity this good.

The article is basically:

(1) cops from all over saying it's an amazingly effective, inexpensive tool built on broadly sourced new data.

(2) civil libertarians expressing alarm - so obviously predictable that even the reporters and editors could barely muster interest in it.

At this point, what police department -- and, probably more to the point, what intelligence / domestic security agency and corporate security and investigation division -- wouldn't want this?

I do wonder how Thiel stays on the board of FB as the key angel investor in a company that, whatever else it did, built its entire business on violating the Instagram and Facebook TOS.
posted by MattD at 12:38 PM on January 18, 2020 [14 favorites]


This technology isn't just inevitable, it's already happening. Not just in a little-known American startup selling to US law enforcement. As the article says Facebook and Google could be doing this right now if they want, both have limited deployments of face recognition in their photo products. The only reason they haven't is they know it's creepy and the benefit doesn't yet outweigh the cost. I'm told early versions of Facebook's image labeling did include very accurate and broad name suggestions, which they disabled after negative feedback.

What's missing today is easy public access to facial recognition systems. But I wrote a blog post last week about how Yandex's image search is also apparently doing facial recognition, and you can use it right now, for free, to do good-enough facial recognition of anyone with a labelled photo on the web. Yandex's database is focused on Russian websites so its US database isn't as huge as Google or Facebook's, but it's good enough to find a lot of people. There's also FindClone, which searches VKontakte images (the Russian Facebook equivalent).

But the huge deployment of facial recognition is in China, particularly Xinjiang. It's been extensively documented, for instance here. Right now the federal US government does this kind of thing too, somewhat, but in limited deployments like at border crossings.

I'm of the fatalistic bent that technology like facial recognition cannot usefully be outlawed. That it's another inevitable erosion of privacy, like ad tracking databases and reverse phone number searches. You can try to outlaw the application of facial recognition but I fear it's probably too late and that there'd not be enough support for a privacy position. It's just too valuable a capability to marketers, to police. I suspect facial recognition systems will be as commonplace as reverse phone books in just a few years.
posted by Nelson at 12:51 PM on January 18, 2020 [10 favorites]


So what's wrong with bringing back face veils for everyone? Start now before they make it illegal. Wide brim hats with semi-transparent veils seem pretty good for this. You don't want to wear something that doesn't look good? I do. And so would a lot of other people.

(Added benefit [to me] is to piss off all the security types while it's playing out before they ban it)
posted by aleph at 1:40 PM on January 18, 2020 [1 favorite]


It's common for protestors in Hong Kong to wear masks to avoid Chinese facial recognition. The practice has been outlawed a couple of times but the ban is sometimes ignored.
posted by Nelson at 2:28 PM on January 18, 2020 [3 favorites]


The ability to amass, analyze and cross-correlate huge amounts of data has been a real game-changer. The science behind face recognition was only waiting to be discovered, but it could never have been done without these huge datasets.

In the early days of the internet I was involved in a project to create a large collaboration system. I figured I needed to bone up on some of the technologies I'd need so I bought a book (as you did, then) titled "Managing Gigabytes."

I still have it on my bookshelf for the irony alone.
posted by sjswitzer at 2:45 PM on January 18, 2020 [6 favorites]


We need to use the freedom we have now (before it's gone) to spike the facial recog wherever we can. We can control what we wear (now) so poison the reliability of the tracking of face recog by lots of people wearing it semi-randomly. Won't be able to stop the use of it at choke points where the veil won't be allowed (but push at those to minimize them legally). Start making the data sets as poisoned as possible.

Flash mobs that go into a building wearing cloaks and veils => everybody exchange => go back out to street/cameras
posted by aleph at 3:12 PM on January 18, 2020 [5 favorites]


Usually the fine print just grants copying rights to Facebook for the purpose of displaying your photos on the site, making backups, and sometimes advertising with it. You're still the copyright owner and could still, for instance, upload your photo to Twitter and give them the same rights to the photo as you gave Facebook.

Commercial entities should be required to track the provenance of the data they claim as assets and that they use in their business. Model it on Know Your Customer banking laws, maybe something like "Know Your Uploader." This would create a chain of custody that can be traced for infractions as well as providing a map for revocation. And this goes for everything, commercial email, where robocallers got their phone numbers...all of it.

Companies should be required to have a license to our data, which can be suspended or revoked. Not only should people be able to revoke Facebook et al's usage of their information, all (all) derivative uses of that information should be revocable. That's right: if you revoke Facebook or Amazon's use of your data, they have to recalculate all of their models, as well as derivatives of those models.

Every step in technology in modern public life encroaches disproportionately on people with fewer resources and less power in general. We should be allowed to check out of this regime just like we can decline to have a driver's license. The only reason this isn't the case is that Bezos, Thiel, and Zuckerberg are bigger political donors than the rest of us.
posted by rhizome at 3:41 PM on January 18, 2020 [8 favorites]
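The revocation idea above implies a provenance graph: every derived artifact (dataset, model, index) links back to its sources, and revoking one source transitively invalidates everything downstream. A minimal sketch, with entirely hypothetical names (no real system works this way today):

```python
# Sketch of a "Know Your Uploader" provenance ledger: record which
# sources each artifact was built from, and cascade revocations
# through every derivative. All class and artifact names here are
# invented for illustration.

from collections import defaultdict

class ProvenanceLedger:
    def __init__(self):
        self.derived_from = defaultdict(set)   # artifact -> its sources
        self.derivatives = defaultdict(set)    # source -> its artifacts
        self.revoked = set()

    def record(self, artifact, sources):
        """Register that `artifact` was built from `sources`."""
        for s in sources:
            self.derived_from[artifact].add(s)
            self.derivatives[s].add(artifact)

    def revoke(self, source):
        """Revoke a source and, transitively, all its derivatives."""
        stack = [source]
        while stack:
            item = stack.pop()
            if item in self.revoked:
                continue
            self.revoked.add(item)
            stack.extend(self.derivatives[item])

    def is_valid(self, artifact):
        return artifact not in self.revoked

ledger = ProvenanceLedger()
ledger.record("scraped_photos", ["alice_upload", "bob_upload"])
ledger.record("face_model_v1", ["scraped_photos"])
ledger.record("search_index", ["face_model_v1"])

ledger.revoke("alice_upload")  # Alice withdraws consent...
print(ledger.is_valid("face_model_v1"))  # False: model must be rebuilt
print(ledger.is_valid("search_index"))   # False: so must its derivatives
```

The expensive part is exactly the point of the proposal: a model trained on revoked data is itself tainted and has to be retrained without it.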


(1) cops from all over saying it's an amazingly effective, inexpensive tool built on broadly sourced new data.

(2) civil libertarians expressing alarm - so obviously predictable that even the reporters and editors could barely muster interest in it.

At this point, what police department -- and, probably more to the point, what intelligence / domestic security agency and corporate security and investigation division -- wouldn't want this?


"Beautiful, unethical, dangerous."
posted by snuffleupagus at 4:05 PM on January 18, 2020


Nuclear weapons exist as a technology. They are rightly recognised as very dangerous.

Building them requires access to a series of capital intensive technologies that are challenging to string together effectively. Using them has horrific consequences and is generally regarded as a Very Bad Thing by most people.

Mass surveillance is similar in multiple respects, and should be banned for similar reasons.

Mass surveillance has no legitimate place in a liberal democracy.
posted by But tomorrow is another day... at 4:18 PM on January 18, 2020 [7 favorites]


I might add, SETEC ASTRONOMY came to mind first. But this is more on the Dark Knight omni-surveillance level of disturbing.

See also Enemy of the State.

I feel like the jokes are starting to bend backwards. This is the cyberpunk future we were promised. Thanks, I hate it.
posted by snuffleupagus at 4:29 PM on January 18, 2020 [8 favorites]


Yeah, I'd prefer the jetpack, thanks...
posted by PhineasGage at 4:32 PM on January 18, 2020 [6 favorites]


Nuclear weapons ... Mass surveillance is similar in multiple respects

But it's not, not really. I can build, from scratch, a basic functioning facial recognition system on my home gamer computer. I can build a very good one with, oh, 100 computers of similar capability. The harder part is the training data, getting ahold of lots of labelled photos of people. But that's not so hard either. Doubly so if you're willing to just outright lift the photos without permission by, say, scraping Facebook or VKontakte. Or the web, which is what Yandex did. Anyway it's doable and doesn't take anywhere near a nation-state's resources to do. Unlike a nuclear weapon.
posted by Nelson at 4:36 PM on January 18, 2020 [6 favorites]


Yeah, I'd prefer the jetpack, thanks...

We can't even manage a future without potholes!
posted by rhizome at 4:52 PM on January 18, 2020


I can build a very good one with, oh, 100 computers of similar capability.

What does "very good" mean in that context? Fast? Accurate? Both?

What's 100xAverageGamerComputer in terms of CPU cores these days? 600? 1200?

Just trying to get a sense of scale.
posted by snuffleupagus at 4:55 PM on January 18, 2020


The ability to amass, analyze and cross-correlate huge amounts of data has been a real game-changer.
We have seen the future, and it belongs to those who can collate and cross-reference.
posted by Pouteria at 5:05 PM on January 18, 2020 [3 favorites]


I hope someone is working on something like this to fool AI face recognition!
posted by monotreme at 5:15 PM on January 18, 2020


Then there's reassuring news like The study of dozens of facial recognition algorithms that showed "false positive" rates for Asian and African American faces as much as 100 times higher than for whites.

The product works great on Freedom Loving Cis White Males, so there's nothing to worry about here Citizens!
posted by monotreme at 5:24 PM on January 18, 2020 [4 favorites]


"Mass surveillance has no legitimate place in a liberal democracy."

I suggest that mass surveillance should be limited to cases like body cams for cops, and whoever else in the public sector needs that kind of surveillance. I don't object to the data going through some kind of filtering (that has to be worked out), but I don't want it defined by the police union (or equivalent).
posted by aleph at 6:07 PM on January 18, 2020


Should probably remember the backers ID'ed in the article, just in case.
posted by ZeusHumms at 6:18 PM on January 18, 2020


Let's all print off and wear masks with faces from This Person Does Not Exist and pit machine learning against itself.
posted by thebots at 6:37 PM on January 18, 2020 [2 favorites]


What's 100xAverageGamerComputer in terms of CPU cores these days? 600? 1200?

I am somewhat talking out my ass about "100x gamer computers" being enough for high quality facial recognition. I think that's the right order of magnitude but I haven't actually done the work. Maybe it's 1000, I dunno. Also total hand waving about how long those computers need to run. In reality you're probably gonna lease a bunch of machines in the cloud to do this work. One last fudge in favor of "this is not a hard problem": I'm talking about computers needed to build the recognition model. Actually executing the model once it's trained to recognize individual faces is much cheaper. Your cell phone already runs a version of that just to let you unlock the stupid thing.

But by "gamer computer" I mean something with a modern Nvidia GPU, which is the important part for machine learning. A GeForce RTX 2080, the current near top of the line, costs about $700 retail. It does ~10 TFLOPS of computing (single precision). For comparison the first 10 TFLOPS supercomputer was built around 2001. It's astonishing how much compute power we've developed in graphics cards. And machine learning systems eat that stuff up very efficiently.
posted by Nelson at 7:34 PM on January 18, 2020 [6 favorites]


Should probably remember the backers ID'ed in the article, just in case.

Just in case the pitchforks and torches get hungry?
posted by Greg_Ace at 8:25 PM on January 18, 2020 [1 favorite]


Hint: the copyright owner is no longer you.

No, the terms grant Facebook or Whatsapp or whomever a license to distribute the images you upload, possibly to people you don't expect or in ways you don't expect. That does not make them the owner of the work. Having licensed your images to Facebook doesn't stop you from enforcing your copyright on the images you create against people who obtained them without a valid license, like by scraping them from Facebook. (Assuming they are distributing those images, which they appear to be when they return a search result)
posted by wierdo at 9:24 PM on January 18, 2020 [1 favorite]


We’re living through the effects of the Koch brothers doing their best to purchase the country they wanted, and it’s been downright terrifying. I’m not really interested in seeing the world Thiel is trying to buy for himself, but I don’t doubt the time for any of us to have a say in the matter has long passed.
posted by Ghidorah at 10:33 PM on January 18, 2020 [2 favorites]


We have seen the future, and it belongs to those who can collate and cross-reference

I thought the future would be cooler.
posted by [insert clever name here] at 11:17 PM on January 18, 2020 [1 favorite]


This is really something godawful, isn't it? Our lives, save a (mostly) self-selected few, were "footprints in the snow", now we're all "footprints in the fast-drying, diamond-hardened cement".
posted by Chitownfats at 12:09 AM on January 19, 2020 [4 favorites]


"As the article says Facebook and Google could be doing this right now if they want, both have limited deployments of face recognition in their photo products."

Facebook and, especially, Google could know almost everything there is to know about 90% of the population of the US. They already have the data or could virtually flip a switch to collect what they don't. They could know who you literally sleep with each night, and in which room in which house, and at what times. Google could almost instantly assemble a database of photographs and speech of every Android user, couple that to recognition software, and track the identity, time, and location of every person near an Android phone, including non-Android users when augmented by the data available from their trawling of the web. They could know all your social relationships, no matter how clandestine, link this to financial and shopping activity, employment and educational history, and with a non-trivial amount of confidence predict where you will be in a week or a month or a year, and with whom.

Universal surveillance already exists as recorded data; a surprisingly large amount of that data is already collated. Pattern recognition in the form of the burgeoning neural net tech popularly referred to as "AI" has made it practical to put such vast amounts of disparate data to use. As Nelson mentions, though training a neural net is computationally expensive, using it is relatively cheap. The data itself is the most valuable and difficult to acquire piece of this — and Google and Facebook have created a technological ecosystem in which everyone just offers this up to them, gratis.

The existing data could be legally locked-down, though how effective that would be is an open question. But this is the "information age" and we individually and collectively create an enormous amount of value by constantly generating and using this information, we're not going to stop and won't want to stop, and it will be increasingly easy to regenerate anything previously locked away. The potential for misuse is inescapable.

I've been arguing for twenty years that rather than fretting about the technology and devising schemes to slow its advance, what we've desperately needed to do is to get ahead of it both legally and culturally. It's a short-lived accident of history that we've been accustomed to the sanctity of privacy primarily as a function of the practical difficulty of violating it. That is not the historical norm. Aggressive legal regimes and deeply-ingrained codes of etiquette are the only ways to effectively protect privacy going forward. We're tardy in comprehending and addressing this; unless we act with urgency, things will get much worse before they get better ... if they get better.
posted by Ivan Fyodorovich at 10:30 AM on January 19, 2020 [11 favorites]


But how do we, as a culture, get ahead of the destruction of privacy via inevitable technologies?

The only really novel approach I've seen is that articulated in David Brin's book The Transparent Society (shorter form in this Wired article). In a nutshell he argues we should give up on trying to preserve privacy and instead embrace the panopticon. But make sure it is equal and universal. The government tracks regular people's income to make sure they pay taxes, so let's be sure we also track government officials and rich people's income the same way, to make sure they also pay taxes. This argument was not popular when it was made and has not aged well; it seems impossible in the face of power imbalances. But at least the argument was novel.

The other sorta getting ahead of it is serious privacy efforts like Europe's GDPR with novel ideas like the right to be forgotten. But it's not clear the GDPR is having a significant effect in practice; it may be too little, too late.

One reaction among the Young People is reportedly to prefer media, etc., that are ephemeral. Forget Facebook and its permanent archive; use ephemeral media like SnapChat instead. But I'm not convinced that's really a big enough trend to shift the culture. And even if it is, SnapChat is as much a tool of surveillance capitalism as everything else.

I brought up reverse phone books awhile back. It used to be you had no way of knowing the name of who owned a phone number; numbers were anonymous. That hasn't been true for ~40 years now, but people still act like it is, even today. As a society we've survived this erosion of privacy somehow. Maybe there's a lesson in that that can be applied to the way we now just about have a way of knowing the name for a face?
posted by Nelson at 10:45 AM on January 19, 2020 [4 favorites]


As a society we've survived this erosion of privacy somehow.

Signs unclear, ask again later.
posted by snuffleupagus at 10:49 AM on January 19, 2020 [3 favorites]


I think what makes this particularly disgusting is that a lot of low-security online presences are now mandatory for employment, or in other spaces. I've been unemployed for just about a year now, and trying to get work again without a very revealing LinkedIn account (in terms of the raw data available on my public profile), along with a paired picture, has proven pretty difficult. I've tried to anonymize my account as much as possible, but my younger partner (millennial-verging-on-Gen-Z) has suggested that doing so might be hindering my odds. This database and accompanying toolset is exactly the kind of horrifying bullshit that lurks halfway down that road. The proverbial end of that road is something I hardly want to imagine.

These things are all the more on my mind since the collapse of an online friendship. A pseudo-anonymous, spurned creep in another country decided to make threats to "ruin my life" in retaliation. I was a fool to trust this person, and now they have my name, private photos of me, and a handful of handles previously associated with me. I asked a friend familiar with infosec what was the worst this guy could do, and there were quite a few options available. This wouldn't all be so bad if socializing, gaining employment, and networking for my art and hobbies didn't all require a leaky online presence.

For the youngest generation now reaching adulthood, the transition is already complete. My partner has internalized that privacy is elusive, or sometimes a risible expectation. They happily post intimate details in publicly accessible places. When I talk about personal privacy, they groan and roll their eyes. I got in trouble for calling Gen Z and late Millennials "raised-to-be bootlickers" recently; however, it's quite true. This isn't to attribute agency to them, as Gen Z kids are hardly at a point where they can be making rational decisions. It's just sad to see that they've been conditioned, not by their parents, but by the corporate-controlled environment. Privacy is already dead.
posted by constantinescharity at 11:05 AM on January 19, 2020 [6 favorites]


There's no way to put this genie back into its bottle, so can there be a way to define who gets to use it to make a wish? Is there another, similar technology we can examine for guidance?

We are left with a rapidly fading illusion of privacy. Actual privacy may have already disappeared. We've gone quickly through phases where some were able to live "off the grid" and others were able to minimize their footprint, but even that illusion was snuffed out by the ubiquity of online banking and cash-register metadata.

Our last useful phase was the anonymity of the herd; that lasted less than a generation. What moral imperative do we cite now? It's clear that religion is useless as an overarching guide, and laws always lag behind crime, and now crime is given an exponential head start with each iteration of gadgetry. I say "crime," but I realize that I'm thinking more about some moral center than legal recourse. It's just that I hate to use phrases like "bad deeds" because I don't want to have to deal with all that confusing relativism.

This thread has been blessed with opinions that range from the evil individual to the soul-less corporation, to the Orwellian state. Is freedom chained to privacy? Does an alternative to Dystopia exist?

Oh, wait. I know. We can make users of this technology click the box (on the download page) that makes them swear that they are over 18 years old, and they promise not to use the program for any evil purpose.

Either that, or I have run out of ideas.
posted by mule98J at 11:46 AM on January 19, 2020 [4 favorites]


> Mass surveillance is similar in multiple respects

Facial recognition is the plutonium of AI
posted by Monochrome at 5:09 PM on January 19, 2020 [2 favorites]


Cryptography and security expert Bruce Schneier shares his opinion in the NYTimes
The problem is that we are being identified without our knowledge or consent, and society needs rules about when that is permissible.
Similarly, we need rules about how our data can be combined with other data, and then bought and sold without our knowledge or consent. ...
Finally, we need better rules about when and how it is permissible for companies to discriminate.
I think the "consent" thing is 100% useless garbage. Currently what we get for "consent" in online products is "by using this software, you consent to us tracking all sorts of creepy details we don't disclose about you." Often it's backed by an incomprehensible 20-page terms of service that you can't meaningfully opt out of, anyway. GDPR added "consent" for tracking cookies, and all that resulted in is annoying popups that everyone clicks "yes" on.

What would work is meaningful, opt-in consent. Maybe a system where my face is not allowed to be recognized unless I personally consented to that actual application of facial recognition. Of course no business would ever agree to that, much less a surveillance-happy government. Instead we have opt-out consent where 99% of people never make a conscious choice. Or no consent at all, but the effect is the same.
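To make the "meaningful, opt-in consent" idea concrete, here's a toy default-deny sketch. Everything here (the class, the purpose strings, the IDs) is invented for illustration; no real system works this way, which is rather the point:

```python
# Hypothetical opt-in consent gate: a face match is only revealed if the
# matched person has explicitly consented to this specific purpose.
# Default is deny -- no record means no consent.

class ConsentRegistry:
    """Maps a person ID to the set of purposes they have opted into."""
    def __init__(self):
        self._grants = {}  # person_id -> set of permitted purposes

    def grant(self, person_id, purpose):
        self._grants.setdefault(person_id, set()).add(purpose)

    def allows(self, person_id, purpose):
        # Default-deny: absence of a record is treated as "no".
        return purpose in self._grants.get(person_id, set())


def recognize(face_match_id, purpose, registry):
    """Return the match only if the matched person opted into this purpose;
    otherwise treat the subject as unidentifiable."""
    if not registry.allows(face_match_id, purpose):
        return None
    return face_match_id


registry = ConsentRegistry()
registry.grant("alice", "photo_tagging")

print(recognize("alice", "photo_tagging", registry))    # matched: "alice"
print(recognize("alice", "law_enforcement", registry))  # None: never opted in
print(recognize("bob", "photo_tagging", registry))      # None: no record at all
```

The design choice worth noting is that consent is scoped per purpose, not per person: opting into photo tagging says nothing about any other use. That's exactly the inversion of today's opt-out regimes, where one blanket checkbox covers everything.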
posted by Nelson at 7:58 AM on January 20, 2020 [2 favorites]


Illinois has quite effectively banned Facebook from offering face-recognition-based image tagging for users in the state. It has also stopped Google Fit from releasing aggregate statistics in its app, which it used to do.

They are potential violations of Illinois’ Biometric Information Privacy Act and can lead to class action litigation.

This company is likewise probably breaking that law.
posted by srboisvert at 10:38 AM on January 20, 2020 [2 favorites]


Yeah, to be succinct: I think every privacy violation that a) causes sufficient harm (individually or collectively) and b) can in practice be addressed by law, should be addressed by law. Everything short of that standard, culture needs to address via changing etiquette and taboo.

The first is completely possible, though the chief obstacles will be the powerful interests threatened by such laws. But that is very often a problem. Activism is necessary.

The second will happen sooner or later. But we can speed that up a bit by raising awareness of the need to adjust and leading by example. An obvious good example of this is doxxing: technological changes made it trivial, it gets abused, and people begin to form and promulgate notions of right and wrong about doing it.

First and foremost, though, is using the rule of law to restrain powerful interests from exploiting big data to build a horrible dystopia. Laws can be written, enforced, and prove effective. We don't have to accept the coming of the dystopia — but wringing our hands and wanting to reverse the clock to a time when it wasn't possible and we didn't need to worry about it just delays the activism needed to push for legislation. The genie isn't going back in the bottle; we have to act.
posted by Ivan Fyodorovich at 11:27 PM on January 20, 2020 [2 favorites]


So, it turns out that the guy behind Clearview has all kinds of past associations with alt-right figures. From Buzzfeed, "Clearview AI Says Its Facial Recognition Software Identified A Terrorism Suspect. The Cops Say That's Not True." (emph. added):
While Ton-That has erased much of his online persona from that time period, old web accounts and posts uncovered by BuzzFeed News show that the 31-year-old developer was interested in far-right politics. In a partial archive of his Twitter account from early 2017, Ton-That wondered why all big US cities were liberal, while retweeting a mix of Breitbart writers, venture capitalists, and right-wing personalities.

[...]

Those interactions didn’t just happen online. In June 2016, Mike Cernovich, a pro-Trump personality on Twitter who propagated the Pizzagate conspiracy, posted a photo of Ton-That at a meal with far-right provocateur Chuck Johnson with both of them making the OK sign with their hands, a gesture that has since become favored by right-wing trolls.

[...]

By the election, Ton-That was on the Trump train, attending an election night event where he was photographed with Johnson and his former business partner Pax Dickinson.

[...]

While there’s little left online about Smartcheckr, BuzzFeed News obtained and confirmed a document, first reported by the Times, in which the company claimed it could provide voter ad microtargeting and “extreme opposition research” to Paul Nehlen, a white nationalist who was running on an extremist platform to fill the Wisconsin congressional seat of the departing speaker of the House, Paul Ryan.

A Smartcheckr contractor, Douglass Mackey, pitched the services to Nehlen. Mackey later became known for running the racist and highly influential Trump-boosting Twitter account Ricky Vaughn. Described by HuffPost as “Trump’s most influential white nationalist troll,” Mackey built a following of tens of thousands of users with a mix of far-right propaganda, racist tropes, and anti-Semitic cartoons. MIT’s Media Lab ranked Vaughn, who used multiple accounts to dodge several bans, as one of the top 150 influencers of the 2016 presidential election — ahead of NBC News and the Drudge Report.
The NYT article mentions the pitch to Nehlen, but the other alt-right connections -- especially the fact that notorious Twitter troll Ricky Vaughn worked for him as a contractor -- were new to me.
posted by mhum at 6:19 PM on January 23, 2020 [3 favorites]




This thread has been archived and is closed to new comments