July 1, 2019 12:07 PM   Subscribe

The Strange Politics of Facial Recognition [The Atlantic] “Your face is no longer just your face—it’s been augmented. At a football game, your face is currency, used to buy food at the stadium. At the mall, it is a ledger, used to alert salespeople to your past purchases, both online and offline, and shopping preferences. At a protest, it is your arrest history. At the morgue, it is how authorities will identify your body. Facial-recognition technology stands to transform social life, tracking our every move for companies, law enforcement, and anyone else with the right tools. Lawmakers are weighing the risks versus rewards, with a recent wave of proposed regulation in Washington State, Massachusetts, Oakland, and the U.S. legislature.”

• 31 Percent of People Don't Trust Facial-Recognition Tech [PC Mag]
“In a recent PCMag survey, 31 percent of the 2,000 respondents said they don't trust facial-recognition technology. How the technology is used is a big part of that trust. When asked for instances in which it would be suitable to use facial-recognition technology, the most popular response at 37 percent was detection and prevention of crime. Another 28 percent support using facial-recognition technology in the airline industry to potentially shorten wait times and line length by checking in passengers through facial identification. Just over a quarter of respondents say its use is acceptable for access and authentication to mobile devices, and 25 percent are okay with it being used for healthcare needs. Only about 19 percent want to use it for paying bills and making purchases online. But while 37 percent of people support the use of facial-recognition tech for law enforcement purposes, San Francisco's Board of Supervisors voted to ban police and city agencies from using it. Oakland, Berkeley, and Somerville, Massachusetts, are reportedly considering the same thing.”
• San Francisco Banned Facial Recognition. Will California Follow? [The New York Times]
“When San Francisco banned the use of facial recognition by the city’s police and other agencies earlier this year, it was an outlier in the United States. But now several other cities are following suit, and California is considering a limited ban on the technology. Somerville, a city near Cambridge, Mass., passed a facial recognition ban last week. Oakland, San Francisco’s neighbor across the Bay, is on the verge of passing its own measure, which would prohibit the police and other city agencies from deploying the technology. And a bill in the California State Legislature that would ban the use of facial recognition on footage collected by police body cameras appears to be gaining traction. The legislative forays indicate a groundswell of support for curtailing the technology, which has struggled to correctly identify women and people of color. The error rates and the pervasive, passive surveillance that facial recognition can enable are often cited as concerns motivating the bans.”
• Amazon's Facial Analysis Program Is Building A Dystopic Future For Trans And Nonbinary People [Jezebel]
“Though our data sets were admittedly limited, the difference in how Rekognition performs on data with trans and nonbinary individuals is alarming. More concerning however is that Rekognition misgendered 100% of explicitly nonbinary individuals in the Broadly dataset. This isn’t because of bad training data or a technical oversight, but a failure in engineering vocabulary to address the population. That their software isn’t built with the capacity or vocabulary to treat gender as anything but binary suggests that Amazon’s engineers, for whatever reason, failed to see an entire population of humans as worthy of recognition. [...] “It’s obviously disconcerting,” they told us. “When you talk about practical applications and using this technology with law enforcement, that feels like a dangerous precedent to set. But at the same time, I’m not surprised. We’re no longer shocked by people’s inability to recognize that nonbinary identity exists and that we don’t have to conform to some sort of category, and I don’t think people are there yet.””
• Facial recognition technology may be coming to porn – and these men can’t wait [New Statesman]
“The innovation that has excited these men? A tool that they hope will resolve their romantic paranoia: facial recognition software designed for porn, built to search for women’s faces. Interest in the new tool is based largely on a tweet from a statistician called Yiqin Fu. In a now-viral post she states that a “programmer said he and some friends have identified 100k porn actresses from around the world”. A new programme “cross-references faces in porn videos with social media profile pictures. The goal is to help others check whether their girlfriends ever acted in those films.” [...] Even when gently pressed, the men I contacted could not appreciate the mammoth logical leaps they’d made in presuming that their unresponsive or uninterested girlfriends must be doing porn. Many complained of unanswered calls and low-libido partners. Never mind being busy or tired: they are, of course, shooting smut.”
• Taser maker says it won’t use facial recognition in bodycams [Ars Technica]
“Axon, creator of the Taser, did something unusual for a technology company last year. The Arizona corporation convened an ethics board of external experts to offer guidance on potential downsides of its technology. Thursday, that group published a report recommending that the company not deploy facial recognition technology on its body cameras, widely used by US police departments. The report said the technology was too unreliable and could exacerbate existing inequities in policing, for example by penalizing black or LGBTQ communities. Axon CEO and founder Rick Smith agrees. “This recommendation is quite reasonable,” he says in an interview. “Without this ethics board we may have moved forward before we really understood what could go wrong with this technology.” The decision shows how facial recognition technology—while not new—has become highly controversial as it becomes more widely used. The power that software capable of recognizing people in public could give police and governments has struck a nerve with citizens and lawmakers seemingly inured to technology that redefines privacy.”
• Watchdog criticises 'chaotic' police use of facial recognition [The Guardian]
“Police forces are pushing ahead with the use of facial recognition systems in the absence of clear laws on whether, when or how the technology should be employed, a watchdog has said. Prof Paul Wiles, the biometrics commissioner, said in his annual report that police deployment of the technology, which can be used to scan crowds or CCTV recordings for people of interest, was chaotic and had run ahead of laws that could prevent its misuse. With no legal framework in place it was left to the police to decide when the public benefit outweighed the “significant intrusion into an individual’s privacy” arising from facial recognition and other types of biometric identification, the report said. It said guidance from the National Police Chiefs’ Council would help, but it noted that the Met commissioner, Cressida Dick, said last year that the police should never be the ones to judge where the balance should lie between security and privacy.”
posted by Fizz (14 comments total) 30 users marked this as a favorite
None of this is science fiction. This is not a satirical short story. This is 2019.
posted by Fizz at 12:14 PM on July 1, 2019 [8 favorites]

Amazon's Facial Analysis Program Is Building A Dystopic Future For Trans And Nonbinary People

Facial Recognition Is Accurate, if You’re a White Guy (Steve Lohr for NY Times, Feb. 9, 2018)
When the person in the photo is a white man, the software is right 99 percent of the time.

But the darker the skin, the more errors arise — up to nearly 35 percent for images of darker skinned women, according to a new study that breaks fresh ground by measuring how the technology works on people of different races and gender.
But it's been over a year, so consider those bugs squashed! (hah.) Also, that report is far more positive than this one: British police facial recognition tech is 96 percent inaccurate (Mario McKellop for The Burn In, May 17, 2019)
British law enforcement hoped integrating facial recognition technology into existing closed-circuit surveillance networks would make catching criminals easier. As such, Scotland Yard spent nearly $300,000 testing face scanners on the public. The organization’s experiment was not a success, to put it lightly.

Between 2016 and 2018, the agency conducted eight biometric scanning trials in London. Results showed the UK’s facial recognition technology produced inaccurate results 96 percent of the time. In fact, the artificial-intelligence-enabled software incorrectly matched live video images of British citizens with criminal registry databases.
CV Dazzle: Camouflage from Facial Recognition (previously, twice)
posted by filthy light thief at 12:16 PM on July 1, 2019 [6 favorites]

I say this in all seriousness: we're doomed.
posted by bongo_x at 1:39 PM on July 1, 2019 [3 favorites]

I need to read the links still, but I was thinking about all of this recently because I've been reading a lot of postmodern philosophy and writing a lot lately. Through my own thoughts about my own gender (or lack of gender?) it's becoming increasingly obvious to me that the "future" (in so much that we are currently living in it/this cyberpunk dystopia we're going through) is going to be non-gendered, especially as signifiers break down and human society as a whole goes through some sort of cosmic reckoning with global warming and the potential destruction of the world vs. the dissolution of capitalism. How are the tech overlords and surveillance states going to be able to do facial recognition when the face may no longer be a coherent signifier of a human being's identity? With plastic surgery and body modification becoming cheaper and more widespread people will be able to change their looks, and their identities, much more easily. Now, the face isn't all a person's identity, but the face is definitely a major signifier for "who" a person "is" when you see them out in public, and obviously when you see them on social media. I mean, the entire concept of "this is who I am" as a human being has been obfuscated by the internet, and it seems like that is seeping into "reality" more and more. The ability to adopt your own identity, or to rectify what your true "self" is, is becoming more accessible.

There are these shirts that say "the future is female", and while I partially enjoy (and yet, totally reject the notion of) my body being a "feminized" construct, versus the mind being considered a masculine one, I constantly find myself rejecting that entire binary, simply due to the fact that I never consented to being "male" in the first place. It's total bullshit, and yet also super exciting from a transgressive and subversive and sort of "hacker" point of view, that facial recognition and the companies behind the technology would even adopt only a binary through which they view human beings. And then how is this going to interact with other signifiers in capitalism? Man vs. Woman: age, income, biology, physiological information, psychological profiles, etc. All of capitalist society wants to force people into chains of signifiers, to force people into identity politics, and not pejoratively because identity politics are important, but as a tool to control and diminish us as human beings with identities that we chose for ourselves, instead of the identities that they decided for us. Obviously it's all a system of control, and yet we seem to be getting more tools to help us deny that.

Will it actually be possible for facial recognition to someday figure out who I am as "gucci mane" one day when I walk into the supermarket, compared with when I walk in a week later looking entirely different due to body modification, surgery, and potentially designer drugs?
posted by gucci mane at 2:11 PM on July 1, 2019 [3 favorites]

My post doesn't even get into facial recognition as a tool of the state-apparatus, and even though I mostly spoke on tech companies, the difference is becoming blurrier and blurrier. When I go to rallies and protests and counter-protests against right-wing groups, they and the police all have cameras trained on particular people, and the cops definitely have a database of people they consider targets (Portland police have shown up to protests and snatched specific people and driven away with them, so it's definitely in use). A toolkit for anti-capitalists to use against these threats would be potentially devastating.

And is it possible now, or in the future, to real-time hack into a device that is using facial recognition technology and obscure the data? Can I put an avatar over my face in real time, so that when some Proud Boy is filming me they actually can't tell who I am?
posted by gucci mane at 2:16 PM on July 1, 2019 [1 favorite]

It's such a god damn mess. Even if you just give in and accept that we live in a surveillance state and we're constantly being tracked, the other half of this nightmare situation is that the AI tools and facial recognition software that the state-apparatus is using to track me is bad and can maybe fuck up my life even more because I don't fit into some kind of pre-coded filter that was built with all kinds of biases and privileges that protects certain groups/individuals.

posted by Fizz at 2:24 PM on July 1, 2019 [5 favorites]

This is inevitable. There are cameras pointing in every direction in every traffic intersection in the Bay Area. They are ubiquitous in stores and street corners. I've been wondering how long it will take facial recognition and other bio scan software to scan and flag like the SciFi movies always portray. It probably depends on the quality/size/access to the database.

I am sure that facial recognition will only be used to catch dangerous criminals, kidnappers, terrorists, etc. by "authorized" government agencies. The tech will never allow database access to corporations, black markets, various rogue/bad parties, etc.
posted by CrowGoat at 2:36 PM on July 1, 2019

"My face is my backstage pass,
Like the Queen, you don't need cash,
You won't see me with an empty glass,
I got the future, got the pass!"

We are all Blaze Bayley now.
posted by I'm always feeling, Blue at 2:39 PM on July 1, 2019

On the plus side: it's a lot less painful than the brands and ear-tags other livestock have had to endure for so long.
posted by Twang at 4:11 PM on July 1, 2019 [2 favorites]

I say this in all seriousness; We're doomed

Never tell me the odds!... of being accurately identified by dystopian ubiquitous surveillance.
posted by justsomebodythatyouusedtoknow at 7:29 AM on July 2, 2019

Don't worry, no matter how effectively dazzle camouflage stumps facial recognition, they're hard at work on laser heartbeat recognition, which already works through clothes at a distance of up to 200 meters. I'm not even sure how you'd go about changing or disguising your heartbeat.
posted by Copronymus at 10:45 AM on July 3, 2019

Atrial fibrillation would likely defeat that device, but the bigger problem is a surveillance technique that easily detects people walking around with medical conditions such as AFib, heart valve problems, and pacemakers. It's kinda like Alexa or your cell phone being able to detect a heart attack from your breathing patterns: there are beneficial uses, but wow, the privacy implications.
posted by peeedro at 11:10 AM on July 3, 2019


This thread has been archived and is closed to new comments