Original Sin
July 27, 2018 6:14 AM   Subscribe

Our Bodies or Ourselves - "The collection and storage of people's biometric data fundamentally changes the relationship between citizen and state. Once 'presumed innocent', we are now, in the sinister words of former UK Home Secretary Amber Rudd, 'unconvicted persons.' " (via)
posted by kliuless (13 comments total) 20 users marked this as a favorite
 
This is timely; this news just broke on my feed this morning.

Canada is using ancestry DNA websites to help it deport people [Vice News]
“In another example of the extraordinary lengths Canadian immigration officials go to deport migrants, the Canada Border Services Agency has been collecting their DNA and using ancestry websites to find and contact their distant relatives and establish their nationality.”
posted by Fizz at 6:21 AM on July 27, 2018 [7 favorites]


Re: the Amber Rudd quote, here's the full context:

“Accordingly, following consultation with key partners, the principal recommendation is to allow ‘unconvicted persons’ to apply for deletion of their custody image, with a presumption that this will be deleted unless retention is necessary for a policing purpose and there is an exceptional reason to retain it,” said the home secretary.

“In practice, this will mean that people could apply to chief officers for their image to be deleted where they have not been convicted of the offence in relation to which their image was taken.”


The term was used specifically in reference to photographs taken during the booking process for a criminal offense. In that context, "unconvicted persons" is entirely appropriate, because they are literally people who were accused but not convicted of a crime.

Biometric data collection is terrifying enough on its own; there's no need to use ominous-sounding quotes out of context to drum up even more fear.
posted by grumpybear69 at 7:14 AM on July 27, 2018 [3 favorites]


The term was used specifically in reference to photographs taken during the booking process for a criminal offence in relation to which their image was taken


Hmmm - what about images/video of people collected via investigation? Britain has a very pervasive public surveillance system. What about images collected from social media and then added to a profile of "suspects" or "unconvicted persons"? What are the retention policies around those? So - this data and these images are built up over time, and then a change in government makes something that was previously legal illegal... The evidence already exists; what happens then?

... Originally, I was excited by the post title... for a second, I thought it was related to "Our Bodies, Ourselves, Our Cybernetic Arms" by Jonathan Coulton.

So... in summary, "big data ruins everything"...
posted by jkaczor at 7:39 AM on July 27, 2018 [3 favorites]


The term was used specifically in reference to photographs taken during the booking process for a criminal offense. In that context, "unconvicted persons" is entirely appropriate, because they are literally people who were accused but not convicted of a crime.

(a) That's not always how it works. Sometimes people are detained without being accused of a crime. This might be described as detention for breach of the peace, or administrative detention, or the person might just have been swept up in a mass arrest whilst in or adjacent to a protest.

(b) Given situations like the mass detention of protesters above, the language used - that the ability to request the deletion of one's profile applies to people who have actually been charged with (but not convicted of) a crime - kind of implies that those who were never charged don't have that right. That has negative implications for protest and grassroots political activism.

(c) Relatedly, what about the images of all the people cops take video of at protests who are never detained? In addition to potentially being excluded from this privacy protection by the wording Rudd used, these people might not even know that their privacy has been breached in this manner, which would effectively prevent them from exercising this important privacy right.

In general, there's a (qualitative and quantifiable) difference between legal or bureaucratic processes that happen automatically and processes that require an application. Applications almost always present significant bureaucratic and other obstacles for many people (because they are poor and can't afford the time or the application fee, are not sufficiently proficient in the official language, come from a community that has reason to fear or be suspicious of unnecessary interactions with police based on past police abuses of other members of that community, are of uncertain or undocumented immigration status, etc.). I don't have the data immediately to hand, but the differential effects of these two types of processes are measurable and well documented. Note also, for example, that those in power (landlords, large businesses) are always lobbying to be able to do what they want and defend afterwards that they followed the law, rather than having to apply beforehand to do things.
posted by eviemath at 7:46 AM on July 27, 2018 [11 favorites]


Related - Georgia Tech, partnered with the University of Surrey in the U.K., can use AI, algorithms, and big data to enable "pre-crime divisions" - how could that possibly go wrong?

What are the cultural, religious, gender, community and racial biases/assumptions potentially encoded in those algorithms? Are they public so they can be reviewed?
posted by jkaczor at 7:48 AM on July 27, 2018 [4 favorites]


You know, I am going to paraphrase a quote from William Gibson....

"Dystopian/cyberpunk-based science-fiction was not supposed be a freakin "How To" manual"
posted by jkaczor at 7:51 AM on July 27, 2018 [5 favorites]


I'd be pretty well amazed if US states aren't keeping a high-quality copy of a photo of everyone's eyes, which they get when you show up for a driver's license. One thing Snowden brought home hard -- it doesn't matter what these people say, they are going to do whatever it is they want.
posted by dancestoblue at 8:00 AM on July 27, 2018 [6 favorites]


Also related - "using ancestry DNA websites"...

The "Golden State" serial killer was caught in April using similar techniques...

Yes, it caught a horrible criminal, but... so many of the same people who decry "big government" intrusion into their lives happily outsource this to private, essentially unaccountable corporate "persons" for a pittance (a free app/website, silly social media).

Do you think governments were really going to ignore all this data that other entities have collected and correlated?

An interesting sci-fi book that touches on these topics: "After On".
posted by jkaczor at 8:15 AM on July 27, 2018 [5 favorites]


dancestoblue: yup, they're pretty much doing it even when the law has said no. It's just so convenient. VT experience.
posted by meinvt at 11:14 AM on July 27, 2018 [1 favorite]


Alright, so these services aren't services for you and me.
23andMe has no problem selling you to big pharma:
But in case you needed another reason why voluntarily giving your DNA to companies is a bad idea, on Wednesday the genomic-ancestry company 23andMe announced it was forking over its DNA data to the world’s ninth-largest pharmaceutical company, GlaxoSmithKline (GSK).

Genetic material is being used to track down relatives, catfish them, and deport folks in Canada.
In another example of the extraordinary lengths Canadian immigration officials go to deport migrants, the Canada Border Services Agency has been collecting their DNA and using ancestry websites to find and contact their distant relatives and establish their nationality.

Lastly, law enforcement is utilizing this to put pressure on families in order to catch criminals.
This is a reality in a world where the alleged Golden State Killer, now known as Joseph James DeAngelo, was arrested after DNA found at one of the killer's crime scenes was checked against genetic profiles from genealogical websites that collect DNA samples.

I can sort of get behind catching criminals, but it is the inherent genetic doxxing and the pressure on families to apprehend the suspects that I can't get behind. Just rotate this a bit and put it in the hands of a full-on criminal... maybe I blackmail you with the threat of deportation of a relative. Maybe I make sure someone else's DNA is at the scene of the crime in an obvious manner to throw folks after you instead of doing the hard work. Maybe I develop a nerve agent that is only compatible with certain genetic signatures and release it into the wild, designed to kill only very specific gene patterns (think hate groups).

This isn't a driver's license and registration. This isn't even your credit history. This is, intrinsically, who you are and who you are related to.
posted by Nanukthedog at 7:39 AM on July 28, 2018 [2 favorites]


From the deleted post:

The shark has jumped, right about the time that meme was filmed. Have you paid a bill or had any interaction with society in the last several decades? It was all recorded, permanently, in many databases.

What we need is an international (interplanetary, probably) "Bill of Rights", an amendment-level core right that protects individuals.

Databases are not going away, and there is no tech solution that big data or AI or something-something-Google cannot beat. A deeply established rule of law that gives individuals and groups real protections is the only defense. It's great that the EU's GDPR is making an attempt, but it needs to be at a deeper level. After that, it'll take decades of case law for the right to become ingrained in society, all while the tech morphs, becoming increasingly powerful and invasive.

DNA is just one incredibly powerful detail that over time will be integral to how we know ourselves, heal, improve, verify, validate - and become oppressed, if protections are not granted as a core right.
posted by sammyo at 8:02 AM on July 28, 2018 [2 favorites]


Artificial intelligence can predict your personality… simply by tracking your eyes

Basically, we need to learn how to live with these 'AI' advancements, or... there is no *or*. We can not turn off science; computers will become unimaginably more powerful and the math will continue to advance.
posted by sammyo at 5:04 AM on July 29, 2018


Basically, we need to learn how to live with these 'AI' advancements, or... there is no *or*. We can not turn off science; computers will become unimaginably more powerful and the math will continue to advance.

The thing is, we're still at the Action Park stage of AI evolution.
posted by rhizome at 11:33 AM on July 29, 2018 [1 favorite]




This thread has been archived and is closed to new comments