AI powered X-ray glasses
June 29, 2019 6:53 AM

 
Don't worry, there will be plenty of other horrifying uses for deepfakes other than rampant, weaponized misogyny!
posted by lalochezia at 7:00 AM on June 29, 2019 [14 favorites]


This is life-threatening for women in many parts of the world. Not just objectifying or invading privacy: women will die because of this. Fucked up lose-lose; not sure if the technology is worse or the fact that women's "honour" can be a life-threatening circumstance.
posted by Meatbomb at 7:41 AM on June 29, 2019 [34 favorites]


I was ready to stomp all over this post because the original Vice article was horrific. I'm glad they toned it down, but to say that it was an error rather than an excuse to get clicks is just dishonest.
posted by urbanwhaleshark at 8:09 AM on June 29, 2019 [1 favorite]


It’s so misogynistic it’s almost a parody of itself.

“The app does not work on men and when presented with a picture of a man simply adds a vulva”
posted by Faintdreams at 8:15 AM on June 29, 2019 [12 favorites]


"Driven by fun and enthusiasm for that discovery, I did my first tests, obtaining interesting results."

Narrator: He's saying he had a stupid boner doing the thinking for him.
posted by loquacious at 8:18 AM on June 29, 2019 [5 favorites]


I think this could be one place where the POTUS has provided leadership - if confronted with something that was intended to be humiliating, shout "fake nudes!" and get back to business.

Oh wait, the version that works on pictures of men isn't out yet - guess this solution will have to wait.
posted by pulposus at 8:22 AM on June 29, 2019 [1 favorite]


The way that Vice handled this is illuminating.

We're never going to stop this from happening. The best that we can do is reduce how widespread it is. It needs to be taken down by app stores - and not just voluntarily, it needs to be enforced. We need to think about what kind of legal protection will help people get the photos taken down as well.

And, importantly, men need to learn that this is not okay. And they need to learn it from other men, because they dismiss women who disapprove of it as harpies and prudes.

Vice paid lip service to the idea that it's "horrifying" but it's obvious they didn't really think that because they paid for the app, used it to make photos of non-consenting women, and then posted them online. They don't really think that it's horrifying, just that it's politically correct to disapprove and then - ignore, when it suits their purposes (or their boners).

They really do not appreciate the effects this can have, or they do not care. Too many men are like this.
posted by Kutsuwamushi at 8:45 AM on June 29, 2019 [27 favorites]


Deepfake tech should be straight up illegal or at least require strict federally overseen licensure. It is very powerful and dangerous and should absolutely not be in the hands of the general public. Even photoshopping a fake nude should be punishable by prison time.
posted by grumpybear69 at 9:17 AM on June 29, 2019 [6 favorites]


There’s not really any underlying technology that can reasonably be banned. The tools are very general purpose; it’s just a bunch of simple mathematical operations packaged in a way that makes them easy to use together.

Stricter controls on the use of people’s likenesses are probably going to be necessary, but that doesn’t come down to any particular technology.
posted by vogon_poet at 9:24 AM on June 29, 2019 [15 favorites]


"...and then all her clothes fell off."

If you imagine all the hideous indescribable things neural networks will soon be capable of instantly doing to our photographs, removing the clothing seems tame and uninspired by comparison.
posted by sfenders at 9:40 AM on June 29, 2019


Deepfake tech should be straight up illegal or at least require strict federally overseen licensure. It is very powerful and dangerous and should absolutely not be in the hands of the general public. Even photoshopping a fake nude should be punishable by prison time.

I cannot see any problems with granting the government dictatorial control over art and other forms of expression.

If someone publishes some deep fake that is shown to be fake, there are already remedies in the law for that. Having another way for the federal government to jail people whose expression it finds distasteful is way way way scarier to me than a bunch of private liars, in much the same way that the PATRIOT act is way scarier than some terrorists.
posted by Gilgamesh's Chauffeur at 9:45 AM on June 29, 2019 [27 favorites]


Don't worry, there will be plenty of other horrifying uses for deepfakes other than rampant, weaponized misogyny!

I, too, get really angry when a headline pretends that the purpose of something is actually something it isn't. A technology can have multiple not-good effects, and a version of feminism that can't handle that is a version of feminism that cannot be intersectional.
posted by Going To Maine at 9:46 AM on June 29, 2019 [4 favorites]


Mod note: One deleted. Ironic misogyny, racism, etc are not ok here.
posted by LobsterMitten (staff) at 9:56 AM on June 29, 2019 [2 favorites]


I cannot see any problems with granting the government dictatorial control over art and other forms of expression.

aka FREEDOM OF SPEEEEEEEEEEECH
posted by grumpybear69 at 10:10 AM on June 29, 2019


If someone publishes some deep fake that is shown to be fake, there are already remedies in the law for that.

Are there? What are they, exactly? How does the law distinguish between a deepnudes fake and a hard-copy photograph you've drawn a mustache on with a sharpie?
posted by sfenders at 10:10 AM on June 29, 2019 [1 favorite]


And, importantly, men need to learn that this is not okay. And they need to learn it from other men, because they dismiss women who disapprove of it as harpies and prudes.

I would be ok with them learning from women who happened to be enforcing legislation that punished these sexual-assault-by-proxy crimes in proportion to the damage they do to the victims.

If someone publishes some deep fake that is shown to be fake, there are already remedies in the law for that.

No there fucking aren’t, and if you’d bothered to even just google the issue, that would be apparent to you. Due to jurisdictional issues, resource issues, and the fact that men in law enforcement simply do not give a shit about women, most crimes against women are basically decriminalized. Men can do almost fucking anything to us, to many of us at a time, for years, and be safe and secure knowing they are almost certain to never face a single fucking consequence for even one act of sexualized misogynist terror.
posted by schadenfrau at 10:20 AM on June 29, 2019 [70 favorites]


In seriousness, the real harm is the ability to do these sorts of image manipulations at scale and with almost no skill required. That is distinct from limiting artistic expression, even though I would argue that the potential for art is not a strong defense against the immense harm potential, already being realized, from this tech.
posted by grumpybear69 at 10:27 AM on June 29, 2019 [2 favorites]


We jail people for forging signatures. If our faces, bodies, voices, etc. are being used for recognition purposes to identify us to airlines, banks, etc., then I don't see any reason why we can't criminalize deepfakes along the same lines as forged signatures or any other form of identity theft.

Similarly, we also prosecute people for revenge porn, and that includes distributing stolen nude photographs of us. I again don't see any reason why deepfakes can't be prosecuted similarly. The whole idea is that they are convincing fakes; therefore the motive for using them is baked into the very technology.
posted by UltraMorgnus at 10:36 AM on June 29, 2019 [27 favorites]


Which isn't to diminish the alarm. This technology is absolutely terrifying, particularly in how it victimizes women.
posted by UltraMorgnus at 10:38 AM on June 29, 2019 [4 favorites]


If everyone has a fake nude done of them, then that actually takes some of the power away from those interested in revenge porn. Right now, if someone makes one of these, the victim has to prove that it's a fake. When it does happen (and it will; in fact, I would predict that someone will make a bot that skims every available image, male and female, and does this), it will be likely that most people will assume that real pics are deepfaked.
An interesting competing product would be one that adds clothes to nude pics, so that victims of revenge porn could claim that the real pics had gone through this nudifying process.
posted by 445supermag at 11:01 AM on June 29, 2019 [1 favorite]


Creating fake nudes of someone is certainly misogynistic, cruel, and morally wrong, but it does not physically harm anyone. I don't think it's a good idea to equate hurt feelings with physical abuse. It minimizes actual violence.

It is possible to concoct a scenario where someone uses fake nudes to indirectly cause violence to be done, but you can do the same with any other form of expression, including plain old written or verbal speech.
posted by foobaz at 11:02 AM on June 29, 2019 [3 favorites]


I cannot see any problems with granting the government dictatorial control over art and other forms of expression.

When do you think that fake nudes of non-consenting adults are justified by art and other forms of expression?

Having another way for the federal government to jail people whose expression it finds distasteful

You're calling it "distasteful" as if the problem is just that it's vulgar - and not the harm that it can do. This is an indirect example of what I meant when I said women who object to this technology will be seen as harpies and prudes. It reduces a concern about the freedom and safety of women to a matter of taste.

I've noticed men using this kind of minimizing language a lot lately. Just yesterday, I got into a discussion with a man on Reddit about representation of female characters. (I know it was a bad idea.) I was explaining why representation matters; he kept referring to my opinions as "preferences" and "dislikes", even after I told him to stop.

If someone publishes some deep fake that is shown to be fake, there are already remedies in the law for that.

I do not think you are as knowledgeable on this topic as you think you are.
posted by Kutsuwamushi at 11:02 AM on June 29, 2019 [22 favorites]


The tension that always comes up with "is this illegal" is whether people are going to get away with it because it isn't a crime, or because women are deprioritized in patriarchal society. The odds are that the laws exist, and I would really just like to see more women in the legal system since they will give a care while men in general continue to cede no ground. A new, specific law would perhaps not be unhelpful, though, and would encourage people to hop to it.
posted by Going To Maine at 11:03 AM on June 29, 2019 [1 favorite]


We're never going to stop this from happening. The best that we can do is reduce how widespread it is. It needs to be taken down by app stores - and not just voluntarily, it needs to be enforced. We need to think about what kind of legal protection will help people get the photos taken down as well.

The Twitter post where he talks about taking it down is full of people who already downloaded the app offering to sell it to others.

Creating fake nudes of someone is certainly misogynistic, cruel, and morally wrong, but it does not physically harm anyone. I don't think it's a good idea to equate hurt feelings with physical abuse. It minimizes actual violence.

Bless your heart.
posted by EmpressCallipygos at 11:09 AM on June 29, 2019 [42 favorites]


Not causing *physical* harm is a low freakin’ bar to apply to potentially life-ruining tech and discussions around the legalities of such technology.
posted by Faintdreams at 11:18 AM on June 29, 2019 [17 favorites]


Mod note: One deleted. This will go a lot better if people can avoid making comments that suggest this isn't harmful (or isn't, like, really harmful in a real way that people might actually just think is a problem for the reasons they're saying, so there must be some other motivation for bringing it up); or that because there are other applications, people should stop talking about this one.
posted by LobsterMitten (staff) at 11:37 AM on June 29, 2019 [4 favorites]


From the article:
“... the anonymous creator of DeepNude, who requested to go by the name Alberto ...”

I really shouldn’t be surprised that the man going on record to discuss/promote the app he created with the explicit intent to render realistic nudes of unconsenting individuals feels perfectly entitled to keep his own identity a secret.

And yet, here we are.
posted by myotahapea at 11:41 AM on June 29, 2019 [54 favorites]


If someone publishes some deep fake that is shown to be fake, there are already remedies in the law for that.

Remedies that don't immediately serve to give the deepfake even more attention and distribution?
posted by Thorzdad at 11:48 AM on June 29, 2019 [3 favorites]


This isn't even deepfakes (a complicated video mapping software), this is just convenient Photoshop: old-fashioned clothes removal in the convenience of a mobile phone.

I did paid Photoshop work when I was in HS and college, and it was mostly "remove all blemishes, remove all body hair, breasts higher/bigger, erase nipples in case of sheer clothes" for magazines and catalogs. I worked from home for a tech start-up, and PS didn't have all the tools it has now, so custom mods got passed around for this kind of touch-up (slimming and enlarging tools especially). In the 20 years since, these tools have not only become part of standard Photoshop but are in tons of mobile apps as well.

Back then we'd occasionally get commissions to do weird stuff like removing more clothing than typical (bra strap removal, clothes lines, or modesty patches were par for the course), but sometimes it was to essentially make people more naked than they were. Often it was removing nude clothes to make "nude covers" from a non-nude shoot.

20+ years later I look back and wonder how many of those models/people knew and had consented to all that work we were doing?

There are certainly plenty of apps out there that can remove clothes with very little effort, but they still take some know-how. Making it one click takes it from rarity to ubiquity.
posted by French Fry at 12:07 PM on June 29, 2019 [3 favorites]


"Now anyone could find themselves a victim of revenge porn, without ever having taken a nude photo. This tech should not be available to the public."

As French Fry suggests, this is basically automated photoshoppery.

Fake nudes have been available to the public for a long time. They just haven’t been as easy to generate as this. A decent photoshopper can generate a fake nude that is surprisingly realistic, and then, by passing it through a couple of external filters (i.e., taking a secondary photo of the altered photo, and then cleaning up the secondary photo), eliminate any pixel artifacts that would indicate it’s a fake.

This is especially a thing in gay porn of celebrities. There are some incredibly realistic fake nudes of Jake Gyllenhaal, Brad Pitt, Ryan Gosling, etc.
posted by darkstar at 12:09 PM on June 29, 2019 [4 favorites]


remedies in the law

For these remedies to be useful for the purposes these technologies have and will continue to be used for, at least two things need to change: first, the establishment of copyright protection for one's own physical self and image, and second, companies (and the individuals behind them) that enable the creation, hosting, and sharing of such images need to be made much more vulnerable to prosecution.
posted by notquitemaryann at 12:21 PM on June 29, 2019


Practical recourse is nearly non-existent, and for a phenomenon that can disproportionately cost women their jobs and security, that's a pretty sad state of affairs.
posted by French Fry at 12:26 PM on June 29, 2019 [8 favorites]



The fact that the reshape, resize, stamp/sample, and skin tone tools in PS have found their way onto dozens of free phone apps is terrifying to me. Those are all you need to really convincingly spoof something, especially if it only needs to pass muster on a phone screen.

I shudder to think what 12-17 yo me would have done with such technology attached to a phone, in my pocket.

Apps like this are just the next step, removing the literal minutes (3-10) of work this currently takes. And that lower bar will make this far worse.
posted by French Fry at 12:35 PM on June 29, 2019


This isn't even deepfakes (a complicated video mapping software), this is just convenient Photoshop: old-fashioned clothes removal in the convenience of a mobile phone.

Not really. Here’s an interactive demo and a 4-minute presentation on how the underlying software library works when trained on images of kittens, buildings, shoes, or handbags.

Basically it takes a real input image and generates a fake output image based on the data the machine learning tool was trained on. In the case of DeepNude, the input image is the user-supplied image of a woman, the output image is that woman with the AI's best guess at the areas covered by clothes replaced by automatically generated nude female body parts, and the data it was trained on is a large collection of female nudes.
posted by peeedro at 12:36 PM on June 29, 2019 [2 favorites]
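(A quick gloss on the mechanics peeedro describes: this family of tools does image-to-image translation, where an encoder-decoder network is trained on paired before/after images and then maps any new input photo to a generated output photo. The sketch below is purely illustrative, assuming PyTorch and toy layer sizes; none of it is taken from the app itself, just the generic shape of the technique shown in the linked kittens/shoes demo.)

    import torch
    import torch.nn as nn

    class TinyTranslator(nn.Module):
        """Toy image-to-image generator: real photo in, generated photo out."""
        def __init__(self):
            super().__init__()
            # Encoder: compress the input photo into a feature map.
            self.encoder = nn.Sequential(
                nn.Conv2d(3, 64, 4, stride=2, padding=1), nn.ReLU(),
                nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),
            )
            # Decoder: expand the features back into an image, filling in
            # whatever the training pairs taught the network to produce.
            self.decoder = nn.Sequential(
                nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),
                nn.ConvTranspose2d(64, 3, 4, stride=2, padding=1), nn.Tanh(),
            )

        def forward(self, x):
            return self.decoder(self.encoder(x))

    # After training on paired images (e.g. edge maps -> shoe photos), any
    # 3-channel input yields a fabricated output of the same size.
    fake = TinyTranslator()(torch.randn(1, 3, 256, 256))  # -> (1, 3, 256, 256)

A production model adds skip connections, an adversarial discriminator, and far more capacity, but the "photo in, fabricated photo out" shape is the same, which is why the hard part is the training data rather than the code.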


We're never going to stop this from happening. The best that we can do is reduce how widespread it is. It needs to be taken down by app stores - and not just voluntarily, it needs to be enforced.

...should be straight up illegal or at least require strict federally overseen licensure. It is very powerful and dangerous and should absolutely not be in the hands of the general public.

sometimes it's easier to just say "we can be pretty sure no one is cool with this, and there's no benefit, so fuck it"

I am having flashbacks to the 90s discussions of PGP and similar powerful encryption technologies.
posted by Hatashran at 1:16 PM on June 29, 2019 [5 favorites]


90s discussions of PGP and similar powerful encryption technologies.

See also last year, and this week.
posted by sfenders at 1:26 PM on June 29, 2019 [2 favorites]


We are talking about apps whose sole purpose is to create fake nudes, something which will undoubtedly be wielded to silence, humiliate, and punish women. It does not have a compelling legitimate use. Its purpose is to violate privacy, not to protect it.

We are not talking about encryption.

It's an extremely poor comparison that casts all calls to regulate technology as being made by fearful luddites. It's facile.
posted by Kutsuwamushi at 1:31 PM on June 29, 2019 [40 favorites]


Computer image manipulation technology is just going to continue to get better and better, until anyone can make an image (or video) of anything they can imagine.

Attempting to make the technology illegal is a non-starter in sooo many ways.

I can foresee laws enhancing civil penalties, but given the cost of prosecuting a lawsuit, it will likely be reserved for only the most egregious offenses.

Ultimately - and I mean over a span of 25+ years - I believe the problem will be resolved as a) people become inured to the existence of embarrassing imagery that is almost certainly faked, and b) we develop strong authentication technology to establish the credibility of images and video when authenticity is important (ex: in court).

I’ve seen “celebrity porn” that was very cleverly executed - but even today, very, very few people are going to believe that Jennifer Aniston really posed for that picture.

I suspect the true issue at the heart of this is that people fear their pervert neighbor will be jerkin’ it to homemade deepfake porn based on their face and body. But pervert neighbors have been doing this stuff for years already just making up the imagery in their heads. It’s not really a technology problem - it’s a pervert neighbor problem.
posted by doctor tough love at 1:49 PM on June 29, 2019 [2 favorites]


No comparison intended, and I wasn't intending to imply that this is on the same moral plane as PGP. Just that I doubt that attempts to regulate this technology will be any more successful than any other attempt to regulate any other technology. And that the arguments were giving me flashbacks.
posted by Hatashran at 1:51 PM on June 29, 2019 [1 favorite]


I suspect the true issue at the heart of this is that people fear their pervert neighbor will be jerkin’ it to homemade deepfake porn based on their face and body.

No, it is not.
posted by Kutsuwamushi at 2:28 PM on June 29, 2019 [23 favorites]


I cannot see any problems with granting the government dictatorial control over art and other forms of expression.

When do you think that fake nudes of non-consenting adults are justified by art and other forms of expression?

The message I commented on called for federal licensure and control of "deepfake technology"...like Photoshop. That seems to me to be a vastly worse solution than the problem it attempts to solve.

Having another way for the federal government to jail people whose expression it finds distasteful


You're calling it "distasteful" as if the problem is just that it's vulgar - and not the harm that it can do. This is an indirect example of what I meant when I said women who object to this technology will be seen as harpies and prudes. It reduces a concern about the freedom and safety of women to a matter of taste.

I must have expressed myself poorly. You think I'm talking about the nudie app, but I meant to say such a law could be applied to anything else the government doesn't like. If you give them a weapon to prevent a thing you don't like, it may eventually be used against things and people you never intended.
posted by Gilgamesh's Chauffeur at 3:08 PM on June 29, 2019 [2 favorites]


I’ve seen “celebrity porn” that was very cleverly executed - but even today, very, very few people are going to believe that Jennifer Aniston really posed for that picture.

It only takes - to give just one hypothetical example - one outraged parent seeing a deep fake of their child’s school teacher on their child’s phone to launch a career-ruining campaign against that teacher with very real consequences.
posted by andraste at 3:14 PM on June 29, 2019 [19 favorites]


Creating fake nudes of someone is certainly misogynistic, cruel, and morally wrong, but it does not physically harm anyone. I don't think it's a good idea to equate hurt feelings with physical abuse. It minimizes actual violence.

Once again, please do the bare minimum of reading, like, one fucking article about how these harassment campaigns etc. actually affect the women who are targeted before you come into a thread bloviating about things you’ve apparently thought about for all of 30 seconds, because I can assure you the resultant PTSD is a very real consequence, and that is just for starters.

It genuinely amazes me how people feel entitled to talk about women’s lives without knowing a single goddamn thing, and yet, clearly, it shouldn’t
posted by schadenfrau at 3:27 PM on June 29, 2019 [41 favorites]


Well, I'm almost to the point where I want to start the Butlerian Jihad a few millennia early. Who's with me?
posted by Defective_Monk at 4:01 PM on June 29, 2019 [11 favorites]


This app is horrifying, and I'm desperately relieved they've taken it down. I'm an English, middle-aged man, and it fills *me* with dread what such easy-to-use tech can be used for; I can't even imagine how it must feel for women already under constant threat from the weaponised misogyny deployed so often and easily online and off, when no-one in authority gives a shit or is part of the problem in the first place.

But banning the tech won't work. Whether it's export of encryption libraries, anti-DRM tools, piracy in general or the latest pressure on end-to-end encryption, every war on maths in the modern era has failed, because of the very nature of the internet. Once that genie is out of the bottle, it's out. Any definition banning it symbolically would just ban countless tools that do have worthwhile uses - machine learning is a rapidly growing part of computer science and is being used in a huge number of fields such as medicine, education, finance, aviation, even farming - not just deep fakes and privacy invasion by social media companies. This specific example is taking techniques that have existed almost as long as cameras have (many of the terms in photoshop originate from film negative alterations) and making them easier and faster to achieve. You'd sacrifice legitimate uses (of machine learning, not deep fakes) without stopping the bad in the slightest.

Far better to go after malicious use of the technology; and I think fraud is a good classification. We've made all sorts of identity theft and misrepresentation illegal without banning the general tools, like photocopiers or printers. And I'd certainly shed no tears over specific examples with no redeeming value - like this app - getting blocked by the big platforms, to at least make it a little harder to obtain.

The bigger problem is getting larger society to take it seriously, to make this sort of anti-women use as widely seen as abhorrent as it really is, instead of some sort of jokey fun toy for men to play with, and to have prosecutions for producing these images actually happen in a US justice system that certainly seems entirely hostile to women from this side of the pond (not that the UK one is much better). I don't even know where to begin with that one; it feels like for every step forward, the right wing takes us two back.
posted by Absolutely No You-Know-What at 4:48 PM on June 29, 2019 [3 favorites]


Just because there are ways a law criminalizing the creation and/or use of fake nudes and other deepfaked imagery or video in a way that causes harm to others or society generally could be written problematically does not mean that all such laws are necessarily problematic either in practical enforcement or Constitutionality.

At a minimum, the distribution of such creations without the consent of the subject should be a crime. As others have pointed out such laws exist and are enforceable (if not actually enforced) against revenge porn. If this kind of thing does not already fall under those laws, where they exist, they can and should be extended to protect people from fake porn. That much shouldn't merit even a raised eyebrow from the most staunch free speech advocate.

As far as the tools go, there is some reasonable concern about ensuring that any such laws are not written in an overbroad way such that they reach well beyond their intended sphere, but at this early stage and in the present context it's not particularly relevant. This thread is certainly not a drafting committee, so refer back to paragraph 1.

The point of all that is to say that I think we all agree that tools like this need to be prevented from being turned into a weapon and marketed to people as a toy, which is pretty much what this app does. If you still don't get it, think of Jarts, only with a much longer range and a slightly less instant impact. Jarts maimed and killed people. Abuse also maims and kills people. That it is more like killing someone with a trap gun than executing them directly makes it no less important. Attenuated (in the temporal sense) harm is still harm.
posted by wierdo at 5:56 PM on June 29, 2019 [2 favorites]


It only takes - to give just one hypothetical example - one outraged parent seeing a deep fake of their child’s school teacher on their child’s phone to launch a career-ruining campaign against that teacher with very real consequences.
This is a great example which the I-thought-deeply-about-this-for-as-long-as-it-takes-to-click-reply dudes should consider before downplaying it: some of this may be kept private but the part which matters will be deliberately targeted to maximize the damage inflicted. I know teachers who’ve had to deal with false claims made by angry students — now imagine that with “proof” which will require expensive forensics or a rock-solid alibi to disprove (hope they didn’t check your public schedule before faking the time…). Repeat for someone whose abusive ex tried to get custody, lost a big job / sale, had a marriage fall through (consider how many women live in cultures / religions where this would be ostracism with a chance of assault/death), etc. So much of that can’t be rolled back easily even if the fake is eventually revealed: a bullied kid who commits suicide can never recover, and a Trump who seeds the Internet with fakes of their opponent probably won’t be removed from office unless they’re really careless.

One of the more insidious aspects is that you can’t easily or completely undo it. The victim will likely have to spend huge amounts of time and energy on the initial attack, but even if that goes well, some people will never accept it, and the victim will be spending the rest of their life wondering whether the person they just talked to is thinking about the attack and whether they still think it was real.

I like the angle of treating this as a form of identity theft, but I wonder whether it might be necessary to have specific laws targeting the tools which make this easy, since in many of the cases it would be hard to show damages serious enough to be the rare exception which gets prosecuted. I don’t love that idea, but it seems like, say, keeping it off of major app stores would help a substantial number of people.
posted by adamsc at 7:07 PM on June 29, 2019 [18 favorites]


Thank you, adamsc.

I think it's also important to consider that these will be used as a form of harassment. Any outspoken woman on Twitter will be at risk of being flooded with faked nudes of herself. The easier it is to do, the more men will do it, the more women will be affected, and the more difficult it will be to control. The ease is why the app is concerning to me, even though fake porn has been around a while.

Sexualized harassment is extremely common - sex is the first weapon many men think to use. They think that nudity and sex degrades women and so they use nudity and sex to humiliate. And then, even if you're the most confident, empowered woman in the world, you have to deal with the fact that the world agrees with them. This can have consequences for how people perceive and treat you even if it's known to be a fake.

It is possibly helpful to think of it as another type of hate speech. Throwing up fake nudes of a woman can have similar motivations and effects as flooding her with sexualized insults, but it's worse because it is also so invasive and can be used as evidence against her by people outside of the exchange.

I am so contemptuous of the men in this thread who seem ignorant of this but still think that they are qualified to comment about how it is just about "hurt feelings" or squeamishness that some man will imagine you naked.
posted by Kutsuwamushi at 7:36 PM on June 29, 2019 [37 favorites]


I am so contemptuous of the men in this thread who seem ignorant of this but still think that they are qualified to comment about how it is just about "hurt feelings" or squeamishness that some man will imagine you naked.
x100

This is the kind of thread where I wish it was easy to tag users with relevant little descriptions next to their usernames to better know to skip their comments permanently in the future. Reddit's RES for metafilter, anyone?
posted by Cozybee at 10:12 PM on June 29, 2019 [17 favorites]


I agree, Cozybee. It is extremely telling that some users have no idea how this will be weaponized against women and want to reassure us that it’s okay because it’s not really violence, what’s the big deal, we could all do this in Photoshop.

I can’t register my disgust with those replies more strongly. Thank you to Kutsuwamushi and others for articulating just how harmful it is and how bullshit it is to dismiss how this will be and already is used to terrorize women.
posted by the thorn bushes have roses at 10:50 PM on June 29, 2019 [19 favorites]


Stricter controls on the use of people’s likenesses are probably going to be necessary, but that doesn’t come down to any particular technology.

<sarcasm>
PUBLIC NOTICE: Any and all images of myself are covered by copyright, and any use without explicit licensing is in violation of copyright law.

There. That'll fix everything.

</sarcasm>
posted by mikelieman at 5:26 AM on June 30, 2019


Men rush to make this about anything other than it is because they don’t want to think about their own culpability. I mean, encryption? What the fuck? The technology is inevitable, so oh well? Fucking biological weapons are inevitable too but we have rules for those, don’t we.

This isn’t about technology. It’s about an ongoing war to control women’s lives and bodies, and to destroy them when they can’t be controlled. It is inherently political. Just as the way we treat rape is inherently political, just as the way we treat “domestic violence” (which probably should just be called torture, no?) is political, just as the way we treat abortion is political, just as the way we refuse to deal with crimes against women is political. These are all things which, at their core, are about controlling and oppressing women.

We have names for this when this is done against an ethnic group or a nation. They’re considered crimes against humanity. The only reason we don’t call it that when it comes to women is because of the misogyny. Because it is so widespread. Because it is here, even in this thread. But make no mistake — these are still crimes against humanity.

And people want to debate what’s real violence. Or just shrug, and say it’s like encryption.
posted by schadenfrau at 7:09 AM on June 30, 2019 [22 favorites]


Vice paid lip service to the idea that it's "horrifying" but it's obvious they didn't really think that because they paid for the app, used it to make photos of non-consenting women, and then posted them online. They don't really think that it's horrifying, just that it's politically correct to disapprove and then - ignore, when it suits their purposes (or their boners).

They really do not appreciate the effects this can have, or they do not care. Too many men are like this.


The journalist credited with this piece is named Samantha Cole. A click on Cole's name reveals a number of articles published recently on Vice on the topics of deep fakes and revenge porn. A second click on Cole's Twitter account reveals that Samantha Cole appears to be female-presenting and has been tweeting a lot about these issues recently. I spent approximately 30 seconds finding this information, and from it, it seems far more likely to me that Cole is trying to draw attention to a genuinely troubling phenomenon, and in publishing the celebrity fakes crossed a line in her zealousness to attract that attention.

I do not understand the desire to leap to a bad-faith read of an article in an FPP when there's information contradicting that read right there.
posted by biogeo at 8:09 AM on June 30, 2019 [1 favorite]


Obviously, since I was the one who leaped to a bad-faith read, I think it's understandable: As a human being one of the things I am good at is pattern matching, even though sometimes there are false positives. I am also very angry.

But in any case, I didn't think that the images were chosen/provided by the author, since they so often aren't, and the editor addressed in the Twitter thread was male. If Samantha Cole did provide the images--well, women aren't immune, and anyone who saw it should have said something.
posted by Kutsuwamushi at 8:39 AM on June 30, 2019 [2 favorites]


it strikes me that it actually makes revenge porn and similar much less harmful.

this is so naive and alogical I wonder what you even think the chain of reasoning is. You are saying that a fake nude image of anyone can be produced without their consent, and therefore nobody will be able to assume the person depicted was originally -- what? Complicit? Guilty? Deserving? the way they would have been if the porn was "real"? and that since misogyny is based on rigorous unemotional processes of deduction and is implicitly fair-minded, nobody will blame or degrade a woman for being visually associated with something that's got nothing to do with her?

or let me try it this way: what do you understand the harm done to women by "real" revenge porn to be?

right now, in the real world, with no technological assistance or apps, we can call any woman a slut. even if she's a virgin and has never had sex with anyone. The slur is entirely decoupled from any objective definition, and everyone knows it, especially people who use it. Is it therefore less harmful because there's no way to know whether it's 'true'? Because the potential to use an easy weapon like that on any woman at any time, and having her know that there's nothing she can do about it, has always been part of the point and part of the reason it's effective as intimidation. hate speech doesn't lose its power because it's not 'true'.

the argument that saturating the market with fakes makes it impossible to humiliate women with them, because we'll all know they might not be real, relies on believing that the harm from real images is in some way deserved, in some way a natural consequence of having allowed the image to be made. as if it is specially not your fault to have a fake made, in a meaningfully different way from how it isn't your fault to have a real picture distributed, and everyone will know it and make allowances. how's that, exactly?
posted by queenofbithynia at 8:39 AM on June 30, 2019 [19 favorites]


We already know women lose their jobs over nude / scantily clad images.

High-school employee fired for porn career vows to fight Quebec school board [City News - April 8th 2011]

CP Rail fires conductor again, this time after revealing social media pictures and posts [CBC News - January 23rd 2018]

Now even if you abstain from taking photographs your current (or future) employer might object to, you still aren't safe. Imagine losing your job because someone faked you up a porn career side-hustle*. I can't believe people are being dismissive of this because they think there's no "physical harm" done.

* obviously it would be preferable if no one was shamed and fired for nude photographs, consensual or not.
posted by Secret Sparrow at 9:56 AM on June 30, 2019 [9 favorites]


the I-thought-deeply-about-this-for-as-long-as-it-takes-to-click-reply dudes

I thought this was interesting. I suspect I'm not the only person on Metafilter who has worked in the computer business for decades. The development of "AI X-Ray Specs" isn't some kind of amazing surprise: the notion of using computers to automagically render naked pictures of people has been around for a long, long time. Walker Percy's novel The Thanatos Syndrome, published in 1987, has a character saying "Those could have been generated on a computer!" when presented with embarrassing photographic evidence[1]. The only real news in the vice.com article is essentially "okay, it finally happened." That some people in this thread seem surprised by this development is, to me, quite surprising.

----
[1] I'm positive you can find earlier references if you dig into science fiction, but I'm pointing out Percy's book because it was mainstream and (I believe) even made the NYTimes bestsellers list when it was published.
posted by doctor tough love at 10:59 AM on June 30, 2019


Maybe don't mistake anger for surprise.

Seriously, I think you need to take a step back and consider how you're participating in this thread. First you think it's appropriate to pontificate about the "real" reason we're concerned - which of course is much more trivial than our actual reasons. Now you think it's appropriate to say we should have seen it coming, like we didn't. It's coming across really badly, because not only are you actually ignorant about important aspects of the problem, you're behaving like you're more knowledgeable about the problem than us. It is not the worst example that I've ever encountered, but it sure ... hmmm ... matches a pattern.

(sci-fi written before the advent of social media is automatically made less relevant)
posted by Kutsuwamushi at 11:57 AM on June 30, 2019 [16 favorites]


Why do you want to talk about a goddamn benefit coming out of this? It's ghoulish.
posted by agregoli at 1:00 PM on June 30, 2019 [4 favorites]


"Benefit," in the sense that squad is using it, feels like a positive term, nearer to "help" or some other mitigation of harm than to "profit".

However, to disagree gently and partly on that point, I do think that being able to know/say/prove that a depiction of oneself is fake is mostly helpful when it comes to the public image, and although that is where most of the material consequences do come from (as reflected in the laws, which do more to protect the public self, and especially the commercialized public self) it is a secondary harm. The first violation is to one's own concept of self, and a lack of a sense of fault or guilt is sometimes helpful and sometimes very much not, as it can increase feelings of being fundamentally helpless.
posted by notquitemaryann at 2:09 PM on June 30, 2019 [1 favorite]


There is no fucking benefit, are you kidding me? Your "logic," if you can call it that, is that when sexual victimization via faked revenge porn becomes commonplace, it will decrease in effectiveness as an attack? What the fuck? As though a) all those women who got attacked in the meantime, well, that's just some broken eggs, and b) as though the real issue is whether or not these images are widely believed to be real, and not the psychological harm that is inflicted on the victim? As though there will be some world, some day, after enough of us have been stalked, humiliated, and victimized, when every time someone sees one they'll go "oh, I know it's probably not REALLY them," as though that makes a fucking difference?

Really?

Do you think it makes a difference that most young men are known to make shit up about the women they sleep with? You think the problem, when they spread sexual rumors about a woman, is that everyone believes it to be absolutely true? It fucking isn't. It's that it's a dog whistle to target that woman. It's a target placed on her back that says "this one, you can hurt." It's that it makes it socially acceptable to feel entitlement to and ownership over her body, which other like-minded pieces of shit then take full advantage of.

Which is what this shit will do to all of us.

How many fucking times do we have to tell the men in this thread that they are the problem before it sinks in? Is there a number, do you think?
posted by schadenfrau at 3:09 PM on June 30, 2019 [11 favorites]


You're also more likely to survive a gunshot wound in America than in other countries, because hospitals there are more experienced in treating gunshot wounds - but that surely can't be claimed to be a benefit of allowing more people to be shot in any gun control debate...
posted by xdvesper at 4:04 PM on June 30, 2019 [5 favorites]


A lot of places now have laws against "revenge porn". It seems likely these laws will be expanded to include fakes. Here's one example (that goes into effect today coincidentally).
posted by L.P. Hatecraft at 4:39 PM on June 30, 2019 [1 favorite]


I think that this harm is mitigated, if not actually completely negated, by the ability of someone to say something like "that's not me."

Maybe in The Contender, but not real life.
posted by fluttering hellfire at 5:13 PM on June 30, 2019 [1 favorite]


Mod note: Comment and a couple replies removed. There's room in the world for academic & enthusiast discussions about image manipulation tech, but this is a "read the room" kind of situation where just carrying on with that, without recognizing the content of the rest of the conversation happening in the meantime, isn't gonna feel like very thoughtful engagement with the thread.
posted by cortex (staff) at 7:15 PM on June 30, 2019 [2 favorites]


Ifdssn9, I admit that I also briefly thought maybe there'd be a benefit. Then I realized, we're assuming a best case scenario where porn produced of women without their consent is so common everyone can safely assume it's without their consent, and therefore... They'll be spared some social ramifications? But the initial condition presupposes a society so misogynistic and hateful towards women that there's absolutely no reason to believe the consequence will be "assume all porn is fake". Knowing men and the ways they hate us, it's more likely having porn faked of you will get the exact same social consequences and the fact that it's fake will be irrelevant. You probably did something to deserve it, you slut. Just like when assault doesn't make you less damaged goods for not having been consensual. And then you go back to that first sentence. And you realize we're assuming a best case scenario of women's exploitation being so commonplace as to be dull.

So, yeah, it was a nice three second "silver lining" until I thought about it.
posted by Cozybee at 10:02 PM on June 30, 2019 [8 favorites]


Please tell me the software nopes out when you point it at a kid.
posted by whuppy at 6:30 AM on July 1, 2019 [1 favorite]


Please tell me the software nopes out when you point it at a kid.

I assume that's one of the use cases "Alberto" created it for actually. There's profit to be had there for scum, so why stop?
posted by Lentrohamsanin at 7:06 AM on July 1, 2019


We are not talking about encryption.

A closer analogy would be to facial recognition systems. Similar underlying technology, same difficulty controlling it since it does little good to ban one piece of software when the problem is a whole class of mathematical techniques that are accessible to anyone, similar authoritarian impulses involved in wanting to simply ban it, and most importantly, lots of people seeing it as a clear and imminent threat that's bound to have disastrous consequences even though it hasn't done much harm yet, while most of the world will continue to be content to go on ignoring it until it's too late.
posted by sfenders at 7:15 AM on July 1, 2019


How many fucking times do we have to tell the men in this thread that they are the problem before it sinks in? Is there a number, do you think?

I'm not going to speak for ifds,sn9 but I think it's really disturbing that you're assuming that they're male just because they don't agree with you entirely. My wife left Metafilter because she was treated exactly this way. Sometimes people can hold nuanced opinions that differ from yours without being a "problem," and sometimes women or nonbinary people can hold feminist viewpoints that disagree with other feminist viewpoints.

I almost commented upthread something about how disturbing the trend is of assuming that women (or nonbinary or not explicitly identified people) whose version of feminism doesn't perfectly match one's own are actually men, but after typing and retyping an attempt I realized there was no way I could write it without getting personally attacked for it, so I gave up. After seeing the exact same dynamic play out targeting ifds,sn9, I wish I had a thicker skin.
posted by biogeo at 7:44 AM on July 1, 2019 [6 favorites]


Yeah, I've been eyeing the comments in this thread with some annoyance; I'm female, and I don't see how in law this is going to be viewed as any different than the existing old boring head-is-pasted-onto-a-porn-shot version. That doesn't mean it isn't creepy, but I'd like to discuss it at tones less than top volume.

The insistence that everyone on the thread who wants to do anything but OutrageFilter screaming must be male is... incorrect.
posted by tavella at 9:05 AM on July 1, 2019 [1 favorite]


I don't think anyone is doing any "OutrageFilter" screaming. Putting down one subset of responses as "screaming" and "top volume" is awfully weird to me; we all have the same text-based way of commenting.

I suspect, but can't confirm, that the difference in what discussion someone wants to have comes down to how closely you or someone you know has been affected by something like this, regardless of gender. Is it merely "creepy"? Do we want to have a conversation analyzing how the software works or what laws cover it? Personally, I find a lot of those comments oblivious and minimizing. I appreciate all the work that's being done to talk about how laws that exist now have failed to protect victims; the pushback on the notion that anyone in this thread is somehow surprised by this app because we're naive; and the discussion about whether there's a benefit in this making revenge porn less effective by muddying what is "real" (which I don't think it will).
posted by the thorn bushes have roses at 2:44 PM on July 1, 2019 [7 favorites]


Ars Technica: Deepfake revenge porn distribution now a crime in Virginia. They added "falsely created images" to an existing revenge porn law. The article also has a roundup of other efforts to curb realistic fakes.

Seems like focusing on distribution of damaging content is more likely to be effective, and to have fewer knock-on effects, than banning the tools.

There is the "thou shalt not '-ster'" precedent, though: Grokster lost to MGM because, while they claimed to be a neutral distribution platform, they were actually encouraging copyright infringement (and were named to evoke Napster). It seems unworkable to ban image processing neural nets in general, but DeepNude's training and marketing show an intent to aid fraudsters.

Classes in beating lie "detectors" and clean-urine dispensers to beat drug tests have been stopped, again due to intent to abet fraud.
posted by ASCII Costanza head at 3:18 PM on July 1, 2019 [3 favorites]


I'm female, and I don't see how in law this is going to be viewed as any different than the existing old boring head-is-pasted-onto-a-porn-shot version. That doesn't mean it isn't creepy, but I'd like to discuss it at tones less than top volume.
I don’t think this is radically different so much as scaled up: it’s easy for a much larger number of people to create, it requires much less skill to make something which requires care to detect, and it’ll be video, which most people find more convincing. Photoshop requires skill and attention to detail to make convincing fakes and video is even harder.

The existing laws haven’t had a great history, mostly due to reluctant enforcement and challenges showing concrete financial harm. The porn aspect also tends to color things but it’s broader than that: think about all of the bullying stories we’ve heard over the years and imagine what’ll happen when kids are expected to deal with an onslaught of fairly convincing imagery of them being gay, atheist/Muslim, drunk, slutty, druggy, etc. or when the RAT guys start telling teenagers what they need to do to avoid copies being sent to their conservative parents. None of that is new but it’s going to go from a trickle of cases which aren’t handled well to a flood unless the legal system starts stepping up to the challenge. It’s not hyperbole to say people will die or have severe life impacts because that already happens — it’ll just be happening to more people and for smaller things since it won’t require the attacker to invest as much effort.
posted by adamsc at 7:35 PM on July 1, 2019 [3 favorites]


I don’t think this is radically different so much as scaled up: it’s easy for a much larger number of people to create, it requires much less skill to make something which requires care to detect, and it’ll be video, which most people find more convincing...
It’s not hyperbole to say people will die or have severe life impacts because that already happens — it’ll just be happening to more people and for smaller things since it won’t require the attacker to invest as much effort.


This is my main concern. Yes, there have always been people capable of making fakes but if you wanted them to be convincing, you generally had to invest time and possibly cash on developing the skills. This will remove that barrier, allowing convincing fakes to be generated en masse by people with few to no skills, to be viewed by the full gamut of people - from those who understand how sophisticated the technology is, through those who believe there's no smoke without fire, to those who want to believe that the female supervisor/teacher/student/next door neighbour/critic/blogger/lawyer/reporter/scientist/politician/childcare worker/whatever is a nasty slut who deserves to have the entire world know about it.

I appreciate that you can say "that's not me", but I can easily think of many situations where horrific and lasting harm may result from this whether or not you can plausibly deny it.

Or as the editor of the Briefing put it, "Ahh, I think the world is too ready for DeepNudes, mate".
posted by andraste at 9:11 PM on July 1, 2019 [3 favorites]


internet fraud detective squad, station number 9: your comments were not what I was referencing, and I apologize if you feel like I was asking you or anyone else for some kind of weaponized misogyny victim credentials before commenting. I am not. There are comments above about how this isn't violence and calling it so minimizes actual violence, some chatting about how easy it is to Photoshop nudes, some other chatty comments thankfully removed — do you find those supportive of this conversation? Because I really don't. And I don't think pushing back against those comments is "screaming" or "OutrageFilter" like tavella does.
posted by the thorn bushes have roses at 10:15 PM on July 1, 2019


Mod note: It will help if everyone takes a moment to consider their responses, and a) ask themselves if they are allowing their feelings of anger to crowd out other voices that may be bringing in aspects that can be a totally valid part of the discussion (whether you agree or disagree) without constituting an attack on anti-deepfake concerns, and on the other hand, b) if your interest in the technical aspects of such an app is diverting or sidelining the topic when it's not a post about "Hey, new tech! Let's talk about the tech!" or "this is not likely to affect me as an individual unlikely to ever be targeted, so I'm going to talk about freedom of speech (or similar) issues, which is more in line with my personal interests." We can talk about most aspects that are a part of the actual posted topic: "the most devastating use of deepfakes has always been in how they're used against women," if people exercise some care to show respect for others in the conversation. Please do not assume someone's personal identity or make the discussion personal about other commenters. Thanks.
posted by taz (staff) at 3:36 AM on July 2, 2019


I suspect, but can't confirm, that the difference in what discussion someone wants to have comes down to how closely you or someone you know has been affected by something like this, regardless of gender.

Yes. In both directions, I think. This was not a good thread to engage with, as a victim, both because of the minimization on the one hand, and because of the vehemence on the other. Fear is fucking loud and often these days it leaves no room for the quiet that sometimes comes after the worst happens. I had to leave, but who cares. People can engage as they like. I do appreciate the moderation that helped remove some of it.
posted by notquitemaryann at 3:11 PM on July 8, 2019 [1 favorite]




This thread has been archived and is closed to new comments