AI-Generated Fake/Revenge Porn is Becoming Increasingly Easy to Do
January 26, 2018 11:59 AM   Subscribe

This VICE article outlines a scary scenario: the ability to easily take video footage of a celebrity and attach their head to a porn star's body in a porn clip. "An incredibly easy-to-use application for DIY fake videos—of sex and revenge porn, but also political speeches and whatever else you want—that moves and improves at this pace could have society-changing impacts in the ways we consume media. The combination of powerful, open-source neural network research, our rapidly eroding ability to discern truth from fake news, and the way we spread news through social media has set us up for serious consequences."
posted by Quiplash (160 comments total) 26 users marked this as a favorite


 
I assume that if videos become easy to fake, we will learn to be as skeptical of them as we are of images in this post-Photoshop world. Obviously that learned skepticism won't come overnight, so there will likely be some real trickery, but do articles like this give a reason why video will be inherently different from photos?
posted by little onion at 12:15 PM on January 26, 2018 [2 favorites]


The problem is twofold. One, there is always lag between a technology being introduced and its impact being widely understood, so there will be a period where the general public will not understand that yes, videos can be doctored easily. Two, people looking to abuse this aren't looking for something that will always pass for real - just something that passes long enough to do damage.
posted by NoxAeternum at 12:23 PM on January 26, 2018 [33 favorites]


Traditionally, video has been more difficult to fake than still photos. A 10-minute video of Chechnya's governor having gay sex (link is a SFW comment on /r/deepfakes requesting that someone make such a video) would have taken 18,000 times as much effort as a single photo, and would probably have been judged accordingly.
posted by jjwiseman at 12:28 PM on January 26, 2018 [2 favorites]


This makes me seethe with a furious anger usually reserved for Trump. I am inclined to start a GoFundMe to support someone making one of these porn videos with the faces of Paul Ryan and Marco Rubio, because as long as women are the victims -- even famous women -- NOTHING will happen to try to stop this. But if a few Republican men have to learn first hand how evil this shit is, so be it.
posted by pjsky at 12:28 PM on January 26, 2018 [45 favorites]


Beyond the simple question of 'will folks understand this is fake,' I think there's a depth of complexity to this that has implications for privacy and bodily autonomy, along with feeding the human-centipede/echo-effect of media.
posted by CheapB at 12:31 PM on January 26, 2018 [5 favorites]


pjsky, I'd chip in
posted by STFUDonnie at 12:32 PM on January 26, 2018 [3 favorites]


Don't worry, it will happen. The use of doctored video for smearing your political opponents is obvious and will certainly be done. It doesn't really matter if it holds up for long, just that it rockets around the world and leaves an impression in people that even a thorough debunking won't completely wipe away.
posted by Sangermaine at 12:32 PM on January 26, 2018 [2 favorites]


This is a real problem in that there is a significant chunk of the American populace that watches 6+ hours of TV a day and thinks everything on it is real.
posted by The Whelk at 12:34 PM on January 26, 2018 [39 favorites]


My god, and I thought this week's new episode of The X-Files was on the nose about modern living being a post-truth, post-conspiracy society, where truth and lies live side by side in such density that nobody can filter one from the other, so in the end none of it matters and people will believe what they want regardless of the actual truth. And then I read this and feel like that episode is already out of date.
posted by Servo5678 at 12:38 PM on January 26, 2018 [9 favorites]


Let's be clear here: the most immediate and consequential outcome of this technology is not that it "could have society-changing impacts in the ways we consume media," but rather that it will have a depressingly familiar impact - namely, as an additional tool in the objectification, harassment, and humiliation of women. It's no coincidence that it is mostly used for putting the faces of famous and powerful women on porn stars' bodies.
posted by googly at 12:40 PM on January 26, 2018 [53 favorites]


It is worth checking out /r/deepfakes (though it is generally NSFW). On the one hand, the community is 95% motivated by putting celebrity faces on porn videos, but there's a notable Nicolas Cage faction that wants to create a version of Lord of the Rings where he plays every character (examples: Cage as 60s Bond, Cage & Cage).

Posters are already talking about the idea of creating community corpora of celebrity and porn star faces. Or using their crushes' faces. Or requesting politically motivated fakes (like the Ramzan Kadyrov one I mentioned in my first comment). This sudden interest from the rest of the world seems to have caught them a little by surprise, and is leading them to at least start to think about the ethics of what they're doing (e.g., in the "How can I create a fake with my crush?" post, there was literally one comment saying maybe it wasn't a good idea, and that comment had been voted down to invisibility)--though so far most of their effort has been spent on defending the practice.

But /r/deepfakes is not the community to worry about anymore. This is so easy to do that it is definitely being weaponized by the worst people, starting as of approximately yesterday.
posted by jjwiseman at 12:41 PM on January 26, 2018 [15 favorites]


This was honestly the first thing I thought of when I saw the demo clips for the iPhone X's "animoji" feature that can map an animated face to yours. Because clearly the next step is mapping your face to someone else's, or someone else copying yours, and then pretty soon there's video of you saying some racist shit you never actually said (for just one example).
posted by dnash at 12:42 PM on January 26, 2018 [6 favorites]


I am inclined to start a GoFundMe to support someone making one of these porn videos with the faces of Paul Ryan and Marco Rubio.

I mean, this would take someone a day to do. It's not even at the level of requiring funding.
posted by jjwiseman at 12:43 PM on January 26, 2018 [7 favorites]


This episode of RadioLab is very relevant here:

http://www.radiolab.org/story/breaking-news/

Simon Adler takes us down a technological rabbit hole of strangely contorted faces and words made out of thin air. And a wonderland full of computer scientists, journalists, and digital detectives forces us to rethink even the things we see with our very own eyes.
posted by evilangela at 12:44 PM on January 26, 2018 [4 favorites]


Imagine how much less the videos we need to prove things are real -- videos of black people being murdered by the police, for example -- will mean than they already do, to the many people who mostly believe what they want to believe, once fakes like this become something people expect.
posted by two or three cars parked under the stars at 12:45 PM on January 26, 2018 [50 favorites]


Even if you do know a video is fake, I can imagine it will still be tucked away somewhere in your subconscious. Say you see several videos of a politician, some of which are fake. At a certain point your impressions from these videos blend together, even if you know which are fake. For example, if a fake video shows the politician slandering Nambia, the next time you come across something about Nambia you'll still subconsciously make that connection.

This could become the ultimate misinformation tool since we have so much history of trusting video footage of an event above other forms of media. If it was on film, it happened. There will be a serious learning curve turning around all of that ingrained momentum.

That said, I do look forward to watching post-game NBA commentary by Mr. Rogers and Mr. McFeely.
posted by hexaflexagon at 12:47 PM on January 26, 2018 [8 favorites]


Combine this with Adobe Voco voice-generation software and you can do wild stuff. Replace every movie star in a movie or TV show with your favorite thespian. In the near future, everyone can be George Lucas. Movie stars could have body doubles, do multiple projects at once, and then add their face and voice during post-production. Movie studios can edit on-screen talent without doing actual re-shoots. If they manage to do this in real time, politicians could have actors do their remote speeches for them. We're getting closer to Waldo in Black Mirror.
posted by Julianna Mckannis at 12:47 PM on January 26, 2018 [5 favorites]


Longer term, I see this as being extremely corrosive to the very concepts of truth, fact and reportage.

Apparently, Trump has already started to claim that the Access Hollywood "grab 'em" comment wasn't him. In two years, he'll have a plausible (but untrue) argument that it was completely faked.

The only solution I see is a hardware-level, blockchain-like validation signature built into every recorder and editor that would prove that the images and audio taken in were original and untouched.
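To make that concrete, here's a minimal sketch of the signing half in Python, using the cryptography library. Everything device-specific is invented for illustration; in a real camera the private key would live in tamper-resistant hardware, not a variable:

```python
# Minimal sketch: a camera signs each chunk of captured video with a
# device-bound private key, and anyone holding the published public key
# can later verify that the bytes are untouched.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# In a real device this keypair would be provisioned at manufacture.
device_key = Ed25519PrivateKey.generate()
public_key = device_key.public_key()  # published by the manufacturer

def sign_chunk(chunk: bytes) -> bytes:
    """Camera side: sign raw video bytes as they're recorded."""
    return device_key.sign(chunk)

def verify_chunk(chunk: bytes, signature: bytes) -> bool:
    """Verifier side: check that the footage matches the signature."""
    try:
        public_key.verify(signature, chunk)
        return True
    except InvalidSignature:
        return False

chunk = b"...raw video bytes..."
sig = sign_chunk(chunk)
assert verify_chunk(chunk, sig)             # untouched footage verifies
assert not verify_chunk(chunk + b"x", sig)  # any edit breaks the signature
```

Note this only proves the bytes haven't changed since the camera signed them, not that the scene in front of the lens was real.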
posted by Bora Horza Gobuchul at 12:51 PM on January 26, 2018 [9 favorites]


How has this not been the plot of a Black Mirror episode yet? Or has it, and I somehow missed it?
posted by pjsky at 12:52 PM on January 26, 2018 [3 favorites]


NOTHING will happen to try to stop this. But if a few Republican men have to learn first hand how evil this shit is, so be it.
And then the Republicans will... ban some math? Or ban software that can be recreated from scratch by anyone who knows that math -- software that has in fact been recreated multiple times within months, can be copied from one computer to another as easily as a pirated mp3, and can be run with just a few GPU-hours or a few CPU-days on any consumer machine?
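To underline how little there is to ban: as the papers and the reddit threads describe it, the core of these face-swap tools is just an autoencoder with one shared encoder and a separate decoder per face. Here's a toy PyTorch sketch of that idea -- the layer sizes and names are illustrative, and real fakes also need face detection/alignment plus those GPU-hours of training:

```python
# Toy sketch of the face-swap autoencoder: a SHARED encoder learns a common
# face representation; each identity gets its own decoder. The swap is just
# "encode person A, decode with person B's decoder."
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 64, 5, stride=2, padding=2), nn.LeakyReLU(0.1),
            nn.Conv2d(64, 128, 5, stride=2, padding=2), nn.LeakyReLU(0.1),
            nn.Flatten(),
            nn.Linear(128 * 16 * 16, 512),  # assumes 64x64 input crops
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(512, 128 * 16 * 16)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.1),
            nn.ConvTranspose2d(64, 3, 4, stride=2, padding=1), nn.Sigmoid(),
        )

    def forward(self, z):
        return self.net(self.fc(z).view(-1, 128, 16, 16))

encoder = Encoder()
decoder_a, decoder_b = Decoder(), Decoder()  # one decoder per identity

params = (list(encoder.parameters()) + list(decoder_a.parameters())
          + list(decoder_b.parameters()))
opt = torch.optim.Adam(params, lr=5e-5)
loss_fn = nn.L1Loss()

# Stand-ins for aligned 64x64 face crops of persons A and B.
faces_a, faces_b = torch.rand(8, 3, 64, 64), torch.rand(8, 3, 64, 64)

# Training step: each decoder learns to reconstruct its own person's faces
# through the shared encoder.
loss = (loss_fn(decoder_a(encoder(faces_a)), faces_a)
        + loss_fn(decoder_b(encoder(faces_b)), faces_b))
opt.zero_grad()
loss.backward()
opt.step()

# The swap: person A's expressions, rendered as person B's face.
swapped = decoder_b(encoder(faces_a))
```

That's the whole trick; the rest is data collection and compute.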

It's not like that sort of thing hasn't been tried, but what makes you think it will work this time? We're looking somewhere in the middle of a difficulty continuum stretching between "ban encryption" (ha ha ha no) and "ban copyright violation" (well, kinda, I guess?) and I'm not sure anywhere along that continuum there exists an outcome we wouldn't consider to be a failure.
posted by roystgnr at 12:52 PM on January 26, 2018 [8 favorites]


How has this not been the plot of a Black Mirror episode yet? Or has it, and I somehow missed it?

It's brought up as a solution to the dilemma posed in "The National Anthem," but it's discarded because the PM assumes the kidnapper will see through the fake somehow.
posted by tobascodagama at 12:59 PM on January 26, 2018 [5 favorites]


It's starting to seem like the era of video, etc. as incontrovertible evidence was a historical blip, rather than the beginning of a new trend. When everything can be faked, will we all return to a pre-industrial complete reliance on word of mouth and reputation? Or is reputation more important now anyway?

(Not like video was always incontrovertible anyway; it's not like "Project Veritas" (in quotes because it's a stupid name) needed this technology to cause real damage.)
posted by shapes that haunt the dusk at 1:02 PM on January 26, 2018 [4 favorites]


The idea that a video can be faked presupposes some concept of truth. Truth is OK in this!
posted by thelonius at 1:05 PM on January 26, 2018 [1 favorite]


When everything can be faked, will we all return to a pre-industrial complete reliance on word of mouth and reputation?

The Internet destroyed that a long time ago.
posted by Sangermaine at 1:07 PM on January 26, 2018 [4 favorites]


I suspect that in certain environments, we'll start seeing mechanisms introduced to verify the provenance of a given video. E.g., the video is signed with a hardware encryption key built into a particular camera, with the camera's location at a given date and time verified via GPS, etc. So that if you're being meticulous -- in court, or working with an insurance company, etc -- you can prove that a video is genuine.

Outside of those environments, we'll just have to accept that video is like a photo, or even like a document someone merely claims a person wrote. Hearsay evidence at best.
posted by fencerjimmy at 1:08 PM on January 26, 2018 [9 favorites]


roystgnr -- I don't know the answer to how to effectively, fairly, and realistically stop this type of crime -- and I do 100% believe it is a crime to use, without permission, someone's face to create pornography, whether it is for public or private consumption. Just like revenge porn should be prosecuted as the crime that it is, creating this type of porn is criminal. It is a horrific and despicable offense. It is a thousand times worse than what the assholes who stole private photos from actresses and then sold and shared them did. My fear is that most men will shrug their shoulders and say "Oh well, what can we do," when we know damn well that if famous men were the ones having their images transferred into a porno, something would get done.
posted by pjsky at 1:08 PM on January 26, 2018 [2 favorites]


...because the PM assumes that the kidnapper will see through the fake somehow.

(The PM gets talked into trying the ruse, but the kidnapper catches that a male porn star was entering the building where the footage was to be shot and calls out the PM for faking it.)

Piece this together with how easy it is to implant fake memories and it makes the subjectivity of reality all the more "real". Heck, of late I've been concerned with how the fact that nobody knows how to turn off mirroring on their cellphone cameras is creating a world of "incorrect" photos on a small scale.

Also, this technology has been around for a few years, something about porn being early adopters of new technology something something.
posted by AzraelBrown at 1:12 PM on January 26, 2018 [1 favorite]


And then the Republicans will... ban some math?

This is...willfully obtuse.

You can of course ban video based on content and criminalize its creation and distribution based on that same content. The obvious analog is child pornography. There's no reason revenge porn, whose only purpose is fraud, harassment, or hurting someone, needs to be legal to create or distribute.

(And without having RTFA yet...I assume creating fictional child pornography this way would still be illegal. Jesus Christ I hope so.)
posted by schadenfrau at 1:12 PM on January 26, 2018 [11 favorites]


videos of black people being murdered by the police, for example

"That could have been anybody they were murdering!"
posted by RobotHero at 1:13 PM on January 26, 2018 [3 favorites]


This is horrifying. In the way that it's horrifying when a dark monster that one's seen lumbering slowly in the distance--and in one's dreams--for years is suddenly at one's door.

I have a serious research hypothesis that is beyond me, but seems relatively well formed: we are witnessing the vast entropic consequences of information technology, and consideration of these consequences is completely absent from its current modes of development. And I mean "entropic" in a strict, thermodynamic sense, not metaphorically. It seems to me that it should be possible to put some sort of limits on the flow of information in networks and the entropy that would have to be generated to keep a super network with massive computational power from becoming, so to speak, a Maxwell's demon, and thereby violating the second law. (A similar approach could be used to critique the notion of the Singularity's god-like AI.)

Any complexity/math nerd MeFites ever seen any papers addressing anything like that? I have done a few cursory searches over the past few years, but so far have come up empty. But then I'm more of a straight-up dynamical systems guy who only (at most) dabbles at the fringes of statistical/complex systems, so maybe I'm looking in the wrong corners of the internet.

I apologize for the abstraction. Maybe (well, almost certainly) that's part of my normal anxiety response. But it seems to me that one thing about this wave of technological development that's really coming into focus is that its externalized costs are not only as vast and societally damaging as any classical industrial process, but also abstract as fuck. That's really, really bad. Compare that to all previous technologies: I mean, industrial energy use and its associated climate change is acknowledged to be a "wicked problem", but its mechanism is pretty simple and clearly elucidated by reductionist physical science. The new wave of AI-coupled information networks is not even that. It's much, much "wicked-er", combining the dynamics of distributed networks with society and consciousness itself, in a very intimate, hard-to-untangle way. I mean, Russian troll farms and their effect on the 2016 elections... that's just the teeniest tip of a very big iceberg.
posted by mondo dentro at 1:16 PM on January 26, 2018 [12 favorites]


I do 100% believe it is a crime to use, without permission, someone's face to create pornography, whether it is for public or private consumption

Should it be a crime to Photoshop a Dildo into someone's hands without their permission?
posted by dazed_one at 1:17 PM on January 26, 2018 [6 favorites]


Should it be a crime to Photoshop a Dildo into someone's hands without their permission?

Why yes, your orange is quite different from the apple we were discussing. You do realize that fake revenge porn will be used to ruin the lives of people who did nothing to deserve it?
posted by NoxAeternum at 1:22 PM on January 26, 2018 [11 favorites]


We've had Photoshop for a long time now, and it hasn't ruined photographic evidence. The sky hasn't fallen. Most Photoshop jobs are amateurish and are easy to spot, by e.g. finding the original photo of the person who was pasted into an unrelated scene, or just by looking at the inconsistent lighting. Video fakes will be the same, at least for the foreseeable future. We trust photographs from reputable news agencies, and are somewhat more skeptical of photos from other sources. It's going to be ok!
posted by chrchr at 1:23 PM on January 26, 2018 [3 favorites]


I do 100% believe it is a crime to use, without permission, someone's face to create pornography, whether it is for public or private consumption

So how come only 38 states have laws that even mention revenge porn?
posted by EmpressCallipygos at 1:28 PM on January 26, 2018 [3 favorites]


We trust photographs from reputable news agencies, and are somewhat more skeptical of photos from other sources

Who is 'we,' though, and what counts as 'reputable,' these days? Given the current media landscape in the US, I'm not feeling super-confident that 'reputable news agencies' is currently working as a benchmark, and dropping the threshold of difficulty for faking video is probably unlikely to help with that problem
posted by halation at 1:32 PM on January 26, 2018 [2 favorites]


It's going to be ok!
posted by chrchr at 4:23 PM on January 26


Thank you, chrchr, for proving my point that until men are victims of this crime, it will be difficult, if not impossible, to stop it.
posted by pjsky at 1:33 PM on January 26, 2018 [17 favorites]


And then the Republicans will... ban some math?

Affirm that photo-realistic video is no longer trustworthy evidence of anything.
posted by Holy Zarquon's Singing Fish at 1:34 PM on January 26, 2018 [2 favorites]


Another thing to consider:

Trump is given extreme benefit of doubt for statements where we have both audio and eyewitness evidence that yes, he said that.

Most revenge-porn victims don't have that, and can be fired at will.

Should it be a crime to Photoshop a Dildo into someone's hands without their permission?

There are multiple things judges and juries are asked to consider when judging defamatory speech. Intent, impact, and credibility all matter. So: no in the case of the South Park movie, yes in the case of revenge porn victims who had to deal with harassment and job repercussions.
posted by GenderNullPointerException at 1:37 PM on January 26, 2018 [15 favorites]


So how come only 38 states have laws that even mention revenge porn?

I'm going to go with -- Because most states have legislative bodies made up of primarily men who don't think revenge porn is a problem.
posted by pjsky at 1:40 PM on January 26, 2018 [22 favorites]


You do realize that fake revenge porn will be used to ruin the lives of people who did nothing to deserve it?

Yes, revenge porn, real or fake, is a terrible thing. The problem, as I see it, is that to one person, swapping the guns in politicians' hands with dildos is a funny, apt political statement, but, with the legal measures you proposed, it could be framed as a crime in the future. How do you legally define pornography with regard to preventing fake revenge porn without potentially creating a barrier to satire or parody using digital image manipulation?
posted by dazed_one at 1:42 PM on January 26, 2018 [2 favorites]


Revenge porn is bad and should be illegal where it is not currently illegal. I don’t see that anything I said in my previous comment suggests otherwise.

Photoshopping dildos into the hands of politicians is parody, not porn.
posted by chrchr at 1:50 PM on January 26, 2018 [5 favorites]


How do you legally define pornography with regard to preventing fake revenge porn without potentially creating a barrier to satire or parody using digital image manipulation?

The law makes such distinctions already. That's what laws do. Draw distinctions. I'm pretty sure that, even given your concerns, you could write something that distinguished between the satirical image you're imagining from, say, someone's kid sister being shown in a gang bang against her will. There are multiple avenues for this: consent is one; public vs. private persons is another; the nature of the act is another; some notion of degree is another.

But... I agree it's a very thorny issue. Just ignoring it isn't an option, but acting on it is fuzzy and confusing, hysteria-prone, and likely only marginally effective. Hence my earlier comment about the larger impact of such tech being a wicked problem.
posted by mondo dentro at 1:52 PM on January 26, 2018 [5 favorites]


When this can be done in real time, then even live media is no longer ground truth.
posted by grobstein at 1:53 PM on January 26, 2018


Apologies - I was originally referencing pjsky's comment that stated: "I do 100% believe it is a crime to use, without permission, someone's face to create pornography, whether it is for public or private consumption".

Putting a dildo in someone's hands through photoshop could easily be construed as creating pornography with their image without their consent. I 100% agree that revenge porn should continue to be illegal.
posted by dazed_one at 1:53 PM on January 26, 2018 [1 favorite]


There was a MeFite who analyzed a photo on nytimes.com and was able to show it had been 'shopped. Will that be possible with this technology? Basically, the Age Of Trump = The End Of Truth. This is hard for me to apprehend.
posted by theora55 at 1:54 PM on January 26, 2018


> Cage & Cage

wow, that's deeply weird, since it's recognizably Andy Samberg under there but with Nic Cage's face.
posted by BungaDunga at 2:00 PM on January 26, 2018 [1 favorite]


This is of course a pre-emptive strategy to delegitimise the Trump pee tape when it's finally released.
posted by meehawl at 2:00 PM on January 26, 2018 [7 favorites]


mondo dentro: "But it seems to me that one thing about this wave of technological development that's really coming into focus is that it's externalized costs are not only as vast and societally damaging as any classical industrial process, but also abstract as fuck. That's really, really bad. Compare that to all previous technologies: I mean, industrial energy use and it associated climate change is acknowledged to be a "wicked problem", but its mechanism is pretty simple and clearly elucidated by reductionist physical science. The new wave of AI-coupled information networks is not even that."

It fundamentally rewrites the enlightenment understanding of communication. There, each idea was hand-crafted meticulously by its maker and set out into the world with love and affection. In that world, caging those ideas is obviously cruel. And how could any person object to hearing a message, when the cost of sending a message is so high?

But now that the costs of message transmission have hit absolute rock bottom, ideas aren't hand-crafted artisanal items anymore. They are like junk food turned out from the dirtiest factory in the world. And why not? You can send out 1000 ideas a day to live in the meme ecosystem, and if one of them makes it, you're a hero--all those Twitter quips, explained. So treating these shit mass-produced ideas as if they deserve the same respect as actual ideas is just wrong.

But of course, those hand-crafted artisan ideas still get made and distributed to family and friends; there's just so much more information out there to confuse things. We have to have some sort of system that can accommodate both this personal communication, which must be sacrosanct and uninhibited, and this public communication, which must be policed so that it does not hurt the culture. It is a very, very hard road, and frankly, if an Oracle told me that this sort of communication explosion/breakdown *is* the Great Filter, I wouldn't have a hard time believing it.
posted by TypographicalError at 2:11 PM on January 26, 2018 [16 favorites]


Putting a dildo in someone's hands through photoshop could easily be construed as creating pornography with their image without their consent. I 100% agree that revenge porn should continue to be illegal.

Here's the thing - if your response to people pointing out how this technology can and will be used to harm vulnerable people is "but what about my free speech", I daresay you've missed the point. This is what I was talking about in this thread when I talked about the deification of free speech - that instead of considering our use of free speech as a means to ends we want to accomplish, we treat it as an end in and of itself and hold to it, no matter the costs. We need to be more critical in how we think about free speech.
posted by NoxAeternum at 2:12 PM on January 26, 2018 [15 favorites]


This is of course a pre-emptive strategy to delegitimise the Trump pee tape when it's finally released.

I mean, will anyone even give a shit about the pee tape anymore once we finally have video proof of Obama eating that baby?
posted by Atom Eyes at 2:18 PM on January 26, 2018 [6 favorites]


if your response to people pointing out how this technology can and will be used to harm vulnerable people is "but what about my free speech", I daresay you've missed the point

Phew! Good thing that wasn't my response! My point was that this potential issue, which has not quite yet become reality, is going to require a very nuanced response. A blanket claim that the use of someone's image without their consent should be illegal (dependent on the definition of pornography) is tricky territory.

If it is necessary to institute further legal protections to prevent the libelous/slanderous use of someone's voice/image in digitally manipulated media, and this is done without chilling effects on the use of satire and parody to speak truth to power, then I am all for it.
posted by dazed_one at 2:28 PM on January 26, 2018 [2 favorites]


video is signed with a hardware encryption key built into a particular camera, with the camera's location at a given date and time verified via GPS, etc. So that if you're being meticulous -- in court, or working with an insurance company, etc -- you can prove that a video is genuine.

Not necessarily.

If there is an encryption key embedded in hardware, attackers can extract it from the chips directly - or use a modified memory card that is not really a card, but an interface to a more powerful processor that can intercept the datastream.

Date, time, and GPS values are simply EXIF metadata that can be edited/updated/removed if necessary.
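To make that concrete: with the piexif library, rewriting a JPEG's timestamp and coordinates takes a handful of lines. The filename and values below are invented for illustration:

```python
# Sketch: EXIF date/time and GPS are just bytes in the file, trivially
# rewritten with the piexif library (pip install piexif).
import piexif

exif_dict = piexif.load("photo.jpg")

# Claim the photo was taken on a different day...
exif_dict["Exif"][piexif.ExifIFD.DateTimeOriginal] = b"2018:01:01 09:00:00"

# ...and somewhere else entirely (degrees/minutes/seconds as rationals).
exif_dict["GPS"][piexif.GPSIFD.GPSLatitudeRef] = b"N"
exif_dict["GPS"][piexif.GPSIFD.GPSLatitude] = ((40, 1), (45, 1), (0, 1))
exif_dict["GPS"][piexif.GPSIFD.GPSLongitudeRef] = b"W"
exif_dict["GPS"][piexif.GPSIFD.GPSLongitude] = ((73, 1), (59, 1), (24, 1))

piexif.insert(piexif.dump(exif_dict), "photo.jpg")  # write back in place
```

So metadata can't anchor provenance on its own - it has to be covered by the signature, and even then you're only as secure as the key.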
posted by jkaczor at 2:31 PM on January 26, 2018 [5 favorites]


If it is necessary to institute further legal protections to prevent the libelous/slanderous use of someone's voice/image in digitally manipulated media, and this is done without chilling effects on the use of satire and parody to speak truth to power, then I am all for it.

And what if there are chilling effects? Are you going to argue that the vulnerable having their lives ruined is the price for free speech? Because we've seen how that argument plays out.

I'll be blunt - I'm okay with losing the freedom to Photoshop Mr. Floppy into a politician's hands if it means that the vulnerable are protected from having their lives ruined because they got on the wrong side of someone with no compunction about harming them, because I feel that in the long run, the ends I use free speech for are best served by making sure people aren't chased out of the discussion by harassment.
posted by NoxAeternum at 2:45 PM on January 26, 2018 [8 favorites]


And what if there are chilling effects? Are you going to argue that the vulnerable having their lives ruined is the price for free speech? Because we've seen how that argument plays out.

I'll be blunt - I'm okay with losing the freedom to Photoshop Mr. Floppy into a politician's hands if it means that the vulnerable are protected from having their lives ruined because they got on the wrong side of someone with no compunction about harming them, because I feel that in the long run, the ends I use free speech for are best served by making sure people aren't chased out of the discussion by harassment.


Fair enough. I guess I feel that a balance needs to be struck; there are many cases throughout history where limiting and cracking down on the apparatus of free speech has had terrible effects on the vulnerable.

In my own experience: I'm Malaysian, and journalists there, both online and print, require permits from the government to be able to publish. I've seen offices get raided and people sent to prison when they dared speak out against the government (or even draw a silly cartoon), so I'm more in favor of nuanced responses than absolutist statements.
posted by dazed_one at 2:57 PM on January 26, 2018 [8 favorites]


Furthermore, people who weren't even journalists in Malaysia have faced horrible repercussions from the government for simple Facebook posts that dared to state an opinion that did not suit the people in power, so yes, I am very wary of the chilling effects that limitations on free speech can have.

This is not to say that I think all speech should be protected! I just think we should be very careful about how we institute legislation that limits that speech. Malaysia wasn't always so authoritarian - it was a gradual shift that I watched happen. I hate to see it happen in other countries.
posted by dazed_one at 3:05 PM on January 26, 2018 [8 favorites]


I wonder if the folks who advocate "uploading their consciousness" as their way to immortality are ever given pause by things like this, and what it would likely mean to such entities (i.e., you'd literally be "hacked" to death).

(Is there a model anyone has proposed for "bug-free" photographs or video in electronic format? Of course, no one has a model for bug-free code of any sort, so I suspect not. Are there any actual proposals? You'd need un-hackable (hah) digital "witnesses", or something akin to them. I can't see how you resolve this without huge privacy issues.

Do our devices leave a fingerprint at all, innately?)
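(From what I've read, sensors actually do: each camera sensor has a fixed noise pattern -- "PRNU", photo-response non-uniformity -- that survives in every shot, and forensic analysts attribute images by correlating noise residuals against a camera's known fingerprint. A crude toy sketch of the idea on synthetic data, nothing like a real forensic tool:

```python
# Toy sketch of PRNU ("sensor fingerprint") matching: average the noise
# residuals of many shots from one camera to estimate its fixed pattern,
# then attribute a test image by correlating its residual against that.
# Real forensics uses wavelet denoising and far more care.
import numpy as np
from scipy.ndimage import gaussian_filter

def noise_residual(img):
    """Image minus a smoothed copy of itself: mostly sensor noise."""
    return img - gaussian_filter(img, sigma=2)

def correlation(a, b):
    a, b = a - a.mean(), b - b.mean()
    return float((a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b)))

rng = np.random.default_rng(0)
prnu = rng.normal(0, 0.05, (128, 128))      # the camera's hidden pattern
shots = [rng.random((128, 128)) + prnu for _ in range(50)]
fingerprint = np.mean([noise_residual(s) for s in shots], axis=0)

same_cam = rng.random((128, 128)) + prnu    # new shot, same camera
other_cam = rng.random((128, 128))          # shot from another camera
print(correlation(noise_residual(same_cam), fingerprint))   # clearly positive...
print(correlation(noise_residual(other_cam), fingerprint))  # ...vs. near zero
```

Of course a determined faker can presumably try to scrub or transplant that pattern too.)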
posted by maxwelton at 3:08 PM on January 26, 2018 [1 favorite]


People have also faced horrific repercussions from the private sector for voicing their opinions, having their lives ruined by people who had no compunction about attacking them for having the temerity to be a marginalized individual speaking out.

This is why we need to think critically about free speech - yes, the government can inhibit free speech, but so can private actors as well.
posted by NoxAeternum at 3:12 PM on January 26, 2018 [3 favorites]


Comparing revenge porn to lulzy and unrealistic photoshop memes isn't a very nuanced argument. There are multiple dimensions on which we distinguish defamation and harassment from satire. Defamation and harassment are not considered to be protected speech acts.
posted by GenderNullPointerException at 3:15 PM on January 26, 2018 [7 favorites]


tbh no one will care enough to make this go away even when men become targets of this crime. Because when it harms men, people will comprehend that the videos are fake, and it won't alter their perception of the victims.
posted by frantumaglia at 3:23 PM on January 26, 2018 [3 favorites]


I'm not "Comparing revenge porn to lulzy and unrealistic photoshop memes". I've clearly stated that laws against libel and slander are important and that revenge porn should continue to be illegal. You've presented a very uncharitable and blatantly incorrect reading of my comments so far. Maybe you should go back and re-read what I said and what I was responding to when I said it.
posted by dazed_one at 3:28 PM on January 26, 2018 [4 favorites]


I am Not A Lawyer, but: defamation is already illegal; libel is already illegal;

Well, not exactly. Libel and slander can be the basis of a suit, but they are generally not criminal offenses in the US or UK, so you can't arrest someone for them, and you need to spend money and time to get them to stop. In the US, libel/slander are really hard to prove (see Wikipedia for details). Making fake porn could only qualify if you were misleading people that it was real. If the video is posted on "deepfakes," this seems like a hard case to make. If someone then took the video and misrepresented it, *maybe* they would be liable, but probably not the maker of the video.

I suspect "obscenity" could be a useful legal avenue here. If courts can convince themselves that fake porn without the consent of the depicted constitutes obscenity, it could be banned. Do any lawyers know if this could be a fruitful avenue?

Personally, I don't think such images can be banned in a way that doesn't also end up restricting legitimate expression, and I am 95% sure such restrictions will eventually be used by powerful people to silence less powerful people. But that may be a sacrifice we have to make. I mean, it totally sucks, but I'm not sure free speech as we currently know it can stay feasible with AI and social media being what they are.
posted by andrewpcone at 3:36 PM on January 26, 2018 [2 favorites]


I'll be blunt - I'm okay with losing the freedom to Photoshop Mr. Floppy into a politician's hands if it means that the vulnerable are protected from having their lives ruined because they got on the wrong side of someone with no compunction about harming them

This. I can’t even conceive of a world where not being able to photoshop penii into people’s hands is such an assault on free speech that it’s worth not outlawing faked revenge porn.
posted by corb at 3:37 PM on January 26, 2018 [6 favorites]


"Putting a dildo in someone's hands through photoshop ..."

"... swapping the guns in politicians' hands with dildos ..."

"... Photoshop a Dildo into someone's hands ..."

Your comparison stinks. You can drop it or continue to dig that hole deeper. But you did explicitly make it.
posted by GenderNullPointerException at 3:42 PM on January 26, 2018 [7 favorites]


This. I can’t even conceive of a world where not being able to photoshop penii into people’s hands is such an assault on free speech that it’s worth not outlawing faked revenge porn.

Welcome to the world of free speech absolutism, where one has to work with monsters and condone harassment in the name of freedom.
posted by NoxAeternum at 3:43 PM on January 26, 2018 [3 favorites]


And then the Republicans will... ban some math?

"You're going to outlaw murder? What, by banning some physics?"
posted by straight at 3:45 PM on January 26, 2018 [14 favorites]



This. I can’t even conceive of a world where not being able to photoshop penii into people’s hands is such an assault on free speech that it’s worth not outlawing faked revenge porn.


I can. The underlying reasoning is "people should be stopped from making this because it is untrue, and it is degrading to me." Ideas of what is similarly degrading can shift, and can be shifted through propaganda campaigns, and that can be selectively interpreted and enforced to target political enemies. The less machinery the government has for declaring content to be illegal, and the less accustomed we all are to the government banning images, the better margin we have from totalitarian control.

To be clear: I do think such fakes, like revenge porn, need to be legally restricted. I believe the harms to those photoshopped outweigh the slippery-slope-to-Stalinism argument. BUT the latter is not completely insane, and treating it like it's just Not A Thing is really careless, in my view.
posted by andrewpcone at 3:49 PM on January 26, 2018 [10 favorites]


In honor of this article I've published a short SF story, kind of about this technology, that I've been trying to sell since 2015: "THE ESTATE OF WILLIAM BRADLEY PITT VS. THE MUSEUM OF NEW REALITY"
posted by The Whelk at 3:54 PM on January 26, 2018 [5 favorites]


Ideas of what is similarly degrading can shift, and can be shifted through propaganda campaigns, and that can be selectively interpreted and enforced to target political enemies. The less machinery the government has for declaring content to be illegal, and the less accustomed we all are to the government banning images, the better margin we have from totalitarian control.

This is almost exactly what happened over a span of about 50 years in Malaysia.
posted by dazed_one at 3:56 PM on January 26, 2018 [4 favorites]


To be clear: I do think such fakes, like revenge porn, need to be legally restricted. I believe the harms to those photoshopped outweigh the slippery-slope-to-Stalinism argument. BUT the latter is not completely insane, and treating it like it's just Not A Thing is really careless, in my view.

Yeah, it is an insane argument, as Matthew Prince demonstrated when he used it as the rationale for why he had to do business with the likes of 8chan and the Daily Stormer. Furthermore, it is too often wielded in a way meant to short-circuit arguments pointing out that real, genuine harm is being done to people by the weaponization of free speech.

If you want people to respect free speech, don't make excuses when it's used to harm people.
posted by NoxAeternum at 4:03 PM on January 26, 2018 [4 favorites]


One of the dimensions on which we distinguish sexual harassment legally is that it's a form of sex discrimination with the effect and likely intent of creating a hostile environment, undermining opportunities for merit-based promotion, and, ultimately, pushing the targets of that harassment out of jobs or education. So sure, if you're photo-shopping sex toys onto a picture of your co-workers and passing them around to peers, you probably don't have a satire or public-interest defense there.

The underlying reasoning isn't, "it is untrue, and it is degrading to me." The underlying reasoning is that revenge porn, whether actual or simulated, is a form of harassment and relationship abuse against private individuals with the effects and likely intent of inciting additional harassment, possible loss of employment, and other negative consequences. Minors, in a miscarriage of justice, can also be prosecuted for taking photos of themselves.

The slippery slope here depends on the notion that only one dimension matters: digital manipulation of the image. (Which is a non-starter, because unless you're working with retro film photography, it's all going to be digitally manipulated to different degrees.) That ignores the half-dozen other dimensions we use all the time to make distinctions between political speech, harassment, and defamation, including in evaluating the text on this page.
posted by GenderNullPointerException at 4:08 PM on January 26, 2018 [7 favorites]


-Yes, revenge porn, real or fake, is a terrible thing.

-I 100% agree that revenge porn should continue to be illegal.

-This is not to say that I think all speech should be protected! I just think we should be very careful about how we institute legislation that limits that speech.

-I've clearly stated that laws against libel and slander are important and that revenge porn should continue to be illegal.


These are all taken from comments I have made in this thread. Corb, can you please show me where I said "it’s worth not outlawing faked revenge porn"?
posted by dazed_one at 4:10 PM on January 26, 2018 [1 favorite]


A) Bojack Horseman beat Black Mirror to using this tech in a storyline

B) In addition to libel/slander, it's possible to bring civil suit in many states for using someone's name/likeness without permission. In some states the standard is "for commercial gain" and in other states the standard is "for personal gain."

C) I read the /r/deepfakes thread where the a-hole wanted to use the face of his crush (who he claims to love) in the faker app. He flipped right into rationalization/denial/no-big-deal mode as soon as someone suggested that this was an awful, hurtful thing to do to a woman.

There were a couple of "don't do this" comments with scores above 0 when I looked, but the majority of the "here's how" comments were very highly upvoted. We're past the point of putting this tech genie back in the bottle. We need better men and we need to hold all of these types responsible at every social level. I'm not holding my breath on it. A tiny $%&ing start would be for reddit to stop empowering/hosting hate-crime subreddits once and for all.
posted by Skwirl at 4:14 PM on January 26, 2018 [12 favorites]


This whole thing about trusted cameras that sign their videos - it seems like it could be defeated, no matter how tamper-proof the camera, by just working out a setup that properly projects the generated video into the trusted camera's lens.
posted by save alive nothing that breatheth at 4:15 PM on January 26, 2018 [1 favorite]


It's the line I called you out on:

If it is necessary to institute further legal protections to prevent the libelous/slanderous use of someone's voice/image in digitally manipulated media, and this is done without chilling effects on the use of satire and parody to speak truth to power, then I am all for it.

You said that you were all for outlawing revenge porn - as long as you weren't stopped from photoshopping sex toys into the hands of politicians. Which is why I made my statement about what happens if there was a chilling effect - because it seems like then you do have a problem.
posted by NoxAeternum at 4:16 PM on January 26, 2018


Might need to work on your reading comprehension there. I said "If it is necessary to institute further legal protections". Revenge porn is already illegal where I am, and this is a good thing.
posted by dazed_one at 4:20 PM on January 26, 2018


I'll go even further. Hand-drawn images can be tools for harassment. Images that were taken with consent can be used for harassment. Fiction can be used for harassment. The key dimensions there are the probable effect and/or intent of discrimination, and the lack of a public interest.

Focusing on the content and medium isolated from how that work is used when it's published or distributed is denying a key part of the picture.
posted by GenderNullPointerException at 4:29 PM on January 26, 2018 [5 favorites]


revenge porn is not illegal in all jurisdictions, and things put on the internet have a poor tendency to respect said jurisdictions' borders, so...


Come on, clearly dazed_one supports making revenge porn illegal, presumably including in those jurisdictions where it is not. The "further" in dazed_one's comments clearly refers to "further than those prohibitions that I have already endorsed," not "in additional jurisdictions where prohibitions I have endorsed are not in effect."

NoxAeternum and GenderNullPointerException, I get the sense you are conflating legitimate concerns over political freedom, such as those expressed by dazed_one, with free speech fundamentalism. Yes, harassment needs to be illegal: dazed_one is not disputing that. Their concern is about how exactly we make those rules, and that is a legitimate issue.

I'll go even further. Hand-drawn images can be tools for harassment. Images that were taken with consent can be used for harassment. Fiction can be used for harassment. The key dimensions there are the probable effect and/or intent of discrimination, and the lack of a public interest.


That is completely insane. "Discrimination" has never been the basis of an exception to the first amendment. The reason racist speech at work is not protected is not because of special constitutional consideration against "discrimination"--it is because speech made in a work context is not protected speech. For a speech act to be "harassment" such that it loses freedom of speech protection, there are tough legal standards that have to be met (see, for example, Snyder v. Phelps), and that is a damn good thing. The fact that a speech act is not in the "public interest" is not a basis for its being unprotected, and I cannot imagine a society in which this reasoning is accepted that has anything resembling a free press.

Seriously people, the fact that Nazis cry "free speech" does not mean that any objection to left-leaning restrictions on expression is inherently Nazi-like. Free speech is a good thing, and being protective of it is a good thing, and if we let the right own that, we will lose a lot of moral high ground.
posted by andrewpcone at 4:42 PM on January 26, 2018 [8 favorites]


NoxAeternum and GenderNullPointerException, I get the sense you are conflating legitimate concerns over political freedom, such as those expressed by dazed_one, with free speech fundamentalism. Yes, harassment needs to be illegal: dazed_one is not disputing that. Their concern is about how exactly we make those rules, and that is a legitimate issue.

This has always been a discussion about harassment. How do photoshopped dongs inform a discussion about harassment?

That is completely insane. "Discrimination" has never been the basis of an exception to the first amendment. The reason racist speech at work is not protected is not because of special constitutional consideration against "discrimination"--it is because speech made in a work context is not protected speech.

The whole point of sexual harassment as part of the civil rights acts is that hostile environment harassment is a form of workplace discrimination.
posted by GenderNullPointerException at 4:50 PM on January 26, 2018 [3 favorites]


The whole point of sexual harassment as part of the civil rights acts is that hostile environment harassment is a form of workplace discrimination.


Yes, and the reason that the First Amendment is not a defense for someone accused of workplace discrimination is that speech made in a work context is not protected speech. Discriminatory speech is not constitutionally different from non-discriminatory speech.
posted by andrewpcone at 4:59 PM on January 26, 2018 [2 favorites]


Let's say that a company exists that will accept payment in cryptocurrency, and they'll rent supercomputing in the cloud to produce bespoke, instant revenge porn in a matter of minutes, at astronomical cost. They'll be hosted on the dark web, and nobody will know who operates the company, or how much money they make. But they'll make a lot of money, and use it to interfere in elections around the planet. Many of this company's employees will be self-taught from the /r/deepfakes threads. It will have something to do with Wikileaks and 4chan.

Who’s ready for the midterm elections?
posted by oceanjesse at 5:00 PM on January 26, 2018 [2 favorites]


This has always been a discussion about harassment. How do photoshopped dongs inform a discussion about harassment?

Huh? I'm not saying they do, and no one has posted a photoshopped dong on this thread, nor has anyone stated or implied that photoshopped dongs or anything close to them inform discussions of harassment.

Photoshopped dongs may be (and often are) used for non harassing satire. Showing prominent people, like politicians, in comedically demeaning illustrations is a time honored tradition, of which I am personally a fan. Would the public sphere suffer irreparably if we banned them? I'm not sure, but I lean toward "yes slightly."
posted by andrewpcone at 5:05 PM on January 26, 2018


You know that creepy feeling you get when someone is behind you reading over your shoulder?

I have that creepy feeling right now, but it's a bunch of the people of the future reading this thread and thinking "Oh, yeah. This was pretty much the beginning of the end, and you can tell MeFi is starting to realize it."
posted by kinsey at 5:20 PM on January 26, 2018 [7 favorites]


Photoshopped dongs may be (and often are) used for non harassing satire. Showing prominent people, like politicians, in comedically demeaning illustrations is a time honored tradition, of which I am personally a fan. Would the public sphere suffer irreparably if we banned them? I'm not sure, but I lean toward "yes slightly."

Except that the ban isn't being proposed just to ban photoshops of sex toys. It's being proposed to ban harassment of marginalized people (often women, in this case) who have fake revenge porn images made of them in order to harass them and force them out of the societal conversation. So whatever harm is done to the social discourse because people can no longer put Mr. Floppy in some politician's hand must be counterbalanced against the boost said discourse now receives because people who had been pushed out are free to return.

The fact that you repeatedly avoid doing that balancing is a long-running issue when discussing this sort of matter, because the benefit of a group no longer being harassed, and thus able to rejoin the discourse, is routinely dismissed.
posted by NoxAeternum at 5:21 PM on January 26, 2018 [7 favorites]


A dildo isn't pornographic until and unless it's being used to stimulate someone*. If creating a video of a politician where the gun in their hand has been replaced with a dildo is considered pornographic, then the person who took the original footage with the gun shot a snuff film.

* Though I'll allow that as anatomical fidelity of said dildo approaches 1, its very presence may be considered stimulating on a per-person basis.
posted by MarchHare at 5:31 PM on January 26, 2018


The suggestion on the table is that digitally composited revenge porn be considered equivalent to other forms of revenge porn.

But, I'm getting a bit tired of people using Photoshopped dongs as a test case for political speech, denying that's a suggested test case, and then using it as a test case again. So have at it.
posted by GenderNullPointerException at 5:36 PM on January 26, 2018 [4 favorites]


Guys, while all y'all are having an intellectual argument about whether a photoshopped dildo is or is not free speech, there are guys in their basements compositing their ex-girlfriends' heads onto amateur porn videos and mailing them to their new bosses for revenge.

Could we maybe take care of the bigger fish right now?
posted by EmpressCallipygos at 5:38 PM on January 26, 2018 [18 favorites]


This whole thing about trusted cameras that sign their videos - it seems like it could be defeated, no matter how tamper-proof the camera, by just working out a setup that properly projects the generated video into the trusted camera's lens.
I'm sure there will be an arms race with countermeasures — multiple lenses with different angles, measuring the scene in infrared like face recognition systems, etc. — but I think it'll hit a common theme: security isn't about one part as much as the whole system. The thing that would make cameras truly trusted would be having many of them, preferably run by different people, so the overlapping views would make spoofing much harder, and doing things like publishing the digital signatures somewhere widely replicated, where they can quickly be countersigned by other parties[1], with each step adding more details which an attacker has to subvert perfectly and very quickly to avoid being detected.

The big problem with this is that many of the most damaging scenarios don't change at all: it could help avoid someone being framed for a crime in a public location, but if you share video allegedly from someone's bedroom, nobody expects there to be a hardened camera setup there, and few people would even think to check whether the file was signed at all — and it's depressingly likely that many people would believe it even if it carried a warning, as long as it was sufficiently salacious or damaging to the other side.

1. Blockchain without the economic fantasies, basically
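Concretely, footnote 1's idea is just an append-only hash chain: each published entry commits to everything before it, so quietly altering an old signature breaks every later link, and widely replicated copies of the log make that visible. A toy sketch with plain hashlib (all names illustrative):

```python
# Toy "blockchain without the economic fantasies": an append-only hash
# chain of video signatures. Tampering with any published entry breaks
# every later link, so replicated copies of the log expose quiet edits.
import hashlib
import json
import time

def entry_hash(entry: dict) -> str:
    return hashlib.sha256(json.dumps(entry, sort_keys=True).encode()).hexdigest()

log = []

def publish(video_signature: str) -> None:
    """Append a signature, chained to the hash of the previous entry."""
    prev = entry_hash(log[-1]) if log else "0" * 64
    log.append({"sig": video_signature, "prev": prev, "ts": time.time()})

def verify_chain(entries: list) -> bool:
    """Recompute every link; an altered entry invalidates its successors."""
    prev = "0" * 64
    for entry in entries:
        if entry["prev"] != prev:
            return False
        prev = entry_hash(entry)
    return True

publish("sig-of-clip-1")
publish("sig-of-clip-2")
assert verify_chain(log)
log[0]["sig"] = "sig-of-a-doctored-clip"  # rewrite history...
assert not verify_chain(log)              # ...and the chain no longer checks out
```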
posted by adamsc at 5:56 PM on January 26, 2018 [1 favorite]


Pics, AND it didn't happen.
posted by jenfullmoon at 5:58 PM on January 26, 2018 [12 favorites]


various religious groups have banned the depiction of humans in images, and i’m despairing enough that it seems like an appealing idea.
posted by vogon_poet at 7:07 PM on January 26, 2018 [4 favorites]


Sorry, I've just come back from feeding the babies to say that I agree with NoxAeternum's comment above mine, which I sadly didn't see before I hit post. On this side of the screen we'd been talking about how terrifying the seeming normalisation of US politicians brandishing firearms as a campaign tactic appears to us in Australia, and the absurdity that using tech to photoshop those images into anything else could be *more* offensive, and I'll admit to having skimmed the thread. I agree that there are much bigger issues, with horrifying personal implications for people, to be balanced out here (and hell, Australians don't have protected free speech in the first place), so I'd just like to apologize for any distraction my comment caused, and am going to nap while the children are sleeping.
posted by MarchHare at 7:09 PM on January 26, 2018 [2 favorites]


We trust photographs from reputable news agencies,
Who is 'we,' though, and what counts as 'reputable,' these days? Given the current media landscape in the US, I'm not feeling super-confident that 'reputable news agencies' is currently working as a benchmark,


I also ponder who this "we" is.

Because I remember Jon Stewart's The Daily Show having more than one segment showing news orgs pretending to be in separate remote locations when they were actually right next to each other.

The Milwaukee cop shooting is another example: the selective broadcast of a call for peace, versus the actual statement ("be peaceful in our neighborhood; go burn down their more expensive stuff"), is news acting as a filter.
posted by rough ashlar at 7:44 PM on January 26, 2018


Free speech is a good thing, and being protective of it is a good thing, and if we let the right own that, we will lose a lot of moral high ground.

No, we lose a lot of moral high ground when we allow principles to justify harms. As I keep saying over and over, if you allow a principle to become a justification for abuse and harm, don't be surprised when people stop seeing it as legitimate. If you really want to protect free speech, don't let it be used as a shield to protect those who would harm others with it.
posted by NoxAeternum at 7:47 PM on January 26, 2018 [6 favorites]


There are still a rather large number of people who have not yet learned, and likely never will learn, to be skeptical of photoshopped or recontextualized photos -- otherwise, there wouldn't be such a need for Snopes and other fact-checkers to dismiss false (but popular!) claims.

True, but I don't think there is a whole lot of overlap between this set and the set likely to download questionable celebrity porn videos.
posted by fshgrl at 8:52 PM on January 26, 2018


Seriously people, the fact that Nazis cry "free speech" does not mean that any objection to left-leaning restrictions on expression is inherently Nazi-like. Free speech is a good thing, and being protective of it is a good thing, and if we let the right own that, we will lose a lot of moral high ground.

It's sort of nauseating that this is now a position that has to be earnestly argued.
posted by Sebmojo at 9:08 PM on January 26, 2018 [3 favorites]


We're getting closer to Waldo in Black Mirror.
Waldo is currently our President.
posted by vorpal bunny at 9:12 PM on January 26, 2018 [5 favorites]


Revenge porn is horrible, but that may be one of the least dangerous uses for this tech. Imagine activists and protesters fake-videoed committing crimes to discredit their movement, or politicians fake-videoed giving a secret speech to some hated audience, or a cycle of retribution where everybody fake-videos everybody else until nobody knows what to think or believe with their own eyes. And libel laws won't discourage it--those laws were designed to corral the printed lie, something that is easy to discard as hearsay and opinion. Video is different. It will be a generation--a looong one--before the average person is comfortable with the post-post-post-modern notion of distrusting his or her own eyes yet is still able to intelligently converse about the topic of a video without panicking that nothing is real. I wish us all luck.
posted by wibari at 9:12 PM on January 26, 2018 [11 favorites]


googly: Let's be clear here: the most immediate and consequential outcome of this technology is not that it "could have society-changing impacts in the ways we consume media," but rather that it will have a depressingly familiar impact - namely, as an additional tool in the objectification, harassment, and humiliation of women. It's no coincidence that it is mostly used for putting the faces of famous and powerful women on porn stars' bodies.

You are not wrong, but I don't imagine for a second that male celebs are going to escape this either. The ability to put two (or more) actors' faces on gay male porn bodies in motion -- and I say this with all love for my fellow slash fans, but I know them -- guarantees that slash fandom will be making movies of their favorite pairings immediately. I'm sure it will eventually have to move waaaay underground (where slash fandom lived until really really recently), but male tv and movie stars are going to find out some of what their female costars go through first-hand.
posted by tzikeh at 9:18 PM on January 26, 2018 [2 favorites]


(I put my story on Projects and some of it is about uhh the future of fandom with this)
posted by The Whelk at 9:52 PM on January 26, 2018


Sebmojo: "It's sort of nauseating that this is now a position that has to earnestly argued."

A physicist learning of Einstein's theories might well say that it's nauseating that Newton's laws have to be defended against this interloper. Free speech has historically been fine to treat as an absolute because the externalities of doing so were small. At different scales, things become different, and at the communication scale where we now operate, the externalities of making speech absolutely free are enormous, as the people in this thread are trying to tell you.
posted by TypographicalError at 11:26 PM on January 26, 2018 [3 favorites]


At different scales, things become different, and at the communication scale where we now operate, the externalities of making speech absolutely free are enormous, as the people in this thread are trying to tell you.

Literally no one on this thread has argued for free speech as an absolute. Literally everyone who has expressed a position has supported the banning of revenge porn, and no one has said that shouldn't extend to AI generated revenge porn.

What sebmojo was agreeing to was part of my post above:

Seriously people, the fact that Nazis cry "free speech" does not mean that any objection to left-leaning restrictions on expression is inherently Nazi-like. Free speech is a good thing, and being protective of it is a good thing, and if we let the right own that, we will lose a lot of moral high ground.

To be perfectly clear:
We should be protective of freedom of speech. That does not mean it gets prioritized over everything else. That position is not free speech absolutism or even close.
posted by andrewpcone at 11:37 PM on January 26, 2018 [5 favorites]


the use of satire and parody to hold truth to power

I've heard this notion all my life, and I still can never quite agree with it. Satire and parody certainly hold ridicule and mockery to power, and that has often been nifty-keen, but, in my opinion, they have been equally effective (or ineffective) whether truthful or not. I guess I am saying what a wonderful, glorious world it would be if free speech and truth had something to do with each other.

Wonderful. Glorious.
posted by Chitownfats at 12:11 AM on January 27, 2018 [2 favorites]


Never underestimate the ability of men to somehow turn a discussion about real danger and harm towards women into a discussion of their right to do something with a phallic object (or a representation of one). Unbelievable.
posted by sockermom at 12:20 AM on January 27, 2018 [17 favorites]


We should be protective of freedom of speech. That does not mean it gets prioritized over everything else. That position is not free speech absolutism or even close.

And yet whenever we have discussions about the use of free speech as a weapon to harm the weak, the discussion always winds up getting turned around to how creating protections for the people being harmed would harm free speech, as if an entire group of people being harassed into silence were somehow not harmful to free speech in and of itself. That is a major problem with how we talk about things, because it winds up coloring the argument improperly.
posted by NoxAeternum at 7:17 AM on January 27, 2018 [8 favorites]


pjsky: " I am inclined to start a GoFundMe to support someone making one of these porn videos with the faces of Paul Ryan and Marco Rubio, because as long as women are the victims -- even famous women -- NOTHING will happen to try to stop this."

Slash is like 90% famous men and there hasn't been any serious move to ban it.
posted by Mitheral at 7:47 AM on January 27, 2018


andrewpcone: "Literally no one on this thread has argued for free speech as an absolute. Literally everyone who has expressed a position has supported the banning of revenge porn, and no one has said that shouldn't extend to AI generated revenge porn."

This is a bit disingenuous. Everyone is carving out exceptions from the ideal of pure free speech. My point is that that's a terrible place to begin your thoughts--there's nothing good about pure free speech.
posted by TypographicalError at 7:55 AM on January 27, 2018 [3 favorites]


Fake porn showing famous men having sex with women won't harm men because we don't degrade men for having heterosexual sex. Fake porn showing famous men having sex with men won't harm men because everyone will be immediately suspicious and jump to declare it "fake news" and a product of the homosexual agenda. The problem is men are given the benefit of the doubt, while lies about women are considered obviously true and confirmation of hatred that's already there.
posted by brook horse at 8:06 AM on January 27, 2018 [12 favorites]


1. This has been an issue for decades, and in terms of the political implications none of the new technology really changes anything but the speed at which it can be produced. See: Moon Landing, JFK assassination, etc. etc. Every single example given in the article could have been faked more convincingly with a talented actor and a props/makeup/scene budget. That stupid Michael Crichton Japanophobia book from the early nineties spent some time mulling over the consequences of the problem going digital.

2. Yes, your camera has a fingerprint, and these things get used in digital forensics all the time (see the sketch after this list).

3. In terms of the revenge porn implications a good chunk of the North American population seems to have lapsed into a flabbier version of the toxic masculinity cult from The Postman to the point where our best bet is to weaponize a porn/Call of Duty/football VR mind-virus to at least keep them in their fucking basements while we're over here trying to have a society.
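
To make point 2 concrete, here is a toy sketch of the sensor-fingerprint (PRNU) idea, not a real forensic pipeline. It assumes numpy and scipy are available and that the frames are already aligned, same-size grayscale arrays; real forensics work is far more careful about denoising, alignment, and statistics.

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def noise_residual(img):
        # Residual = image minus a denoised version of itself; what's
        # left is dominated by sensor noise rather than scene content.
        img = np.asarray(img, dtype=float)
        return img - gaussian_filter(img, sigma=2)

    def fingerprint(images):
        # Averaging residuals over many frames from the same camera
        # cancels scene-dependent noise and leaves the sensor pattern.
        return np.mean([noise_residual(i) for i in images], axis=0)

    def correlation(a, b):
        a = (a - a.mean()).ravel()
        b = (b - b.mean()).ravel()
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

    # A questioned frame whose residual correlates strongly with a known
    # camera's fingerprint likely came from that camera; a composited
    # region tends to correlate poorly with the rest of the frame.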
posted by aspersioncast at 8:08 AM on January 27, 2018 [5 favorites]


So the consensus appears to be that it might be legal to develop DeepFake 4.0, and it might be legal to download it, type "deepfake comedy_movie.mkv home_video.mkv", and laugh, but typing "deepfake dirty_movie.mkv famous_actor.mkv" will be illegal, that this will be as easy to prosecute as murder, that this is an adequate solution to the problem, and that it is "willfully obtuse" to suggest otherwise?

In that case, I bow out. The intelligence of such a rebuttal is matched only by its politeness. The responses to my "encryption" and "copyright infringement" examples also deserve careful thought proportional to their length.
posted by roystgnr at 8:11 AM on January 27, 2018 [1 favorite]


Your Honor, we cannot accept this photograph in evidence. While it purports to show my client in a hotel bedroom with a woman not his wife, there is no way to prove the photograph is real. As we know, the craft of digital retouching has advanced to the point where a "photograph" can represent anything whatever. It could show my client in bed with Your Honor.

To be sure, digital retouching is still a somewhat expensive process. A black-and-white photograph like this, and the negative it's made from, might cost a few thousand dollars to concoct as fiction, but considering my client's social position and the financial stakes of the case, the cost of the technique is irrelevant here. If Your Honor prefers, the defense will state that this photograph is a fake, but that is not necessary. The photograph could be a fake; no one can prove it isn't; therefore it cannot be admitted in evidence.

Photography has no place in this or any other courtroom. For that matter, neither does film, videotape or audiotape in case the plaintiff plans to introduce in evidence other media susceptible to digital retouching.
— Some lawyer, any day now


— Stewart Brand, The Media Lab: Inventing the Future at MIT, 1987
posted by bz at 8:41 AM on January 27, 2018 [5 favorites]


We all know, in the world of persuasion, it only takes one viewing to inform or disinform. I think the most likely product from this, as it makes little bits of money everywhere, is plain old porn. People will be able to pay for personalized films of anyone doing sexual things with and to them. Ruining the lives of politicians? They seem to do a good enough job of that themselves. Ruining privacy? Privacy has become an expensive delusion. Soon enough we will be happy for our burkas, and the blanket forts where we have sex.
posted by Oyéah at 12:00 PM on January 27, 2018 [1 favorite]


Everyone else: "This technology Will be used to destroy women's lives and Might be used to basically destroy the fabric of civilization as we currently understand it, I think that's bad."

MeFi: "But like, how *detailed* is this photoshop dildo?"
posted by We put our faith in Blast Hardcheese at 4:04 PM on January 27, 2018 [6 favorites]


I'm perfectly fine with people starting from a position of everything is permitted and then carving out exceptions for things that should be forbidden. As long as everyone gets that it will be inevitable and doesn't go, "Ow! My slippery slope!" when we start carving.

I think the important distinction isn't whether it is pornographic, but more akin to libel or slander. How likely is it that people will believe it's real, and what is the damage if they do?

So the dildos will still be fine, because a large part of the point is the absurdity, which makes it unbelievable.
posted by RobotHero at 6:03 PM on January 27, 2018 [5 favorites]


Wired has an article talking to Mary Anne Franks about the legal aspects.
posted by lucidium at 6:17 AM on January 28, 2018


That Wired article is great, thanks for posting it. It appears as though a firm legal solution is frustratingly far off. As articulated above, that's very likely because it's a problem that almost exclusively affects women, so people would rather go off on high-minded rants about freedom of speech littered with hypotheticals than do anything practical to resolve a very real, actual problem.

The article goes into this, but I think that platforms could also do intermediary things that would help: Reddit could delist the sub and ban any similar subs (creeps will congregate elsewhere, but the incidental exposure to the less-than-motivated will lessen), Github could delist the codebase and take proactive steps to ban replacements, porn tube sites could take the same proactive steps to block the videos, porn studios could be vigilant about DMCAing videos that use their content as the base... the list goes on. I don't think any of these things will happen, because of a confluence of chasing money and keeping creeps happy so they keep using your platforms, alongside the aforementioned high-minded hypothetical defense of freedom of speech.

In terms of a long-term solution, I was curious about one's right to one's image, which appears to be covered in America under "rights of publicity". The Wired article goes into this a bit too, but it seems to largely cover commercial contexts (most cases that have been tried involved a celebrity suing over unauthorized use of their image in an advertisement), and it's also state-by-state, with no reliable federal statute in place. Apart from that, getting back to the scummy nature of most platforms, I think it would be heavily lobbied against by the tech industry, because I can see it setting a precedent for face-recognition capture. If I could, I would officially block Facebook and Google from using face capture on my content at any level. Google's recent classic-art stunt indicates that they have this technology and are actively looking to hone and refine it, and I assume that wherever Google goes they are followed by Amazon, Facebook, etc.

People have articulated this very nicely above, but I think it needs to be repeated: the toxicity of Twitter, the festering of conspiracy theories on insular social sites, and the injured white male identity that have given fascism a foothold in so many online communities have been readily apparent to a number of activists since at least 2013. That is when Gamergate blew up, but obviously these problems existed long before then. By ignoring stories of persistent harassment in the name of protecting free speech, that problem is now everyone's problem, because the GG focus on marginalized populations was a sort of testing ground for expanding out into the mainstream. Similarly, harassment of women and minorities was a persistent problem all through this past decade, but it only seemed to become a "real problem" when those same chuds turned out in force for Trump. This issue appears to be trending in the same direction - especially given the way it's been discussed on a site that is often positioned as, and holds itself up as if it were, a left-leaning progressive community.
posted by codacorolla at 10:50 AM on January 28, 2018 [6 favorites]


There was also Crispin Glover's suit against Universal Pictures when they used George McFly prosthetics on a different actor in Back to the Future Part 2. Which again is a matter of his likeness being used without his permission. But how does this approach scale when these capabilities are put in the hands of larger numbers of anonymous internet people who don't have for-profit companies to make them easy targets to sue?
posted by RobotHero at 1:01 PM on January 28, 2018


So the consensus appears to be that it might be legal to develop DeepFake 4.0, and it might be legal to download it, type "deepfake comedy_movie.mkv home_video.mkv", and laugh, but typing "deepfake dirty_movie.mkv famous_actor.mkv" will be illegal, that this will be as easy to prosecute as murder, that this is an adequate solution to the problem, and that it is "willfully obtuse" to suggest otherwise?

We already make much finer distinctions than that with regard to, for example, whether the star of a pornographic movie was 17 or 18 at the time of filming, so I don't understand your objection here.
posted by straight at 1:11 PM on January 28, 2018 [1 favorite]


I would start by suing the platforms, and then letting the one thing that capitalism is good for (companies looking out for their own asses) take care of the rest. If you make it in a tube site or a gif hoster's best interest to be diligent about blocking content like that, then you cut off the distribution channels. There are already similar systems in place - it's why Facebook isn't overrun with Stile Project type gore and shock videos (although there definitely is a system of labor exploitation in how Facebook manages that at scale).

Another analogy could be drawn to fraud charges on a credit card. Card issuers are on the hook for fraudulent charges, not the consumer. If someone impersonates you to steal money, by law, credit card companies have to pay for that out of their own pockets. The reason card services even have fraud-prevention programs is because of this. Imagine a system where each individual were required to protect their own credit history.

What likely happens is that the producers get chased to shadier platforms, but you reduce the potential for harm in the mainstream platforms where most people live their lives. However, I would still call that an important first step.
posted by codacorolla at 1:18 PM on January 28, 2018 [2 favorites]


People have been gluing celebrity heads / their exes' heads / their neighbours' heads on porn jpgs since forever. I don't get what difference it makes when the jpgs start moving. I'm no more likely to completely mistrust every movie I see because this is a thing now than I am to mistrust all still photos because there's Photoshop. If a picture or a movie matters - if there's something really important at stake - then there'll be a billion eyes on it and any flaws (not just in the images themselves, but in where and how they were allegedly captured) will be quickly exposed.

If I were to suspect state actors or even wealthy private citizens of doing this with movies then I'd have to have done so since 1994 when Lt Dan lost his legs and Brandon Lee came back from the dead.

I think anybody facing civil or criminal charges for this is about as likely as it is now for still images, which is to say not very likely at all. You'd have to do something like shoot the next Star Wars movie using Harrison Ford's face on somebody else's body without his permission.
posted by obiwanwasabi at 6:36 PM on January 28, 2018 [2 favorites]


Having people be more skeptical of digital artifacts, whether video recordings or stills, is probably—just generally speaking—not necessarily bad. Once the technology exists to tamper with recordings in a convincing way, people need to be aware that the possibility exists, and it seems like it might be more dangerous, in terms of things like election-tampering where there's deep pocketed, state-level-actor interest, for people to not be aware that it's possible.

Trying to prevent revenge porn because it's a shitty thing for people to be doing to each other is well and good, but protecting the public's innocence or lack of skepticism doesn't strike me as a legitimate end, and might do more harm than good. Some level of skepticism is definitely warranted, because there are people who are going to get access to this technology and use it.

Yes, scoundrels—including likely the one currently inhabiting the White House—will take advantage of public skepticism to undermine the legitimacy of embarrassing media; they were always going to do that. But the solution there isn't to keep the public credulous; it's to emphasize that fakes convincing to casual observation are possible, and to explain the analysis necessary to validate them. This is also what will be required in legal contexts. It will be an ongoing problem as techniques both for creating fakes and for detecting them develop.

There are a variety of systems for preventing tampering with digital files; I'm personally a fan of digital timestamping as a first line of defense. It doesn't prove that an artifact hasn't been subject to manipulation, but it does establish that it existed in a particular form at a particular time, which can provide good evidence as to its authenticity. (If you've validated the original DNG of a photo within a few moments of when the pictured event is widely known to have occurred, you limit the amount of tampering you plausibly could have done.) You could probably create a phone app that interfaced with a DSLR camera to transmit hashes of each frame, as it's captured, onto a blockchain-type system, just as a cocktail-napkin idea.
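
To make the napkin sketch slightly more concrete, here's the hash-chaining part in Python; publish() is just a stand-in for whatever external, append-only anchor you trust (a timestamping service, a public ledger, etc.):

    import hashlib

    def publish(digest_hex):
        # Stand-in for anchoring the digest with some third party.
        print(digest_hex)

    def chain_frames(frames):
        # Each digest commits to the current frame AND the previous
        # digest, so editing any frame later breaks every digest after it.
        prev = b""
        for frame in frames:  # frames as raw bytes
            digest = hashlib.sha256(prev + frame).digest()
            publish(digest.hex())
            prev = digest
        return prev.hex()  # the final digest commits to the whole capture

    # chain_frames([b"frame-0", b"frame-1", b"frame-2"])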
posted by Kadin2048 at 9:47 PM on January 28, 2018 [1 favorite]


If a picture or a movie matters - if there's something really important at stake - then there'll be a billion eyes on it and any flaws (not just in the images themselves, but in where and how they were allegedly captured) will be quickly exposed.

Revenge porn REALLY MATTERS to the woman it was created to hurt. And she won't have the help of millions of people to fight against it.
posted by agregoli at 8:01 AM on January 29, 2018 [7 favorites]


codacorolla: "Github could delist the codebase and take proactive steps to banning replacements,"

This is wildly ineffectual overkill. Yes, as is so often the case, sex/pornography is an early adopter of technological advances (see, for example, the VCR or the Polaroid). But this software can also be used for all sorts of non-evil purposes. Banning the software from open-source repositories just hands the market over to Adobe et al., because I can guarantee this ability will be baked into any video editor worth the name in 10 years.
posted by Mitheral at 11:27 AM on January 29, 2018 [1 favorite]


Yes, as is so often the case, sex/pornography is an early adopter of technological advances (see, for example, the VCR or the Polaroid).

You do realize that people are concerned about harassment, not sex, right? People aren't upset about this being used for porn, they're upset about it being used to harm people.

But this software can also be used for all sorts of non-evil purposes.

And how many lives harmed is the price for those purposes? Again, people are worried about the avenues for abuse and harm this system affords. And this seems to be a recurring blind spot I see with a lot of techies - that somehow the ethical concerns of their creations are Someone Else's Problem - which is how we wound up with all the issues around Reddit, Twitter, the gig economy, etc.
posted by NoxAeternum at 12:06 PM on January 29, 2018 [2 favorites]


Github could delist the codebase and take proactive steps to banning replacements.

This is not difficult code to write, or even a difficult concept, given the kinds of libraries that are available today. And it will get even easier--in 6 months, this could almost be an introductory exercise in a deep-learning/ML class.
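
To illustrate how small the core idea is (this is not the actual deepfakes codebase, just a minimal PyTorch sketch of the widely described approach, with arbitrary layer sizes and 64x64 aligned face crops assumed): it's an ordinary autoencoder with one shared encoder and one decoder per identity.

    import torch.nn as nn

    class FaceSwapSketch(nn.Module):
        def __init__(self):
            super().__init__()
            # One encoder learns a representation shared by both identities.
            self.encoder = nn.Sequential(
                nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),
                nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),
            )
            # One decoder per identity; a "swap" is encode-A, decode-as-B.
            self.decoder_a = self._make_decoder()
            self.decoder_b = self._make_decoder()

        def _make_decoder(self):
            return nn.Sequential(
                nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
                nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),
            )

        def forward(self, x, identity):
            z = self.encoder(x)
            return self.decoder_a(z) if identity == "a" else self.decoder_b(z)

    # Training reconstructs A faces through decoder_a and B faces through
    # decoder_b; at inference you feed an A face through decoder_b.

Everything around that (face detection, alignment, blending back into the frame) comes from off-the-shelf libraries too.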

And this is like one of a hundred frightening applications that have suddenly become possible, or even easy, in the past few years.
posted by jjwiseman at 12:27 PM on January 29, 2018 [3 favorites]


There's zero way to ban the software via a top-down approach like that. That's just... it doesn't make much sense. What would be the legal framework for doing it, for starters? The DMCA? It's not like the media companies have exactly had great success controlling the spread of stuff like DeCSS that way. On a practical level, there are lots of code hosts besides GitHub, and lots of them might be willing to host a "banned" project purely to spite the idea of someone trying to ban software. It takes very little to set one up, and then you're playing whack-a-mole with websites a la The Pirate Bay. In addition, lots of people would go out and clone that repo purely because it was banned, or might get banned; you might actually cause more copies of it to exist by trying to eliminate it.

A productive avenue might be to achieve a voluntary shift in how the software is released by working with developers—most complex open source projects have a surprisingly small number of core developers/maintainers—and convincing them to make it a bit harder for casual users to use the product irresponsibly. A concerted influence campaign targeting a small number of key developers strikes me as a tractable plan, in a way that a top-down government- or large-corporation-sponsored ban doesn't.

There are parallels to the security community; most legitimate security researchers won't publish working exploit code for unremediated vulnerabilities, because it's understood that giving "script kiddies" ready-to-run tools for attacking others' systems is unwise. Sometimes the vulnerabilities will be widely discussed in the academic press or on mailing lists as fixes are implemented, such that a subject matter expert could easily create an exploit, but as long as the relatively small number of people who are capable of doing this don't share their work, the mass effect is minimized. You saw a lot of this going on with Meltdown/Spectre last month.

I think you'd have better luck asking developers not to create tools that trivially enable shittiness (and, if necessary, applying social pressure to achieve agreement) rather than trying to tell them they can't.
posted by Kadin2048 at 12:52 PM on January 29, 2018


NoxAeternum: "You do realize that people are concerned about harassment, not sex, right?"

You realize this was a proposal to ban a multipurpose tool, not the product thereof, right?

NoxAeternum: "People aren't upset about this being used for porn, they're upset about it being used to harm people. "

And the harm is because of the porn aspect. If the go-to example on /r/deepfakes was faking an ex's face-plant after tripping over a shoelace, there would be a lot less angst.

NoxAeternum: "And how many lives harmed is the price for those purposes?"

You can ask the same question (and add on child pornography as the boogeyman) about practically everything that came out of the existence of ARPANET and the personal computer: newsgroups, email, photo sharing, FTP, social media, real-time traffic routing, digital photography, Photoshop, Illustrator, desktop publishing, publicly available aerial photography, YouTube, GarageBand, etc., ad nauseam. Asking to suppress these technologies because they can be used for harassment alongside good is wishing to live in an unobtainable ARM-lite future where nothing is permitted to be invented except the most innocuous variations on current technology.
posted by Mitheral at 12:55 PM on January 29, 2018 [2 favorites]


Ah, the joys of arguing with engineer's disease!
posted by codacorolla at 1:15 PM on January 29, 2018 [5 favorites]


And the harm is because of the porn aspect. If the go-to example on /r/deepfakes was faking an ex's face-plant after tripping over a shoelace, there would be a lot less angst.

This is actually an offensive argument to make, because you're implying that the real problem isn't that a video was made that harms someone, but that people need to get over their sexual hangups. Sorry, but no, you don't get to say "that doesn't count" because of how the harm occurs.

You can ask the same question (and add on child pornography as the boogeyman) about practically everything that came out of the existence of ARPANET and the personal computer: newsgroups, email, photo sharing, FTP, social media, real-time traffic routing, digital photography, Photoshop, Illustrator, desktop publishing, publicly available aerial photography, YouTube, GarageBand, etc., ad nauseam.

You're right - you can ask the same question of all those technologies. That's what ethics is about. Harms don't go away just because there are other non-harmful uses of some technology, as much as you'd like to pretend that they don't exist.
posted by NoxAeternum at 1:27 PM on January 29, 2018 [5 favorites]


The fundamental issue here is harassment, not medium or technique, since this is an expansion of media and techniques that have become increasingly used in movie production.

Most revenge-porn prosecution involves imagery or footage taken with equipment and techniques that are ubiquitous. And yet, so far we've resisted the temptation to argue a slippery slope that revenge-porn laws threaten silly cat photos.
posted by GenderNullPointerException at 1:47 PM on January 29, 2018 [3 favorites]


No, we usually can't ban the technology involved outright, but we can be aware of the potential avenues for abuse so that we're not blindsided when those abuses occur.
posted by GenderNullPointerException at 2:03 PM on January 29, 2018 [1 favorite]


so that we're not blindsided when those abuses occur

It's too bad people don't consume more science fiction. As soon as computer-based video editing became a thing in the '80s, there was sci-fi predicting that eventually "stars" would just license their image persona...

Some of these abuses are only being discovered thanks to the rapid nature of connectivity over the internet - while I am sure there was revenge-porn decades ago, it wasn't as easy to spread.

This is the "network effect", and we are still evolving mentally to meet it's challenges - and we are failing.

It's a tough call, slowing technological progress to try to safeguard against potential future abuses - and it will never happen in our market-based economies.
posted by jkaczor at 2:39 PM on January 29, 2018


I'm not advocating for "slowing technological progress," only that we need to be prepared to make the legal argument that the act of compositing two images together in a way that passes for authentic does not magically create a legally protected work of art.
posted by GenderNullPointerException at 2:57 PM on January 29, 2018 [1 favorite]


Revenge porn REALLY MATTERS to the woman it was created to hurt. And she won't have the help of millions of people to fight against it.

Any woman (or a person of any gender, for that matter - I very carefully kept my comment gender neutral) could ask for help on Reddit proving a video was fake, and she'd / they'd get it.

Said woman / person has also been exposed to this risk through photographs for a very long time.
posted by obiwanwasabi at 3:51 PM on January 29, 2018 [1 favorite]


Any woman (or a person of any gender, for that matter - I very carefully kept my comment gender neutral) could ask for help on Reddit proving a video was fake, and she'd / they'd get it.

How fucking dense are you?
posted by codacorolla at 3:54 PM on January 29, 2018 [8 favorites]


So, to prove the porn is fake, they just have to show it to many, many more people. And I have a hard enough time getting people to believe it when Snopes says something didn't happen. How many people who see the video are going to also find the Reddit thread and then believe some guy on the internet who says it's a fake because of the pixels?

I'm sympathetic to the idea of "legitimate uses" because I use DVD ripping software for fair-use purposes, for instance, but "ask for help on Reddit" is not even close to a solution here.
posted by RobotHero at 4:22 PM on January 29, 2018 [2 favorites]


NoxAeternum: "This is actually an offensive argument to make, because you're implying that the real problem isn't that a video was made that harms someone, but that people need to get over their sexual hangups."

I'm not sure I follow but I really wasn't.
posted by Mitheral at 5:00 PM on January 29, 2018


we need to be prepared to make the legal argument that the act of compositing two images together in a way that passes for authentic does not magically create a legally protected work of art.

That seems like a reasonable argument to make, and one I'd imagine a court being willing to accept.

I'd be interested in reading an examination of whether that might fall under, or be made to fall under with minimal modifications, existing forgery laws (or maybe "uttering and publishing" in the US?). It doesn't seem like a stretch, if the images are being published with the expectation that they'll be received as the real article (which would be the whole point, as "revenge porn"); that's not much different than forging a painting, the only difference being the expected misrepresentation of the creator vs. the subject.
posted by Kadin2048 at 7:36 PM on January 29, 2018 [1 favorite]


How fucking dense are you?

Not very, at all, thanks for asking. Was there a particular part of what I wrote that you'd like me to clarify? Because, as I'll explain below, it's a matter of sending a few frames of the video with your face blurred.

Do you legitimately believe this? What evidence do you have? Because given past precedent, this is not what redditors have done, ever.


Every kid who's ever posted 'sauce?' as a comment on an image or a porn video and had a reply linking to that source in thirty seconds flat.

I'm sorry that I'm harping on this, but are you a dude?

Yes, I am. Why is it important? By all means, offer counter arguments, but 'you've got a dick' isn't one, and it'd be like me saying 'Are you a woman? Because you're harping on about this.'

So, like, how do you honestly believe that a woman wanting to prove that she's been deepfaked will get help from Reddit instead of having her video sent to more people?

I think that the vast majority of deepfakes will be of celebrities, and they won't even have to ask for help. People will ask 'is this real?' People who can tell will say yes or no and how they know. People will publish guides about how to tell if a video is real or not.

And you won't need to send anybody the entire video. You send a few frames with your face blurred and people will say 'hey, I recognise that couch, it's from Buttplug Anarchy 97, here's a link'. And then you can send that link to your friends and family and say 'I bloody told you so'.

And you legitimately think people on Reddit will help a victim of revenge porn?

You know there are people on Reddit who don't like revenge porn and absolutely would help somebody who was in that situation, right? It's not all a wretched hive of scum and villainy. It's not even mostly that. Nobody there has ever asked me about how fucking dense I am, for example, or accused me of idiocy because of my gender.

"ask for help on Reddit" is not even close to a solution here.

It's better than "let's ban this, because that worked so well with other bits of code we didn't like, like DeCSS".

To recap: this is new, but so what? It's not nearly as scary as what's already happening right now, and what will continue to happen far more often in the future. You don't need to be worried that somebody is going to get a 100,000-image database of your face they can use to train a convincing deepfake model because they're out to get you. If they did, anybody who really gives a shit will be able to tell it's fake (that's not your couch, that's not your house, those aren't your shoes, that's not your tattoo, you've never had hair like that, that looks suspiciously like this other porn video, and so on). You need to worry that they're going to leak the video you made together, or the pics you sent them, like this scumbag in my home town did today.
posted by obiwanwasabi at 4:04 AM on January 30, 2018


(I mean, for fuck's sake - upload the entire video with your face intact? That was the first and only solution to pop into your mind when I said 'ask for help on Reddit'? And I'm fucking dense?)
posted by obiwanwasabi at 4:22 AM on January 30, 2018


I mean, for fuck's sake - upload the entire video with your face intact? That was the first and only solution to pop into your mind when I said 'ask for help on Reddit'? And I'm fucking dense?

....I didn't think so before, but I'm frankly starting to wonder, actually. Because:

1. How else would a woman going to reddit be able to ask for help, and
2. What makes you think that there wouldn't be a ton of guys pretending to be offering to help and asking "hey, unfortunately I need to see the video to be able to make an assessment, can you DM it to me" just so they can get it from her?
posted by EmpressCallipygos at 4:40 AM on January 30, 2018 [1 favorite]


Mod note: Folks, maybe cool it a bit. Obiwanwasabi, arguing that a) this really isn't a big deal, especially for famous people, and b) even if you feel like it would be a big deal if it happened to you, you could just ask reddit to prove it fake and certainly wouldn't face any bad consequences – plus c) digging in and railing at people pushing back on that suggestion and sneering because they didn't add various conditions you never mentioned... this is really not a good way to interact, and it's time to stop digging in on the "everyone else is stupid" position.
posted by taz (staff) at 4:58 AM on January 30, 2018


hoping that image/video matching works on the rest of the data even though such alterations can affect hashing and play havoc on image recognition due to differences in compression,

I think his implied plan isn't relying on computer recognition, but internet people who've seen a lot of porn.
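
(The compression point in the quote is real, for what it's worth: a byte-level hash breaks the moment a file is re-encoded, which is why matching has to be perceptual, or human. A toy illustration, assuming Pillow and numpy are available, with a crude 8x8 average hash standing in for real perceptual hashing:)

    import io
    import hashlib
    import numpy as np
    from PIL import Image

    def average_hash(img, size=8):
        # Downscale to 8x8 grayscale; one bit per pixel: brighter than mean?
        small = np.asarray(img.convert("L").resize((size, size)), dtype=float)
        return (small > small.mean()).flatten()

    # A toy image with some structure (a diagonal gradient).
    original = Image.fromarray(
        np.add.outer(np.arange(128), np.arange(128)).astype("uint8"))

    buf = io.BytesIO()
    original.save(buf, format="JPEG", quality=60)  # lossy re-encode
    buf.seek(0)
    recompressed = Image.open(buf)

    # Byte-exact hashes almost surely differ after recompression...
    print(hashlib.sha256(original.tobytes()).digest() ==
          hashlib.sha256(recompressed.tobytes()).digest())
    # ...while the perceptual hash barely moves (differing bits out of 64):
    print(int((average_hash(original) != average_hash(recompressed)).sum()))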

My first interpretation had been that he was implying you would rely on people who can tell "by the pixels," having seen quite a few face-swaps in their time. That would have required posting a version with the face in question unblurred.

And then you can send that link to your friends and family and say 'I bloody told you so'.

Though that's kind of assuming everyone who saw it would speak to you directly about it, rather than whisper behind your back.
posted by RobotHero at 8:01 AM on January 30, 2018 [1 favorite]


It's right that you wouldn't need to share a faked video with your face in it to locate the original, but wrong to suppose having that evidence in your possession would be an effective remedy for very many of the harms done by the distribution of the fake.
posted by straight at 8:17 AM on January 30, 2018 [1 favorite]


So, intriguingly, there is a thread on the Reddit /r/deepfakes sub discussing whether putting a "fake" watermark on fake porn and other creations ought to be de rigueur. The response is not as hostile to the idea as I imagined it would be.

Not a solution or anything, but it's not like the people doing this stuff are as totally tone-deaf or unaware of the implications as I'd thought they'd be.
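
(Mechanically, the stamping part is trivial, which is also exactly why a watermark norm only binds the people who choose to follow it. A Pillow sketch, with arbitrary placement and color:)

    from PIL import Image, ImageDraw

    def stamp_fake(frame):
        # Composite a semi-transparent "FAKE" label over an RGBA copy.
        out = frame.convert("RGBA")
        overlay = Image.new("RGBA", out.size, (0, 0, 0, 0))
        draw = ImageDraw.Draw(overlay)
        w, h = out.size
        # Default bitmap font; a serious watermark would scale with the
        # frame and tile across it so it can't just be cropped out.
        draw.text((w // 20, h // 20), "FAKE", fill=(255, 0, 0, 160))
        return Image.alpha_composite(out, overlay)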
posted by Kadin2048 at 7:33 PM on January 30, 2018


The state of the art in this technology is on display in I, Tonya, the new movie.

This took a VFX team a few months to do "really well," well enough that a moviegoing audience can enjoy it on a cinema screen. If your standards are lower, it's proportionally easier.
posted by theorique at 3:31 AM on January 31, 2018 [1 favorite]


It's also worth pointing out that they're not sticking to real-life women when putting faces into porn, if you're wondering how deep the sexism goes here. (Link potentially NSFW.)
posted by NoxAeternum at 7:36 AM on January 31, 2018


Well, following the discussion of revenge porn, or of how Scarlett Johansson feels about someone using her likeness in a porno, using video game characters feels downright wholesome in comparison.
posted by RobotHero at 8:24 AM on January 31, 2018


It's also worth pointing out that they're not sticking to real-life women when putting faces into porn, if you're wondering how deep the sexism goes here. (Link potentially NSFW.)

I may just not be seeing all the angles here, but why would involving depictions of fictional women be any more deeply sexist? In particular, the phenomenon of fully computer-generated animated porn mentioned in the article (the existence of which makes this particular application of the deepfake stuff all the weirder) would seem to me distinctly less sexist than live-action porn, strictly in terms of the medium: to whatever degree the production of porn is exploitative of people whose likenesses are used, or of people who must have sex on camera to make it, no real person is exploited in the course of creating animated porn when the characters are fictional. (Except perhaps for voice actors?)

It's certainly possible for the themes and subtext (or explicit text) in a particular work of pornography to be more or less sexist, but that aspect seems independent of whether it's live-action, deepfake-modified live-action, or animated.
posted by XMLicious at 3:16 PM on January 31, 2018


Anyone looking for a more SFW example of a "deepfake" might be interested in this article that's been making the rounds the past few days: Family fun with deepfakes. Or how I got my wife onto the Tonight Show

At least at the resolution of the embedded GIFs, it's pretty impressive. It's not 100%—there's still a bit of an uncanny valley thing going on when you look at the clips for too long—but it's pretty good for a tool that just about anyone can use. (Which implies that people with more resources have probably had this ability for a while, and probably have better abilities currently.)
posted by Kadin2048 at 10:01 AM on February 3, 2018


Tom Scott has a good video discussing the issue and the ethics behind it.
posted by NoxAeternum at 8:49 AM on February 5, 2018


Our Hackable Political Future - "Imagine the day when operatives can create fake video of their enemies. That day is here."

Fake news: you ain't seen nothing yet - "Generating convincing audio and video of fake events."
One form of verification is to demand that recordings come with their metadata, which show when, where and how they were captured. Knowing such things makes it possible to eliminate a photograph as a fake on the basis, for example, of a mismatch with known local conditions at the time.
Researchers have figured out how to fake news video with AI - "But as the doors for new forms of fake media continue to fling open, it will ultimately be left to consumers to tread carefully."

The dangerous new technology that will make us question our basic idea of reality - "Once viewers learn that videos can be computer generated, this technology could essentially undermine the credibility of all audio and video recordings. This may lead viewers to conclude that anything they read, hear, or see online could be fake. Ultimately, this lack of trust in facts could further erode democracy."
posted by kliuless at 6:11 AM on February 6, 2018 [3 favorites]


I don't understand all these "the sky is falling" takes on fake video technology. As long as there has been media, there have been fakes. Convincing fakes in photography have been around as long as photography. In fact, I'd argue that fake text-only stories are at least as pernicious as anything you can or could possibly do with fake video or audio.
posted by runcibleshaw at 7:46 AM on February 6, 2018 [1 favorite]


The /r/deepfakes subreddit has been banned. Reddit has also clarified their policy about involuntary pornography, apparently to cover this case. They explicitly disallow "depictions that have been faked."
posted by Nelson at 10:36 AM on February 7, 2018 [4 favorites]


Gfycat and PornHub have also classified deepfakes as non-consensual and are removing known ones from their services.
posted by NoxAeternum at 1:32 PM on February 7, 2018 [2 favorites]


I'm interested to see all the free speech defenders rush in to comment! Keep those hot takes coming, fellas!
posted by codacorolla at 7:02 PM on February 7, 2018 [4 favorites]


I suppose one distinction to think about is if someone starts a new subreddit and explicitly keeps the porn out, sticking just to the technical discussion of how one would make one, but without sharing the results. The Reddit policy as it stands would permit this.
posted by RobotHero at 8:12 PM on February 7, 2018


Even that article's framing is problematic. Here's the second paragraph of the piece:
The morality and the legality of deepfakes are murky issues. Swapping one person’s face onto another’s body is not inherently malicious, but the practice isn’t just used for creating digital stunt doubles. Most deepfakes are pornographic in nature, with users replacing the faces of porn stars with their favorite celebrities. But the tech also enables potential abuse for anyone who puts their face online, who could then end up appearing to star in porn against their will
Perhaps the legality is murky (but I'd bet it's less murky than people would think), but the morality of transposing someone's face onto another person's body without any consent whatsoever is pretty fucking clear from where I stand - it's pretty immoral. Hell, this became such a major issue in Hollywood when it was done with prosthetics that the industry created an entire rule - the Crispin Glover rule - specifically to deal with it.
posted by NoxAeternum at 11:51 AM on February 9, 2018 [1 favorite]


Perhaps the legality is murky (but I'd bet it's less murky than people would think), but the morality of transposing someone's face onto another person's body without any consent whatsoever is pretty fucking clear from where I stand - it's pretty immoral.

Chorus of dudes: "Maybe in some thought experiment world, it's 'immoral'. But, muh boners are in the real world and they demand attention."
posted by theorique at 12:05 PM on February 9, 2018


I made the above post before reading the Verge article, but my mocking, ironic, made-up quote almost literally parallels the last line of the article.
"As one user wrote in a now-deleted post, “I have some philosophical qualms with [deepfakes], but it doesn’t stop me from jerking it.” "
Pandora's box is open.
posted by theorique at 12:09 PM on February 9, 2018




Episode of BBC's Click on deepfakes, and also Andy Serkis's performances. (First broadcast yesterday; that's a geo-locked link but it may show up on their Youtube account.)
posted by XMLicious at 5:44 AM on February 25, 2018

