No, It's The Users Who Are Wrong
January 10, 2020 2:05 PM

Several months after announcing that they would allow political candidates to openly lie in ads, Facebook has responded to criticism of that decision with another announcement - that they will allow users to opt out of seeing political ads if they choose. (SLArs Technica)

Response to the announcement has been negative, with both Elizabeth Warren's and Joe Biden's campaigns denouncing the move.
posted by NoxAeternum (78 comments total) 17 users marked this as a favorite
 
I already more or less opted out of political ads on Facebook, by labeling as "offensive" any one I got served. The algorithm appears to have learned -- I almost never get served them anymore.
posted by jscalzi at 2:08 PM on January 10, 2020 [25 favorites]


The kind of people who will choose to opt out of political ads on Facebook were never the target audience for those ads in the first place.
posted by Two unicycles and some duct tape at 2:14 PM on January 10, 2020 [121 favorites]


Again, this is yet another "deal with toxic waste by closing the blinds" move by Silicon Valley, and it fails for the same reason - it doesn't actually solve the problem. You opting out of receiving disinformation doesn't make it disappear, doesn't make its harm go away - all it does is just let you avert your eyes from it.
posted by NoxAeternum at 2:27 PM on January 10, 2020 [24 favorites]


Thank Christ.
posted by Melismata at 2:32 PM on January 10, 2020


“Lie to me / I promise I’ll believe”
posted by panglos at 2:34 PM on January 10, 2020 [1 favorite]


all advertising is political
posted by 20 year lurk at 2:42 PM on January 10, 2020 [11 favorites]


Mod note: A few comments removed. Let's accept that some people use Facebook, some don't, some would rather not but are obliged to by circumstances, and every other variation under there, and just skip the whole "Well Just Don't Use It!" vs. "It's Complicated!" argument that's happened a ton of times before and move on to discussing literally anything else about the links.
posted by cortex (staff) at 2:52 PM on January 10, 2020 [64 favorites]


The people that need to opt out of the ads (ie your grandfather) will never figure out how to do it. Facebook is counting on this.
posted by COD at 2:55 PM on January 10, 2020 [28 favorites]


Facebook takes ad revenue and sabotages democracies around the world. Their effect on public discourse is well-documented by journalists. They don't need to be allowed to make token gestures that are nothing but self-serving PR. They need to be regulated as any other media company, or dismantled in total and sold for scrap.
posted by They sucked his brains out! at 2:57 PM on January 10, 2020 [41 favorites]


Exactly, COD. Most of the people who are savvy enough to opt out (or to even know it's an option) are probably savvy enough to detect BS and lies in political ads (which isn't to say they're immune from influence, but they're less likely to be swayed).
posted by asnider at 2:58 PM on January 10, 2020 [4 favorites]


Scammers allow people who detect misspellings and bad grammar to opt out of their scam emails, news at 11.

Facebook is just boosting the ROI for false advertisement so they can make even more money per sucker on this.
posted by tclark at 3:02 PM on January 10, 2020 [15 favorites]


If you want to read more reasons why people don't leave Facebook, here's just one such prior thread on Facebook Politics.

Back to this thread: I particularly like this comment quoted in the Ars Technica article:
Federal Election Commissioner Ellen Weintraub had particularly harsh words [tweet] for the company. "Facebook's weak plan suggests the company has no idea how seriously it is hurting democracy," she wrote on Twitter. "Here, proposing 'transparency' solutions is window-dressing when Facebook needs to be putting out the housefire it has lit."
And from the end of the article:
"Make no mistake, this has nothing to do with transparency and choice," said [tweet] Rep. David Cicilline (D-RI)—chairman of the House Antitrust Subcommittee, which is probing Facebook's business practices [Ars Technica]. "This is about money. Specifically, the $6 billion that will be spent on political ads in 2020 that Facebook will use to continue increasing their profits at the expense of our democracy."

Facebook did gain support from one powerful quarter for its plan, however. A spokesman for the Trump campaign applauded the maneuver, saying, "Our ads are always accurate, so it's good that Facebook won't limit political messages."
Also, if you have to opt out, that means you don't want to see the ads. Which means people in the Trump Bubble who want to see lying ads that support their distorted worldview can keep getting fed lies. Death to democracy, indeed.

Some more details:

Facebook Doubles Down on Refusal to Block Political Ads – Even Fake Ones (Courthouse News)
Doubling down on the policy in a blog post, Facebook executive Rob Leathern said the company considered going the way of limiting targeted ads like Google, or banning them altogether like Twitter, but found them too valuable a tool for political groups and others.

Leathern said Facebook’s data found that 85% of spending by U.S. presidential candidates on Facebook went to targeted ad campaigns.
...
Facebook also unveiled updates to its Ad Library search options and ad preferences for users. One new feature will allow people to see fewer political and social issue ads on Facebook and Instagram, which Facebook called a “common request we hear from people.”
...
While Leathern said Facebook doesn’t believe private companies should be calling the shots on political ads, in the absence of government regulation companies have had to craft their own policies.
Government regulation, you say? Seems like Facebook isn't too fast to move when governments tell them to (but they'll get around to it): Under Irish law, foreign citizens and groups are not allowed to make donations to Irish campaign groups. However, foreigners were until Tuesday [May 8, 2018?] able to purchase Facebook ads directly targeting Irish voters. (CNN, May 8, 2018)

Facebook finally blocked UK and US ads, and then the Irish law passed (Wikipedia) ... by a landslide (The Guardian).
posted by filthy light thief at 3:06 PM on January 10, 2020 [14 favorites]


First step is to repeal the Section 230 exemption to the Communications Decency Act that gives Facebook and Google absolute immunity to liability for anything they publish, no matter how vile, deceitful, dishonest or libelous.

Facebook and Google should face the same risk of liability as newspaper, magazine or TV publishers. They claim they can't police their postings but I guarantee you that if they could be sued, they would figure out a way.
posted by JackFlash at 3:09 PM on January 10, 2020 [16 favorites]


First step is to repeal the Section 230 exemption to the Communications Decency Act that gives Facebook and Google absolute immunity to liability for anything they publish, no matter how vile, deceitful, dishonest or libelous.


This would annihilate the internet as you know it, leaving only huge corporations to survive.

This would destroy any website that allows user-generated content or comments, including Metafilter, by opening them up to crushing liability for anything a user posts. Google and Facebook might be large enough to survive, because unlimited resources means unlimited lawyers, but your other favorite websites will be killed the day this passes.
posted by Hollywood Upstairs Medical College at 3:21 PM on January 10, 2020 [41 favorites]


Can I also opt out of seeing political ads on TV?
posted by Ampersand692 at 3:26 PM on January 10, 2020 [3 favorites]


The problem isn't that Section 230 protects FB and other sites from responsibility for submitted content. The problem lies with the content that FB promotes, either because they are paid to (like political advertising) or because their algorithms otherwise deem it worth promoting. If nothing else changed except that FB was actually held accountable for the content that it specifically promoted, as paid advertising or otherwise, the issue would largely be solved.
posted by Two unicycles and some duct tape at 3:30 PM on January 10, 2020 [11 favorites]


This doesn't seem like the ideal place for a substantive discussion of section 230, but there's plenty of daylight between the wildly extended version of it that tech lawyers have successfully foisted on the courts and a hypothetical regime of strict liability for UGC. Also section 230 is law in less than 1% of the world's countries yet many of the others seem to be getting on OK.
posted by Not A Thing at 3:34 PM on January 10, 2020 [5 favorites]


I mean, how could Facebook tell politicians not to lie on Facebook when Facebook is partnering with "legacy media" outlets to lie about Facebook?

Article or ad? Teen Vogue removes glowing Facebook story without explanation.
Facebook initially denied it had paid for the post, calling it “purely editorial.” But in a statement to The Washington Post, the company elaborated, saying Facebook had “a paid partnership” with the magazine for its 2019 Teen Vogue Summit in Los Angeles and the arrangement included sponsored content.

“Our team understood this story was purely editorial, but there was a misunderstanding,” the Facebook statement said.

In a statement to The Post, Condé Nast, which owns Teen Vogue, apologized for the maelstrom but did not elaborate or answer specific questions about the article’s inception and whether it had been a latent advertisement.
Teen Vogue Sponcon Fiasco Puts Spotlight on Facebook Stooges
The five people Facebook chose to highlight in the Teen Vogue ad are all women—one Republican operative, one who spent her career among Clinton Democrats, one who worked at McKinsey, a former special-education teacher, and a data scientist with a PhD in sociology. Here’s what we know about them.
posted by tonycpsu at 3:41 PM on January 10, 2020 [4 favorites]


This would destroy any website that allows user-generated content or comments, including Metafilter, by opening them up to crushing liability for anything a user posts.

This attempt to defend Section 230 as it exists today via scare tactic is tiresome. As was pointed out above, there is a wide variance between the blanket indemnification that the tech industry has engineered and complete removal, and as has been pointed out in other threads, what people want to see is the blanket pulled back so that, for example, Facebook can't argue that it's not at fault when it provides ad buyers the tools to engage in discrimination (which it has done under Section 230 as it stands.) Nobody is looking to make Metafilter liable for the comments of its users just by allowing them to post, and it is the height of bad faith argumentation to claim that we are.
posted by NoxAeternum at 4:02 PM on January 10, 2020 [5 favorites]


First step is to repeal the Section 230 exemption to the Communications Decency Act that gives Facebook and Google absolute immunity to liability for anything they publish, no matter how vile, deceitful, dishonest or libelous.

A complete repeal wouldn't be so great; there's a reason why this is the only part of the CDA that survived Ye Olde Ribbon Banner Campaign. You don't want to swing all the way in the other direction (e.g. making Metafilter liable in tort for defamatory posts by users) — but it is in pretty dire need of reform.
posted by snuffleupagus at 4:15 PM on January 10, 2020 [1 favorite]


“We made a shit-ton of money off the last election, and we gotta pay all these obscene Silicon Valley rents somehow.” - Facebook
posted by egypturnash at 4:22 PM on January 10, 2020 [2 favorites]


I think it's a mistake to look at this decision as primarily profit or greed motivated. In one of the previous MeFi discussions about Facebook and political ads, I spitballed the amount of revenue Facebook made from political ads in the last couple of campaigns. It's well under 1% of total ad revenue.

I tend to agree with Roose's editorial that we should believe Zuckerberg when he says “political ads are an important part of voice.” I think he buys fully into the view that Facebook is an important part of American democracy and that political ads are part of political speech. There's also a healthy dose of Facebook being afraid to have to actually fact check ads; it's an impossible task.

But I also think the idea that Facebook is central to American political discourse is horseshit. Or maybe something worse. Facebook is an insidious and dangerous discussion channel. To the extent it is part of our discourse it is a virulently harmful one. Triply so with political ads, which are carefully designed and optimized to most influence people regardless of rational thought or discussion. You think grandpa forwarding gun nut memes is bad? Marketing experts carefully crafting specific gun nut ads for grandpa to forward is way, way worse.
posted by Nelson at 4:46 PM on January 10, 2020 [6 favorites]


It's well under 1% of total ad revenue.

Per Mark Zuckerberg in an earnings call, "We estimate that these ads from politicians will be less than 0.5% of our revenue next year [in 2020]."
posted by saeculorum at 5:01 PM on January 10, 2020 [4 favorites]


I've heard of dictators in power banning the opposition from campaigning, but this is the first time I've heard of the opposition trying to ban the incumbent from campaigning.
posted by save alive nothing that breatheth at 5:07 PM on January 10, 2020


I tend to agree with Roose's editorial that we should believe Zuckerberg when he says “political ads are an important part of voice.” I think he buys fully into the view that Facebook is an important part of American democracy and that political ads are part of political speech.

No, I don't think that's it at all. Instead, I'd argue it's two things:

* First, we have the Silicon Valley culture built around free speech "absolutism", which as discussed before tends to be incoherent and open to abuse. Zuckerberg and the rest of Facebook leadership live in a culture that praises abuse done through words and condemns protecting the weak by not allowing such abuse - and boy does it ever fucking show.

* Second, we have their willingness to be handmaidens to racism, white supremacy, and fascism (if not all in on the above.) We have Andrew Bosworth's comment about Trump winning because "he ran the best digital ad campaign" while covering up exactly how said campaign was run (and how Facebook gave them those particular tools.) He also made this statement (noted in this Atlantic piece):
Bosworth extended his logic to the rest of the platform too, arguing against “limiting the reach of publications who have earned their audience, as distasteful as their content may be to me.”
posted by NoxAeternum at 5:13 PM on January 10, 2020 [2 favorites]


Facebook is now a platform of political disinformation and propaganda where the good ads can't balance the bad ones because they all become suspect in the mix. It's not a spectrum of free speech, but a spectrum of confusion, no different than letting crowds shout down someone at a debate.
posted by Brian B. at 5:48 PM on January 10, 2020 [3 favorites]


It's not even a discussion about free speech, in that Facebook regularly removes any criticism of it on its platform, for instance. Free speech doesn't even play any part in this, at all, except when execs need to rationalize making ad revenue off of shady third parties.
posted by They sucked his brains out! at 5:53 PM on January 10, 2020 [3 favorites]


"Our ads are always accurate, so it's good that Facebook won't limit political messages."

"We oppose a crackdown on cheaters because we never cheat."

Seems legit.
posted by straight at 6:08 PM on January 10, 2020 [5 favorites]


> The people that need to opt out of the ads (ie your grandfather) will never figure out how to do it. Facebook is counting on this

My grandparents have been dead for decades. No need to get ageist about this. I know plenty of gullible young people.
posted by The corpse in the library at 6:12 PM on January 10, 2020 [16 favorites]


This would destroy any website that allows user-generated content or comments, including Metafilter, by opening them up to crushing liability for anything a user posts.

That's pretty nonsensical. Metafilter is already heavily moderated, and every user is also able to flag suspect material. There's nothing there that is particularly libelous. Opinions and satire are protected.

And what would they sue for? I don't think there is a secret cortex mcduck basement full of gold coins. Virtually 100% of metafilter income goes into moderator salaries. I suppose theoretically someone could sue for possession of the metafilter URL, but you could re-open the next day with NewMetafilter.com. Nobody is going to sue metafilter.

On the other hand, Facebook netted over $25 billion last year. You would think they could afford a few moderators, if they only had a reason to hire them. They don't, thanks to immunity.
posted by JackFlash at 6:31 PM on January 10, 2020 [6 favorites]


I tag military recruitment ads as "promoting violence" and "hate speech" when I get them.
posted by symbioid at 6:38 PM on January 10, 2020 [7 favorites]


This is why adblockers should come preinstalled on browsers.
posted by signal at 6:41 PM on January 10, 2020


And what would they sue for?

Retribution. Or, Silence. This is why SLAPP laws exist (though they can be abused too).

EFF on CDA § 230.
posted by snuffleupagus at 6:42 PM on January 10, 2020 [2 favorites]


Retribution. Or, Silence.

How does that work? If the corporation has no money what can they sue for? Metafilter is a corporation. You can't sue the owners.
posted by JackFlash at 6:47 PM on January 10, 2020 [1 favorite]


> that means you don't want to see the ads. Which means people in the Trump Bubble who want to see lying ads that support their distorted worldview can keep getting fed lies.

This is only beneficial to the advertiser (and, by extension, the one selling the advertising). The whole point of this exercise is very, very targeted advertising and giving a nice way to get rid of anyone who doesn't want to see it is great (for the advertiser) in the sense that those views that they are paying for will no longer be "wasted" but will be available to go towards other, new potential marks.

So providing a way for people to opt out is not a way to "fix" the problem but in fact a way to "fix" the advertising platform for political ads so that it will be even more effective and efficient--at spreading lies and misinformation, and generating $$$--than before.

Regardless of whether this type of advertising is 0.00005% or 0.5% or 5% or 50% of their revenue, it's really telling that the only type of solution they can come up with is one that is completely self-serving.
posted by flug at 6:50 PM on January 10, 2020 [4 favorites]


How does that work? If the corporation has no money what can they sue for? Metafilter is a corporation. You can't sue the owners.

You sue to put it out of business, practically speaking — potentially purely through the cost of litigation without caring if you recover on any judgment you may or may not ever obtain. Corporations can't appear in court without counsel, so it starts adding up quickly. If they do default, you're in an even better position to put them out of business.

And you can try to sue the owners under an alter-ego theory, which requires them to be defended as well.
posted by snuffleupagus at 6:51 PM on January 10, 2020 [5 favorites]


It's well under 1% of total ad revenue.

Per Mark Zuckerberg in an earnings call, "We estimate that these ads from politicians will be less than 0.5% of our revenue next year [in 2020]."
posted by saeculorum at 5:01 PM on January 10 [1 favorite]


What's the revenue impact of getting the type of tax-and-regulation-averse conservatives that would lie elected?
posted by kzin602 at 6:51 PM on January 10, 2020 [3 favorites]


This is why adblockers should come preinstalled on browsers.

Is there a good ad blocker for Firefox that blocks just Facebook ads? I use uBlock Origin to block most ads. But Facebook is super aggressive and sneaky about how it embeds its ads and uBlock Origin seems to mostly fail at blocking them.

I see there's a bunch of Facebook-ad-specific blockers. Is there one that works and is trustworthy?
posted by Nelson at 6:52 PM on January 10, 2020
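

(For what it's worth, the usual workaround people reach for is a custom cosmetic filter rather than a dedicated extension. Below is a minimal sketch of the kind of rule you can paste into uBlock Origin's "My filters" pane; the ## element-hiding and :has-text() procedural operators are real uBlock Origin filter syntax, but the selectors themselves are assumptions about Facebook's feed markup and will break whenever Facebook shuffles it, which it reportedly does on purpose, even scrambling the "Sponsored" label across elements to defeat exactly this kind of matching.)

    ! Hypothetical entries for uBlock Origin's "My filters" pane.
    ! The ## element-hiding and :has-text() procedural syntax are real
    ! uBlock Origin features; the selectors below are guesses about
    ! Facebook's feed markup and may not match the current site.
    facebook.com##div[role="feed"] div[role="article"]:has-text(Sponsored)
    facebook.com##div[data-pagelet^="FeedUnit"]:has-text(Sponsored)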


You sue to put it out of business, practically speaking

You walk away and the next day you open with NewMetafilter.com.

These fantasies of being sued out of business are highly overrated.
posted by JackFlash at 6:58 PM on January 10, 2020


SESTA/FOSTA repealing liability immunity for third party posts regarding prostitution, ostensibly to fight sex trafficking on backpage.com even when existing laws took down backpage.com, already has had a chilling effect on internet discourse about sex work, and seems to be responsible for greater risk for sex workers of all types.

Disadvantaged groups would likely face real harm from a removal of the immunity. Companies would cover their ass, and silencing voices advocating for disadvantaged groups is almost always easier and cheaper than effectively dealing with abuse and discrimination.

I wonder about state attorneys general possibly suing the purchaser/creator of the false political ad directly though. They have no immunity under 230 for ads they create as I understand things.
posted by gryftir at 6:59 PM on January 10, 2020 [7 favorites]


You walk away and the next day you open with NewMetafilter.com.

These fantasies of being sued out of business are highly overrated.


Sure, lemme just pull up NewGawker.

It's interesting how people want to pretend chilling effects or weaponized litigation don't exist only as pertains to companies they'd also rather not exist, anyway.

Dealing with this as false advertising makes a lot more sense to me than doing away with the safe harbor for users' speech.
posted by snuffleupagus at 7:05 PM on January 10, 2020 [10 favorites]


Right? Sure, 100s of thousands of users will just pick up and move to newmetafilter.com no questions asked. Can you hear my eyes rolling from there?
posted by RustyBrooks at 7:13 PM on January 10, 2020 [2 favorites]


Gawker was sued for violation of privacy -- under existing law, by the way -- after stubbornly refusing a take down notice.
posted by JackFlash at 7:13 PM on January 10, 2020


That is really not the point. You said getting sued into the ground doesn't happen. What will the existing law be if 230 is done away with?

I've said my bit, you're on record as saying people should just be sued, lose everything if they can't afford to defend themselves and then start new websites no one will ever visit from their tents or whatever.

We'll have to agree to disagree.
posted by snuffleupagus at 7:17 PM on January 10, 2020 [1 favorite]


The lesson of gawker is don't publish pilfered sex tapes, and when politely asked to take them down, you should comply, and when a circuit court judge further tells you to take it down, don't tell the judge to just fuck off.

The gawker case has nothing to do with Facebook and liability for libel.
posted by JackFlash at 7:24 PM on January 10, 2020 [1 favorite]


It's CDA § 230 that makes them different. But, hey, go ahead and give people like Donald Trump a heckler's veto over the Internet, I'm sure you'll love the results. That'll be way better than putting some sensible limits on the safe harbor.
posted by snuffleupagus at 7:34 PM on January 10, 2020


The New York Times and the Washington Post and thousands of other publications, large and small, seem to survive quite well without a Section 230 exemption. I'm not sure why Facebook needs one.
posted by JackFlash at 7:38 PM on January 10, 2020 [1 favorite]


And there you have it. The problem is Facebook, and we should be careful about nuking 230 just to deal with Facebook. Bad facts make bad law.

Let me reiterate, 230 needs reform. It has perverse incentives, such as the worry that if you moderate too much you might become responsible for the user content. It was always clumsy, and now it's downright creaky and harmful, like the way outdated standards in the Stored Communications Act threaten email privacy.

But the way to protect email privacy would not be to just throw away the whole idea of it, and the same thing with the safe harbor and the expression we do value and would not want to see squashed or even threatened by anyone with a deep enough pocket to maintain a strategic lawsuit.
posted by snuffleupagus at 7:44 PM on January 10, 2020 [3 favorites]


I probably should have said more explicitly, the difference is that NYT and other newspapers are publishing their own reporting or editorial, and are responsible for it on that basis, and they also have other Constitutional protections for traditional press journalism.

On the other hand, 230 is about user-contributed content and the liability of its website host. It's why websites — whether large or small, loved or hated — can provide platforms for user content without the same liability for what users say that a newspaper has for what its reporters say. (The NYT does benefit from 230 in some ways on its own website, such as not being responsible for what users say in its comments section.)

Maybe here we'd mostly all rather Facebook just go away completely at this point — but to the extent social networking is going to exist, 230 protects the existence of those spaces as it does internet message boards, blog hosts and other platforms. And it doesn't just protect them for the companies running them, it protects them for their users.

So, I'd prefer to see it reworked and improved to recognize that Facebook is not as simple, transparent or neutral a content host as the kinds of websites 230 was written to cover, but rather is an algorithmically driven curator of user content that it blends into its direct marketing platform in its delivery; and so weaken the safe harbor as it applies to FB or companies that act like FB on that basis.

FB is not just a message board that also hosts ads; we can keep the good parts of 230 without letting FB continue to maintain that fiction.
posted by snuffleupagus at 8:40 PM on January 10, 2020 [5 favorites]


These fantasies of being sued out of business are highly overrated.

That's one of the most ill-informed comments I've read on Metafilter in a long time.
posted by Nelson at 8:51 PM on January 10, 2020 [3 favorites]


I wonder about state attorneys general possibly suing the purchaser/creator of the false political ad directly though. They have no immunity under 230 for ads they create as I understand things.


One thing we should ask ourselves before encouraging more power for state AGs is how the least scrupulous, deranged right-winger would use such power. They do get elected, after all.

Setting a precedent that they can sue opposing politicians over the truthiness of what they said opens up a world of dangerous possibilities.
posted by Hollywood Upstairs Medical College at 11:18 PM on January 10, 2020 [2 favorites]


It's so frustrating, even journalists I generally admire, such as Molly Wood of Marketplace, parrot back Facebook talking points. I was listening to Marketplace Thursday and she talked about how it wasn't Facebook's place to filter or judge content, but they ALREADY do that. You post something legal and true that contains a 'female presenting nipple' and it gets taken down! Content related to sex work is already highly censored! They do this work already, there's already a filter in place and it's totally voluntary! Now I get some people might say there's a difference, but my point is that they've already ceded the 'refuse to filter content' plank, as they do it voluntarily.
posted by Carillon at 12:31 AM on January 11, 2020 [5 favorites]


On the other hand, 230 is about user-contributed content and the liability of its website host.

No, it isn't anymore. It has become blanket indemnity as long as the host can create some tenuous connection to end users. The result is that we get things like Facebook arguing that they have no legal liability for providing rental property advertisers the tools to create discriminatory ads; legally protected revenge porn websites; and other ludicrous manipulations of the law.

Blanket indemnity is bad policy. It's time for the blanket to be rolled back.
posted by NoxAeternum at 1:05 AM on January 11, 2020


I've heard of dictators in power banning the opposition from campaigning, but this is the first time I've heard of the opposition trying to ban the incumbent from campaigning.

I don't think I understand what you mean by that, can you help me out?
posted by PMdixon at 6:12 AM on January 11, 2020 [2 favorites]


the good parts of 230

What are "the good parts," as you see them, and why do you think transmission of data across networked computers requires liability carve outs that other communications media don't?
posted by PMdixon at 6:25 AM on January 11, 2020


I’ll note that Washington state has a pretty robust public disclosure law which requires ALL political advertising to include which political committee paid for it and their top 5 donors. The state attorney general sued some internet companies for not complying. Facebook, instead of complying and just displaying the information, chose to stop taking political ads for WA candidates or campaigns. Though they aren’t very good about refusing them so plenty of local campaigns still buy Facebook ads. Anyway the WA state system is pretty good and if we had that nationally it would be a vast improvement.
posted by R343L at 7:08 AM on January 11, 2020 [2 favorites]


Facebook's Ad Library is pretty good disclosure. I don't know the niceties of Washington law, but Facebook built a whole product for journalists to look at ad spending on Facebook. Who's buying ads, what the ads are, details of who's shown the ad by gender, location, age, and so on. I don't think this solves all the harm political advertising on Facebook can do, but it is a remarkably comprehensive disclosure and I think of some use. There were complaints at launch that it didn't work very well; I don't know if those problems have been fixed or not.

(random example of a thing you can learn; Tulsi Gabbard and Tammy Duckworth have been running lots of ads about Iran these past few days. As best as I can tell, the Trump people have run zero. I guess they don't see assassinating Suleimani as a thing that'll promote his campaign.)
posted by Nelson at 7:40 AM on January 11, 2020
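

(The Ad Library also has a public API behind it for anyone who wants the raw data rather than the search page. Here is a rough sketch of what a query can look like; the endpoint, parameter, and field names are assumptions based on memory of Facebook's Ad Library API documentation from around this time (ads_archive, ad_type, demographic_distribution, and so on) and may have changed, so check the current docs before relying on them. You also need an Ad Library access token, which Facebook only issues after an identity-verification process.)

    # Rough sketch of querying Facebook's Ad Library API for political ads.
    # Endpoint, parameter, and field names are assumptions based on the
    # Ad Library API docs circa 2019-2020 and may have changed since.
    import requests

    ACCESS_TOKEN = "YOUR_AD_LIBRARY_TOKEN"  # placeholder; requires Facebook approval
    API_URL = "https://graph.facebook.com/v5.0/ads_archive"

    params = {
        "access_token": ACCESS_TOKEN,
        "search_terms": "Iran",                      # e.g. the ads mentioned above
        "ad_type": "POLITICAL_AND_ISSUE_ADS",
        "ad_reached_countries": "['US']",
        "fields": ",".join([
            "page_name",                 # who is buying the ad
            "ad_creative_body",          # what the ad says
            "spend",                     # bucketed spend range
            "impressions",               # bucketed impression range
            "demographic_distribution",  # who was shown it, by age and gender
            "region_distribution",       # who was shown it, by state/region
        ]),
        "limit": 25,
    }

    resp = requests.get(API_URL, params=params)
    resp.raise_for_status()
    for ad in resp.json().get("data", []):
        print(ad.get("page_name"), "-", (ad.get("ad_creative_body") or "")[:80])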


why do you think transmission of data across networked computers requires liability carve outs that other communications media don't?

That is a really bizarre way to re-frame the question. CDA § 230 is not just some FCC regulation.
posted by snuffleupagus at 7:48 AM on January 11, 2020


I see there's a bunch of Facebook-ad-specific blockers. Is there one that works and is trustworthy?

Like most one-person-show extensions, it might someday stop being updated or turn terrible, but I've had good experiences with F.B. Purity.
posted by box at 7:51 AM on January 11, 2020 [1 favorite]


other communications media

If you really want to lean on this comparison, I don't think you'd want your phone company to be legally responsible for what you say on the phone, and therefore be compelled to monitor and police the content of your phone calls to make sure they've discharged their duty of care in providing you access to the network. What's that you say? A website isn't a phone call? Well, it isn't a newspaper either. It isn't a TV station. Or a book publisher.

Nor do we regulate newspapers like TV stations, or TV stations like book publishers just because they are all "communications media."

The digital commons we all value (at least I thought we did...) requires websites to have some protection from liability for user conduct as common carriers (if emphasizing the 'data transmission' angle), even though they are largely commercial enterprises and not exactly like anything else we had beforehand.

I've already said I think the law needs to be reformed as to its applicability to companies like FB who obviously analyze, exploit, manipulate, mine etc user data, manipulate the user experience to drive engagement, and re-sell the results by offering to micro-target advertising that they refuse to vet. Life on the Internet has changed significantly since the CDA was passed.

It's tricky to deal with the reality that platforms like Facebook/Insta and Twitter and Youtube (aka Google) are pseudo-public spaces of some kind (as recognized in the Trump Twitter block case) and among the most widely used platforms for user expression, but are also exploited by the platform owners in ways that should not be protected.
posted by snuffleupagus at 8:20 AM on January 11, 2020 [4 favorites]


Were I God-Emperor, I'd mostly limit safe-harbor to (almost-)dumb pipes.

A facebook that just blindly shows you what your friends posted in the order or reverse-order they were posted, with at most minimal censorship for blatantly illegal stuff like scanning for kiddie-porn? Safe-harbor.

Facebook wants to decide what to show me? Those words and images become facebook's words and images, because they're not just piping stuff to me, they're actively deciding what to publish to me.

I'd allow various terms of service to fall under safe-harbor, but the terms of service would have to be posted publicly, have some reasonable element of user-neutrality and not be that angry young white men get to do whatever, be actually enforced, and be enforced uniformly. Judge or jury holds/decides that you let a favored user get away with violating your posted TOS because they bring lots of eyeballs? Users' words are your words.

As a secondary issue, I'd establish that a web page is responsible for ads and other sponsored content shown on its page. Your url at the top in any of the top three browsers as modally installed? You're responsible for the ads etc, even if they're being directly provided by some ad network. Malware in it? You're criminally responsible, and maybe have a tort against the ad network or originating ad provider. Political ad or sponsored content funded by a non-US source? You're responsible.
posted by GCU Sweet and Full of Grace at 8:34 AM on January 11, 2020 [11 favorites]
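

(To make the proposed line concrete, here is a purely illustrative Python sketch. Neither function is Facebook's actual code, or anyone's; the point is only the structural difference the safe-harbor distinction above turns on.)

    # Purely illustrative: contrasting a "dumb pipe" feed with a curated one.
    # Neither function is Facebook's real code; the point is only the
    # structural difference the safe-harbor proposal above turns on.
    from dataclasses import dataclass

    @dataclass
    class Post:
        author: str
        text: str
        timestamp: float
        predicted_engagement: float  # hypothetical score a curating platform computes

    def dumb_pipe_feed(posts):
        """Show posts in reverse-chronological order and nothing more.
        Under the proposal, this is where the safe harbor would still apply."""
        return sorted(posts, key=lambda p: p.timestamp, reverse=True)

    def curated_feed(posts):
        """Rank posts by whatever the platform predicts will drive engagement.
        Under the proposal, these editorial choices would make the selected
        posts the platform's own words and images for liability purposes."""
        return sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)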




The dumber the pipe, the safer the harbor is a pretty good design principle.

It could also be less binary — maybe the burden of proof flip-flops past a certain point of platform curation, before the protection is lost entirely. Maybe the duty of care that arises depends upon different factors like platform size, platform transparency, privacy policies, or where the platform derives its revenue from....

Criminal liability for malware shouldn't arise from mere negligence. You'd need at least some kind of recklessness standard, and clarity on what due diligence is required to satisfy it. Practically speaking, the civil liability should be crushing enough. It would be insurable, but that's probably for the best (considering how it works with data breaches currently).
posted by snuffleupagus at 8:49 AM on January 11, 2020 [3 favorites]


The digital commons we all value (at least I thought we did...) requires websites to have some protection from liability for user conduct as common carriers (if emphasizing the 'data transmission' angle), even though they are largely commercial enterprises and not exactly like anything else we had beforehand.

Again, the point that you have been studiously avoiding is that as it stands today Section 230 does not provide merely some liability protection - starting with the horribly tech-ignorant Batzel ruling, it has slowly been twisted into blanket immunity for online service providers. The result is that you get absolutely asinine arguments like "no, Section 230 means that we're not liable for violating the Fair Housing Act for allowing users to post discriminatory housing ads, because those ads are 'user conduct'," which in turn effectively guts the FHA, because now the government has to play a neverending game of whack-a-mole instead of being able to stop the matter at the source.

Nobody is arguing that Metafilter should be held liable for what we post here, and arguing that point is dishonest. What people are pointing out is that a principle of "anything that was touched by users is indemnified" is a blanket policy that covers a wide range of abusive practices that should not be protected.
posted by NoxAeternum at 9:39 AM on January 11, 2020


I am not making the arguments you're ascribing to me, please point out where I support "blanket indemnification" or say Batzel was a good holding. Or that 230 should survive untouched. Otherwise please stop. You're the one being disingenuous here.
posted by snuffleupagus at 10:01 AM on January 11, 2020 [4 favorites]


I am not making the arguments you're ascribing to me, please point out where I support "blanket indemnification" or say Batzel was a good holding. Or that 230 should survive untouched.

You're right - you don't say them explicitly. You just link to the EFF's position on Section 230 (where they openly praise Batzel as a good ruling) and come in arguing that people who are pointing out that Section 230 has become blanket indemnification and want to roll that back are looking to make all online providers completely liable for the conduct of their end users. Someone asked you to point out what you consider to be the parts of Section 230 that should be retained and why, and you responded by calling the question bizarre, instead of an attempt to figure out what parts of Section 230 indemnification you consider important.
posted by NoxAeternum at 10:25 AM on January 11, 2020


Re-framing it as all just 'data transmission' was the weird move there. Otherwise I think I've made my position sufficiently clear, we seem to agree what you're calling a "blanket indemnity" is obsolete, I've suggested some ways the law might be changed, and with that I really don't care to continue this derail.

If you don't like my ideas, maybe put forward your own.
posted by snuffleupagus at 10:42 AM on January 11, 2020 [3 favorites]


Section 230 of the CDA isn't even relevant here, because Facebook's business model and platform are unrelated to operating an environment for the exchange of free speech. Facebook and subsidiaries like Instagram regularly delete content at a whim, only invoking the First Amendment when its management decides to show up at Congressional hearings and justify making revenue from false advertising. There is a clear distinction between publishing content and publishing ads, which is why laws regarding false advertising exist in the first place. The problem is that Facebook has argued that it is above those laws, which seem to apply to every other ad-revenue-generating media entity in the United States, but not to Facebook, for some reason.
posted by They sucked his brains out! at 11:01 AM on January 11, 2020 [1 favorite]


Section 230 is how Facebook argues that it is above the laws regarding publishing ads, by stating that said ads are "user content" under it, and thus they hold no liability for the ads.
posted by NoxAeternum at 11:44 AM on January 11, 2020


What are "the good parts," as you see them, and why do you think transmission of data across networked computers requires liability carve outs that other communications media don't?

The notion that platforms are carriers for user content seems generally sound and practical. The notion that paid advertising should be considered "user content" much less so.
posted by atoxyl at 12:06 PM on January 11, 2020 [1 favorite]


There generally ain't laws against false political advertising, but having a quick look I was surprised to see that some state-level governments have tried them, to not much effect - a 2004 article, made more valuable by predating the current confusion.

I don't think I understand what you mean by that, can you help me out?

That they're giving commands without authority or power. All Facebook has to do is not flinch.
posted by save alive nothing that breatheth at 12:29 PM on January 11, 2020


Nelson: "Is there a good ad blocker for Firefox that blocks just Facebook ads? I use uBlock Origin to block most ads. But Facebook is super aggressive and sneaky about how it embeds its ads and uBlock Origin seems to mostly fail at blocking them.
"

Don't know about Firefox. I use Social Network Adblocker on Chrome, and I'd forgotten that there were ads on Facebook at all.
posted by signal at 3:23 PM on January 11, 2020 [1 favorite]


I took a deeper look at the Facebook-specific ad blockers for Firefox and didn't like any that I saw. Many of them seem sketchy; a few hundred downloads, a year old, etc. I'm using FB Purity for now and so far so good. It's way overkill, it's a whole "let's redesign Facebook by turning off all its features!" But the default setting is mostly just ad blocking and some other minor things, it seems OK.
posted by Nelson at 4:22 PM on January 11, 2020


Federal Election Commissioner Ellen Weintraub had particularly harsh words [tweet] for the company. "Facebook's weak plan suggests the company has no idea how seriously it is hurting democracy," she wrote on Twitter.

Facebook founder Mark Zuckerberg has sat in Congressional hearings in which he was told how seriously his company is hurting democracy, not to mention the innumerable op-eds, tweets and posts on Facebook itself with the same message.

They know perfectly well, and they're fine with it, because it makes them money. They don't deserve the benefit of the doubt.
posted by Gelatin at 10:14 AM on January 12, 2020


Facebook and Google should face the same risk of liability as newspaper, magazine or TV publishers. They claim they can't police their postings but I guarantee you that if they could be sued, they would figure out a way.

That claim is a lie. Facebook polices nudity in its postings (it has been criticized, in fact, for excessive heavy-handedness in doing so), and Google Image Search is smart enough to remove nudity and adult content from search results as well.

I'm fascinated, by the way, by the admission implicit in the claim that banning outright lying is keeping one side from campaigning at all.
posted by Gelatin at 10:21 AM on January 12, 2020 [2 favorites]




The UK Election Showed Just How Unreliable Facebook’s Security System For Elections Really Is. Apparently their Ad Library product is really not working very well.
posted by Nelson at 9:51 AM on January 17, 2020


Bloomberg News: George Soros Says Facebook Is Conspiring to Re-Elect Trump

"I think there is a kind of informal mutual assistance operation or agreement developing between Trump and Facebook,” Soros, 89, said Thursday at the World Economic Forum in Davos, Switzerland. “Facebook will work together to re-elect Trump, and Trump will work to protect Facebook"
posted by They sucked his brains out! at 11:14 AM on January 25, 2020 [1 favorite]




This thread has been archived and is closed to new comments