Public Squares or Private Premises
August 19, 2018 7:43 PM

Alex Jones, the First Amendment, and the Digital Public Square - "How should we challenge hate-mongering in the age of social media?"
"The platforms are media companies whether they admit it or not."
Should platforms be regulated? A new survey says yes. - "8 in 10 Americans agree social media companies should be subject to same rules & regs as newspapers and TV which are responsible for the content they publish."

Social media must curb extremists and abusers - "Action against Infowars is only the start for Facebook and Apple."

Why Twitter should kick Alex Jones and others off its network - "For social networks to function in a civil society, there have to be rules of behaviour. And if legislatures won't set them, social networks must. Both Facebook and Twitter already allow or ban content based on rules they make up. There is little by way of public oversight on such rules, other than the bare minimum legal standards of defamation or prosecutable hate speech."

Twitter's Misguided Quest to Become a Forum for Everything - "An online community at this scale has perverse and dangerous consequences."

The Internet is not a Public Park - "These are private platforms that open their doors to all of us."

A Lockean Theory of Digital Property - "Use-rights appropriation isn't just Marxist bullshit; John Locke's labor theory of property also applies: An individual owns their own labor, and has a right to any mixing of their labor with property. I don't know about Alex Jones, but I put a lot of work into my Twitter account! Nine years of careful supervision, to be exact. Witticisms needed my re-broadcasting, funny memes my endorsement. I tilled the soil, sowed the seeds, trimmed the flowers, watered the weeds. Today I have 20,000 followers. They may not matter to Twitter, but they matter to ME. If Alex Jones worked as hard on his accounts as I have on mine, I daresay he's earned the right to continue using his plot of virtual real estate."

Inside Twitter's Struggle Over What Gets Banned - "Mr. Dorsey stroked his beard and nodded."

Alex Jones and the Bigger Questions of Internet Governance - "The question remains: how should they govern the speech of their users?"

Social Media Giants Shouldn't Be Arbiters of Appropriate Speech - "It's implausible to imagine a future in which liberal activists don't demand that right-of-center groups be de-platformed."

Speech and Power - "The answer is take power away from the platforms so that competition is possible. Rather than antitrust, which is an industrial age tool, I would like to see the large platforms become programmable via APIs... It would be a mistake though to think that these kind of systems will automatically lead to good outcomes for speech (or for anything else)."
"The concluding chapter of Neil Postman's Technopoly was titled 'The Loving Resistance Fighter'. It featured a list of conveniently tweet-sized characteristics of those who resist American Technopoly."
also btw...
-Who can we trust?
-Automattic and speech
-In Support of Free Speech
-The Tech Backlash We Really Need
-What all those US newspapers are getting wrong
-How to Discuss the Far Right Without Empowering It
-Why I'm Not Worked Up About "Fake News" And Why I Am
-Analysis: Facebook doesn't have an obligation to support journalism
posted by kliuless (92 comments total) 50 users marked this as a favorite
 
Once again, consider pornography. Porn is protected speech (in the US at least), yet most major social media platforms have restrictions on porn. If we agree that it’s good and right for social media platforms to restrict porn—and I think it is, despite being 100% pro-porn—it’s not a leap to say that restricting hate speech and extremism is also good and right.

To put it another way, why is it okay to ban ISIS accounts but not Neo-Nazis?
posted by SansPoint at 8:03 PM on August 19, 2018 [66 favorites]


To put it another way, why is it okay to ban ISIS accounts but not Neo-Nazis?

Angry white men are not like other people, they're white men.

Otherwise, I think your analogy stands.

kliuless, thanks for this post and the collection of links!
posted by filthy light thief at 8:15 PM on August 19, 2018 [14 favorites]


This CNN interview with @jack made me sick.
posted by glonous keming at 8:18 PM on August 19, 2018 [1 favorite]


He really is an extraordinary dipshit.

So he is thinking about how to help users follow topics and hashtags, not just people.

No Jack, fuck the hell off. Give me the people I follow in a chronological timeline and not some half assed algorithmic bullshit trying to sell me the Nazi point of view.

Then there’s the dumb shit about being left leaning, which... where? When??? Best interpretation is he actually means “libertarian”, which in this context means pro-Nazi.
posted by Artw at 8:28 PM on August 19, 2018 [10 favorites]


It was hard for me to truly appreciate how awful Twitter is until I started using Mastodon in earnest. All the negativity? Just... gone. Poof. I want to take Twitter users by the lapels and tell them "it doesn't have to be this way!"
posted by AFABulous at 8:50 PM on August 19, 2018 [3 favorites]


If the government is not suppressing his speech, this is not a First Amendment issue. What we write and say in other forums is subject to consequences. For instance, I have the right to say, "Fuck Nazis and all the white supremacist Alt-right pieces of shit that they spawned," and maybe somebody would be offended. Which would be just fine with me.
posted by Chuffy at 9:14 PM on August 19, 2018 [4 favorites]


Censorship has always been the coward's tool for keeping out inconvenient realities to fit a deceptive narrative.

I do not like Jones. I never understood the Right and Left's obsession with him because he isn't even fun, but when you run away -- and censorship is a form of running away -- you are less informed of the true landscape out there. I would rather have a Neo-Nazi tell me to my face what he thinks about me than sweep it under the rug and pretend he doesn't have a knife to my back when it is turned to him. I like to be informed of who I am dealing with, and the more they say things, the better I understand their strategies and thinking patterns so I know how best to proceed.

People suddenly want to control the flow of opinion, but you cannot do that. You are building a wall to keep an element you do not like out of your backyard, even if it is also their backyard.

Big Tech can ban these people, and they will create another venue, and the act of organizing will make that sect stronger, not weaker. You are not going to make them go away, shut up, change their minds, or break to your will. It is the same as if you hide a giant pit filled with snakes under flowers so you do not have to face the reality that there is a pit filled with snakes, and then someone else walks right into it.

Big Tech is merely proving they are so frail that they have to build a fortress because their own preferred ideologies fail in comparison. Really? I have had to deal with bigots and sexists my entire life, and I have never had a problem letting them spew because they can neither move me nor stop me. I was four-years-old when I told off men like Jones on my own initiative. I kicked people like that in the shins despite grave parental disapproval.

We are witnessing the decline of Big Tech as a power, and it is baffling to think that Jones is that much of a threat to them. They have just elevated and legitimized him as being stronger than they all are put together, and that is his only draw. He has gotten real importance and all he needs to do is tweak a few uppity noses, and now he is seen as a force so powerful that billionaires have to build a snowflake shelter for weaklings. They just paid a big compliment to Jones, and could not be more insulting to his detractors by infantilizing them and serving as their self-appointed nannies. Big Tech must be completely dysfunctional and in a panic right now.
posted by Alexandra Kitty at 9:16 PM on August 19, 2018 [5 favorites]


I dunno, the easiest way to think of it is that Dorsey is a scumbag, so if he disagrees with something then the odds that thing is the best goodest has just gone up by 15-35%.

...just trying to normalize the idea that Dorsey is a scumbag. In the perfect world, every time his name gets mentioned there'd be a voiceover "Noted scumbag Jack Dorsey says....".
posted by aramaic at 9:24 PM on August 19, 2018 [8 favorites]


The thing with the porn analogy is, dear god furry twitter. I don't ask to see this stuff, I don't find much of it interesting. I don't find it offensive, but jeebus, it is just THERE so much.

This is mostly a result of twitter's decision to show the likes of who one is following. If they were to remove that one thing, showing the likes of people in the "following" category, they could cut back on not only the unwanted furry porn but probably also a lot of the bullshit that ends up getting spread around such as (actual) Fake News and other shit that people aren't brain filtering properly.
posted by hippybear at 9:25 PM on August 19, 2018 [4 favorites]


Big Tech must be completely dysfunctional and in a panic right now.

Having worked in Silicon Valley for a very long time, I doubt that Big Tech is panicked at all about this. Big Tech is treating this like it treats security threats, by playing whack-a-mole. Twitter is not going to give up its biggest advertiser, Orange #45, by doing anything of consequence against his conspiracy buddies and good ol' white boys. Jack Dorsey is in their camp. It's part of the technocratic libertarianism that is more prevalent than people realize...
posted by Chuffy at 9:28 PM on August 19, 2018 [10 favorites]


(I mean, the furry porn would still exist, but it wouldn't be in my timeline when I'm not seeking it out. Likewise, if likes were cut out, fake news would be greatly reduced in feeds (retweets would still exist), since it wouldn't spread unless it were being sought out.)
posted by hippybear at 9:28 PM on August 19, 2018


Lists still seem to show me all the people I follow in order with nothing else, but I imagine really publicizing that would change it quick.
posted by Small Dollar at 9:43 PM on August 19, 2018 [4 favorites]


Censorship has always been the coward's tool for keeping out inconvenient realities to fit a deceptive narrative.

If that's always the case, then doing it for any other purpose must not be censorship, correct?

So, for example, if someone was accusing Sandy Hook parents of faking their children's deaths, and that someone were removed from a platform, it would only be censorship if that accusation was an "inconvenient reality?" If, on the other hand, that accusation was completely fraudulent, their removal would not be censorship? That seems to be the logical conclusion of what you're saying.
posted by RobotHero at 9:45 PM on August 19, 2018 [50 favorites]


with power comes responsibility yadda yadda yadda. But seriously. That is where we are. The big social media companies have a shit ton of power right now, but seem to utterly lack a plan for seeing that it's used responsibly. The argument that these companies have become effectively utilities and should be regulated as such strikes me as worth exploring, except I have to wonder, who exactly does the regulating?
posted by philip-random at 9:56 PM on August 19, 2018 [3 favorites]


 I would rather have a Neo-Nazi tell me to my face what he thinks about me than sweep it under the rug and pretend he doesn't have a knife to my back

I preferred it when Nazis were shamed and/or censored off every public platform. When they were denied opportunities to recruit, there were fewer knives at my back.
posted by justsomebodythatyouusedtoknow at 9:59 PM on August 19, 2018 [112 favorites]


philip-random: it would be the FCC, except not this FCC because this is not the FCC that reflects actual US values (which government should). It is the FCC that reflects populist rhetorical values, and so we'll have to wait a while. Sadly.
posted by hippybear at 10:04 PM on August 19, 2018 [2 favorites]


If, on the other hand, that accusation was completely fraudulent, their removal would not be censorship? That seems to be the logical conclusion of what you're saying.

I find the news stories denying the reality of global warming to be the equivalent of shouting fire in a crowded theater, or, rather, shouting that a theater is not on fire when it is. The issue is a little more complex than that, but not much.
posted by xammerboy at 10:05 PM on August 19, 2018 [16 favorites]


Censorship is the government saying you can't say a thing. A corporation saying you can't say a thing is a denial of a soapbox, but is not censorship.

If/when we regulate Faceboot and others like we do television, that would be censorship. This is not that.

The conversation about the role of the nation-state vs the role of the giant multinational corporation in who writes the rules is one which is only just beginning. #ShadesOfRollerball
posted by hippybear at 10:09 PM on August 19, 2018 [4 favorites]


Who exactly does the regulating?

The fairness doctrine of the United States Federal Communications Commission (FCC), introduced in 1949, was a policy that required the holders of broadcast licenses both to present controversial issues of public importance and to do so in a manner that was—in the FCC's view—honest, equitable, and balanced. The FCC eliminated the policy in 1987 and removed the rule that implemented the policy from the Federal Register in August 2011. = Wikipedia

Now, the fairness doctrine was pretty weak tea, but it was something. If the FCC found you weren't being honest about an issue of public importance, they could fine you. That was enough for most media outlets to make a good faith effort.
posted by xammerboy at 10:09 PM on August 19, 2018 [15 favorites]


This is mostly a result of twitter's decision to show the likes of who one is following. If they were to remove that one thing, showing the likes of people in the "following" category, they could cut back on not only the unwanted furry porn but probably also a lot of the bullshit that ends up getting spread around such as (actual) Fake News and other shit that people aren't brain filtering properly.

I agree 100%. This would also solve the problem of strangers getting "in your mentions" because your posts are being broadcast who knows where without anyone actively retweeting them.

People seem to tend to "like" simple, strident political statements they agree with as a show of support, which is cool. People posting those things on Twitter is also fine, though I probably won't follow people who do too much of it. But sometimes I open Twitter and basically just see a mess of strangers yelling political slogans that I mostly agree with, then themselves getting yelled at by random trolls and hairsplitters.

I don't have any animosity toward people for posting political truisms. Some of them are activists doing their jobs, knowing the news media grab quotes from Twitter. Some of them are just frustrated people looking for a place to vent, and that's cool too. But I don't want it in my feed unless I decide to seek it out or someone I follow decides it's worth posting or retweeting.
posted by smelendez at 10:11 PM on August 19, 2018 [2 favorites]


The fairness doctrine of the United States Federal Communications Commission (FCC), introduced in 1949, was a policy that required the holders of broadcast licenses both to present controversial issues of public importance and to do so in a manner that was—in the FCC's view—honest, equitable, and balanced. The FCC eliminated the policy in 1987 and removed the rule that implemented the policy from the Federal Register in August 2011. = Wikipedia

Now, the fairness doctrine was pretty weak tea, but it was something. If the FCC found you weren't being honest about an issue of public importance, they could fine you. That was enough for most media outlets to make a good faith effort.


Here's the thing: The Fairness Doctrine, as you describe it, scares the shit out of me. Why? Look who's in charge, that's why.

It's far too vulnerable to being a political football. It relies on good faith governance to be effective. Good faith governance is a pipe dream. Probably always has been.

And it's not as if US media is without its own fairness guidelines. The American press is lousy with fairness. And it's partly responsible for the shitty state of the union we are all enduring today. When one side is serious, and the other side batshit insane, you don't strive to present balance. In striving to be honest, equitable, and balanced, American news media helped create a monster by presenting vileness, dishonesty and hate as simply one side of the politic, as valid as any other opinion.
posted by 2N2222 at 11:01 PM on August 19, 2018 [8 favorites]


"How should we challenge hate-mongering in the age of social media?"

Punching? Punching is still an option, right?
posted by Ghidorah at 11:19 PM on August 19, 2018 [13 favorites]


I was four-years-old when I told off men like Jones on my own initiative. I kicked people like that in the shins despite grave parental disapproval.
In the five years since Noah Pozner was killed at Sandy Hook Elementary School in Newtown, Conn., death threats and online harassment have forced his parents, Veronique De La Rosa and Leonard Pozner, to relocate seven times. They now live in a high-security community hundreds of miles from where their 6-year-old is buried.

“I would love to go see my son’s grave and I don’t get to do that, but we made the right decision,” Ms. De La Rosa said in a recent interview. Each time they have moved, online fabulists stalking the family have published their whereabouts.
(from https://www.nytimes.com/2018/07/31/us/politics/alex-jones-defamation-suit-sandy-hook.html)

Whose shins would you propose this family kick? And are you under the impression that without a figure like Jones operating on a platform that gives him a global reach, this family would still have been forced to flee the community where their child was buried?
posted by praemunire at 11:40 PM on August 19, 2018 [76 favorites]


The Twitter building in San Francisco was host to a Republican election night (victory) party in 2016. Let’s not take it as a foregone conclusion that if the company started engaging in censorship, it would be of the flavor expected by those on the left. Sticking to a clear set of rules makes it a stable and predictable platform for ALL.

I have yet to look in depth at Twitter’s rules and enforcement, but assuming those two things are being managed reasonably, we have a platform where discourse between different points of view can occur, and people’s minds can be slowly changed. Rather than funneling people into separate echo chambers where they are constantly under scrutiny to make sure they are displaying enough in-group behavior, and thus they don’t get a forum to work out some of their more individualistic thoughts. This discourse across differing viewpoints is a huge amount of work, but luckily there are muting options for when it’s just not worth it. When it does happen, seeds get sown in people’s minds, even if they react negatively at the time.

As long as we’re teaching kids critical thinking skills, and people are given tools to be able to protect themselves, I’m ok with leaning towards low censorship online. Social media is too decentralized for authoritarian censorship to work well. Metafilter is an example of just how much work a moderated approach takes, and that’s for a small self-selected group of people speaking a single language.

Bots are a different matter, because they can descend in arbitrarily large packs, and there is no mind behind them that could change with exposure. I don’t think bots have much place on social media, especially for platforms that want to avoid becoming graveyards (like my email inbox, full of unread junk mail of vague interest to me).
posted by mantecol at 12:03 AM on August 20, 2018 [3 favorites]


I have yet to look in depth at Twitter’s rules and enforcement, but assuming those two things are being managed reasonably

That's... quite the unwarranted assumption. Like, I could go over to the Megathread and make the claim "I have yet to look into the current administration's immigration policy in depth, but assuming refugees are being treated humanely...", and it'd probably go about as well.

As long as we’re teaching kids critical thinking skills, and people are given tools to be able to protect themselves
While we're assuming other things that aren't happening, can we assume miniature shock collars that infallibly zap someone when they send off a death threat?

Like, I get the urge to wade in, but when you're making that many "I haven't actually looked into what's going on, but I'm going to assume this theoretical basis and decry any reaction more intense than my own as authoritarian censorship" statements, it's a good spot to catch yourself and go "Am I responding to things from a well-founded basis here?"
posted by CrystalDave at 12:19 AM on August 20, 2018 [41 favorites]


Jones has about as much right to lie and threaten innocent victims of mass atrocities on Twitter as he does to waltz into your living room and scream at you about how your dead grandparents were actually serial rapists. I.e., none. This isn't censorship. He still has the right to lie and lie, he just doesn't have the right to do it on private property. Shutting his stupid face up doesn't make him stronger. The only thing that makes him stronger is happily providing a bigger megaphone for his poison.
posted by 1adam12 at 12:52 AM on August 20, 2018 [18 favorites]


As a non-twitter user, I had no idea about the unwanted furry porn thing. If an app on my phone showed me furry porn at random I'd delete it pronto! Why would anyone keep using it?
posted by adept256 at 1:07 AM on August 20, 2018


Day 12 of not having a twitter account. Tacos and whiskey remain delicious, and maybe even taste a little better.

And here's the kicker, this post right here reminded me who Alex Jones was. I legit forgot he existed! And I'm going to go to bed and probably forget about him all over again.

This can be your life!
posted by East14thTaco at 1:56 AM on August 20, 2018 [16 favorites]


I don't think Twitter/Facebook/Googleplus/Reddit/etc are public spaces. They're highly centralized online loci where implicit rules and institutions (which turn out to be highly conducive to mass propaganda, hatemongering, cyberbullying, and general shittiness) are policed by the megacorporations. They're private services paid for by your free labour and data, where you not only are the product, but also the consumer of the product, "like a hot dog putting ketchup on itself."

(Edit: CW: Rob Horning)

(The linked article is well worth reading IMO. Takeaway points: online identities are commodities, perhaps the closest to Marx's formulation in human history. The corporate-owned mainstream social media are, ideologically, authenticity machines. They seduce us in such a way that we keep on optimizing ourselves, the self-commodity, making ourselves ever more liquid, ever easier to be consumed, ever more complicit in the inner workings of the machinery. The article was mainly about "authenticity", but it's the commodity picture I find useful, and it's interesting how the internal logic of social media makes it an impulse-amplifier, the best thing for hatemongering. They're definitely not a public space. They're conveyor belts in the factories owned by Google, Twitter, Facebook, the oligarchies. We're the product, the consumer, the unpaid worker, and the lubrication oil in the gears.)
posted by runcifex at 4:03 AM on August 20, 2018 [12 favorites]


I can't forget about Alex Jones because I first came across him in the late 90s when he was still a fringe wacko conspiracy theorist mainly found on incredibly shitty AM stations and he was most famous for his ranting about Bohemian Grove and Bush the Elder's ties to Satanism and the Bilderbergers through said ritual gathering.

Back then, the vast majority of the people who knew about him thought of him as completely off his rocker and listened for the laughs. I think that his influence is less change in society and more a combination of him moderating somewhat and the legitimization of people like Limbaugh, Hannity, and Beck, who appear superficially to be more grounded in reality (and more so 10-15 years ago), falling into believing their collective persecution fantasy and slowly becoming more and more unhinged, acclimating people to the muddy thinking and ridiculous leaps of "logic" that prime them to accept the complete schizoid hallucination that is Jones' professed reality.

Like all good cult leaders, he has convinced his otherwise healthy followers to skip on down the path to madness right along with him, despite their not laboring under the same underlying ailment themselves. With every passing day the skill set necessary to combat the disease shifts farther and farther away from the one possessed by our political leaders who still inhabit our shared reality. Inspiring oration and old-fashioned politicking and even grass roots organizing aren't useful here.

What is needed to combat the reactionary right wing that is rapidly dissociating itself from the few tenuous threads keeping it connected with reality are the techniques of cult deprogramming, not the usual social and political tools. Pizzagate alone was proof enough of how far their disease has progressed. From the outside it wasn't obvious they were quite that far gone and sadly very few fully realized the implications at the time.

For far too long, cynical people have been funding this insanity because it's good for business if you cater to goldbugs, preppers, and gun nuts. When you understand that they literally believe that all kinds of disasters and literal massacres are happening, widespread confiscation of guns is underway, massive prison camps are being built, and all the other shit Jones and his money-grubbing hangers-on are pushing, it becomes a lot easier to understand why the people who have been sucked into this cult are acting the way they are.

It will only get worse as long as the people who fund the noise continue to push the beneficiaries of their largesse further and further off the deep end to ensure that people continue to feel like they are in imminent danger of their worst nightmares coming true and keep buying what the advertisers are selling. Social media has, of course, expanded the reach of their messaging and thus the size of the cult immensely. On the bright side, there's no reason to think that people are any more susceptible to the bullshit, so there is some upper limit to their growth. Unfortunately, that number is very large even if the percentage remains small thanks to the Internet's wide reach.
posted by wierdo at 4:12 AM on August 20, 2018 [11 favorites]


People tend to think of social media as either a kind of digital public square where differing opinions are shared and discussed, or a digital private clubhouse where people of like mind can collaborate. Both of these analogies fall short by missing the fact that the venue itself is a party to the interaction, and is attempting to extract profit from it. They do this by manipulating the users to stay online longer, to come back more often, and to keep scrolling down. This is more insidious than content moderation or even censorship. Twitter is saying that it is okay to manipulate the information you receive, to manipulate your political beliefs and your worldview, but only in the service of profit, and not social good.
posted by Nothing at 4:15 AM on August 20, 2018 [22 favorites]


Oh, good lord, the hand-wringing over poorly- or even incorrectly-named "censorship" is tiresome these days. And the argument that banning these fascists only makes them stronger and how targets of abuse and harassment and doxxing need to engage with them lest they somehow gain power keeps on getting proven wrong again and again:

You Can’t Stay Here: The Efficacy of Reddit’s 2015 Ban Examined Through Hate Speech (PDF):
Looking at the causal effects of the ban on both participating users and affected communities, we found that the ban served a number of useful purposes for Reddit. Users participating in the banned subreddits either left the site or (for those who remained) dramatically reduced their hate speech usage. Communities that inherited the displaced activity of these users did not suffer from an increase in hate speech. While the philosophical issues surrounding moderation (and banning specifically) are complex, the present work seeks to inform the discussion with results on the efficacy of banning deviant hate groups from internet platforms.
Sarah Manavis: No-platforming on Twitter can work – and Baked Alaska proves it
Immediately after his banning, searches for Gionet’s name drastically dropped, despite 18-months of growth thanks to Pepe, Trump, and Charlottesville. His YouTube channel, despite his best efforts, failed to take off – at least when compared to the level of his alt-right counterparts (such as Alex Jones himself.) He was even banned from live-streaming on the video hosting site for hate speech. And, worst of all for Gionet, news outlets stopped writing about him. That was, aside from occasionally checking-in to mock him for how badly his career post-Twitter banning was going.

The truth was that, after being no-platformed by Twitter, Baked Alaska became irrelevant. And not just irrelevant in the mainstream, but an increasingly irrelevant voice even in the niches of political discourse.

Nor is Gionet an isolated case. Milo Yiannopoulos, although admittedly more famous than Baked Alaska ever was, is undeniably a diminished figure since being removed from Twitter on 15 November 2017. There’s been little news about him since his now infamous editor’s notes were published slamming his then cancelled book back at the end of 2017, aside from being mobbed out of a bar back in April. Similar things have happened to white nationalist Richard Spencer, who was famously punched in the head at Trump’s inauguration. Although Spencer is still on Twitter, he had his verification stripped, and has since been disowned by mainstream news sources as well as his alt-right counterparts.
Geraldine DeRuiter: What Happened When I Tried Talking to Twitter Abusers
There’s a lot of discussion about how we need to reach out and talk to people who disagree with us – how we need to extend an olive branch and find common ground – and that’s a lovely sentiment, but in order for that to work, the other party needs to be … well, not a raging asshole. Insisting that people continue to reach out to their abusers in hopes that they will change suggests that the abuse is somehow in the victim’s hands to control. This puts a ridiculously unfair onus on marginalized groups – in particular, women of color, who are the group most likely to be harassed online. (For more on this topic, read about how Ijeoma Oluo spent a day replying to the racists in her feed with MLK quotes – and after enduring hideous insults and threats, she finally got exactly one apology from a 14-year-old kid. People later pointed to the exercise as proof that victims of racism just need to try harder to get white people to like them. Which is some serious bullshit.)

I spent days trying to talk to the people in my mentions who insulted and attacked me. I’d have been better off just remembering that when someone shows you who they are, believe them the first tweet.
posted by zombieflanders at 4:17 AM on August 20, 2018 [58 favorites]


sometimes I open Twitter and basically just see a mess of strangers yelling political slogans that I mostly agree with

I ended up leaving all social media because I found the behavior of the people I tend to agree with distressing. Social media is a machine whose purpose is to turn educated, kind, sane people into anything on the spectrum between "mild dick" to "raging self-important asshole" for the profit of big tech companies. That anyone has been able to use it for anything else has been sheer good luck but luck runs out. And I have better things to do than watch the Left tear itself to shreds in a public forum.
posted by eustacescrubb at 4:39 AM on August 20, 2018 [9 favorites]


Some of the arguments from the articles in the FPP are hilariously poorly thought through.

"If Alex Jones worked as hard on his accounts as I have on mine, I daresay he's earned the right to continue using his plot of virtual real estate."

Mm-hmm. And if a renter paints some walls and mows the lawn, he now has an inalienable right never to be evicted or have a contract renewal refused. Street artists tagging subway trains ought to be able to take them out of service for gallery display, never mind obviously having a right to not have their artwork destroyed, or the train cars ever decommissioned (they have earned the right to keep enjoying that piece of mobile real estate).

Bullshit.
posted by Dysk at 4:46 AM on August 20, 2018 [23 favorites]


Big Tech can ban these people, and they will create another venue, and the act of organizing will make that sect stronger, not weaker. You are not going to make them go away, shut up, change their minds, or break to your will.

The act of organising will make them smaller, not bigger. They lose access to the network effects that sustain Twitter and Facebook in the first place, and which is what currently allows those communities to grow. Shunt them off onto their own forums. Stormfront has existed for a long time now, and was never in danger of recruiting a meaningful plurality of people, like nazis and white supremacists on Twitter and Facebook have managed in a few short years. There, a hard core will indeed remain, but their wider support will wither and die - or rather, it will never follow or join them in the first place, if they aren't on Twitter and Facebook.

We don't need to make them shut up or go away. We just need to stop handing them megaphones, stop giving them TV spots, stop handing them the spotlight. Nobody has an inherent claim or right to the spotlight. Just stop actively helping them. That will be enough.
posted by Dysk at 5:00 AM on August 20, 2018 [36 favorites]


Big Tech can ban these people, and they will create another venue, and the act of organizing will make that sect stronger, not weaker. You are not going to make them go away, shut up, change their minds, or break to your will.

Creating their own little nazi venue is great! The entire point here is to deny them access to the mainstream. The stated strategy of several white supremacist/nationalist/aw hell just say nazi leaders is to play it cool, "hide their power level," and slowly edge people further and further right until they're ready to join up or at least tolerate the full breadth of nazi ideas.

On Facebook and other mainstream platforms, their messages and propaganda can get out to everyone because most people believe that "Facebook/Twitter/Reddit is a place for everyone." On their own little site, the propaganda only gets to people who sign up for an account and regular people won't sign up for an account because "that's a place for nazis and I'm not a nazi."

We can argue over where exactly to draw the line for who gets to be mainstream and who doesn't. I'm pretty comfortable with putting Alex Jones on the "doesn't" side.
posted by Grimp0teuthis at 5:44 AM on August 20, 2018 [16 favorites]


I can't help but think that asking for fash removal (welcome as it would be) on Twitter is committing the error of letting @jack decide what is acceptable. That has already failed. He has already failed. He has no idea what dangerous threats look like. An idiot can figure out how to game the flagging system, and they frequently do. @jack is too rich and too stupid to care.

Short term, yeah, ban fash from FB and Twitter. Post-Trump, find a way to remove that responsibility from tech bros permanently. They (we!) can't and shouldn't be trusted. Basically, not only the workers, but also the products need to control the means of production.

Fediverse is getting there in terms of technology, but it still needs a repeatable governance model for instances.
posted by Wrinkled Stumpskin at 5:47 AM on August 20, 2018 [2 favorites]


Furthermore, Facebook and Youtube and all them... they aren't passive forums for debate, they are actively designing their sites to drive engagement, to foreground some things and suppress others. They are paying tons of money to hire some really skilled people to develop algorithms that will direct their users to whatever they want them to see.

So these companies are active participants in this process. They are responsible for spreading this stuff. When youtube's search puts white supremacist propaganda above the song I was actually looking for, they are responsible.

And I think one of the biggest dangers of machine learning/AI is not what it can do itself, but that it lets people disclaim responsibility by gesturing towards an algorithm and claiming the result was decided by an objective mathematical process. Well, it wasn't. These are black-box processes, which is not necessarily a bad thing in and of itself, but it does mean the only thing you can judge them by is their output. Looking at the output is how the engineers designed them: they fiddled with things and changed the math and parameters until the output looked "right." So if Facebook or YouTube or Twitter is driving engagement to these people at all... that's the result of decisions by them that deemed that content acceptable and "right."
posted by Zalzidrax at 5:53 AM on August 20, 2018 [9 favorites]


These social media sites are already practicing moderation of threats and harassment. A person like me can't get away with inciting violence against my political opposition, much less federal prosecutors. But they have liberally bent the rules for their most popular members, on the grounds that if you pass a fuzzy threshold of followers it's "political speech" or "entertainment."

Tolerance of harassment becomes a form of censorship, forcing people who can't or don't want to do the work of handling massive slush-piles of personal attacks and threats off of those systems.
posted by GenderNullPointerException at 6:01 AM on August 20, 2018 [7 favorites]


The fact that this debate exists at all is a sign, to me, of how far the Internet has fallen from its original open web ideals to a domain of loosely-connected captive portals.

Access to the Internet itself should probably be considered a human right, and that includes the ability to host your shitty neo-Nazi website from a server running in your closet. But, as Dysk says, open access to the public sphere does not extend to your right to be handed a megaphone and a captive audience, which is what algorithm-driven social media has done. There's a huge amount of space between "can't use the Internet at all" and "the corporate giant that controls what half the eyeballs on the planet get to see must continue to promote my content to those eyeballs".

Just to set out the stakes on my personal position: TLD registrars should probably be required to give out DNS names regardless of content. I'm ambivalent about whether search providers should be required to index content they don't approve of. But media platforms should not be required to host hate speech.
posted by tobascodagama at 7:51 AM on August 20, 2018 [13 favorites]


So these companies are active participants in this process. They are responsible for spreading this stuff. When youtube's search puts white supremacist propaganda above the song I was actually looking for, they are responsible.

YouTube's algorithm has found that people who are interested in a subject tend to click on follow-up suggestions that present edgier material. Watch a YouTube video on dieting, and pretty soon you'll be served videos for the Paleo diet. Watch the video "What do Muslims believe?" and pretty soon you'll be served up ISIS recruitment videos. A lot of kids use YouTube like many older people use Google, that is, to answer basic questions like "Why is the sky blue?" etc. This presentation format creates parity between highly dubious content, like the 9/11 conspiracies, and basic factual information.
posted by xammerboy at 8:00 AM on August 20, 2018 [4 favorites]


Watch the video "What do Muslims believe?" and pretty soon you'll be served up ISIS recruitment videos.

It's far more likely that you'll be served up a bunch of right-wing "Muslims are animals destroying white culture and this is why we must eliminate them from the planet" hate-speech videos than any ISIS propaganda.
posted by zombieflanders at 8:04 AM on August 20, 2018 [12 favorites]


Here's the thing: The Fairness Doctrine, as you describe it, scares the shit out of me. Why? Look who's in charge, that's why. It's far too vulnerable to being a political football. It relies on good faith governance to be effective. Good faith governance is a pipe dream. Probably always has been.

Here's the thing though: I remember when the FCC fined media for fairness reasons, and, while problematic, the state of affairs with regard to partisan news and outright fabrication presenting itself as fact has exploded since. The internet is a cesspool of partisan viewpoints and misinformation. If part of the dream of the internet was that it was going to educate people and make them better informed, then it's failed - miserably.
posted by xammerboy at 8:08 AM on August 20, 2018 [8 favorites]


Whose shins would you propose this family kick? And are you under the impression that without a figure like Jones operating on a platform that gives him a global reach, this family would still have been forced to flee the community where their child was buried?

Ultimately, the problem of the actual threat of the fash (and other crazies) in AD2018 is the problem of a failure of the rule of law, and enforcement of that law. We are not going to get any relief, ultimately, until we solve the structural problems that allow these people to flourish with impunity.

How many death threats, rape threats, etc, get released across social media every day? Now how many of those are taken seriously, how many of those get police involvement? If they do get police involvement, how likely are they to be taken seriously and an actual case opened and real investigative time spent? People claim it's an issue where the police don't have the technical skills to track this stuff - but I guarantee you if it was a murder of a 'prominent' citizen, they would somehow come up with them. The things these people do are already against the law, but these laws are not being enforced.

When a family flees their community like that, it is because they believe, on a fundamental level, that the law cannot and will not protect them. That is, at its heart, a breakdown of the rule of law. And if they're correct - if the law can't and won't- then who is it for and what is it doing?
posted by corb at 8:10 AM on August 20, 2018 [13 favorites]


A large part of it IMNSHO is that the very design principles on which mass social media are based are completely wrong. The global panopticon really doesn't help you get your voice out, since you're one in a million. It does make you searchable for anyone who wants to pick a fight about your job, your politics, your interests, or your various cultural identities. And those fights are now a public referendum where people compete for the wittiest ways to play "gotcha" for likes and reblogs.
posted by GenderNullPointerException at 8:23 AM on August 20, 2018 [6 favorites]


How many death threats, rape threats, etc, get released across social media every day? Now how many of those are taken seriously, how many of those get police involvement? If they do get police involvement, how likely are they to be taken seriously and an actual case opened and real investigative time spent? People claim it's an issue where the police don't have the technical skills to track this stuff - but I guarantee you if it was a murder of a 'prominent' citizen, they would somehow come up with them. The things these people do are already against the law, but these laws are not being enforced.

I really feel strongly that police shouldn't be spending any extra time investigating internet threats on social media unless they're really, genuinely actionable. Social media bullying and threats are a social phenomenon that isn't going to go away by criminalizing it. To get a criminal conviction, you need to prove 'beyond a reasonable doubt' that a person committed a crime. This puts police and prosecutors in a difficult position because, honestly, how do you dedicate the resources to do that in the case of social media threats? People claim that the police don't have the technical skills to track this stuff because on average it's true - Network Forensics is a very specialized field.

When a family flees their community like that, it is because they believe, on a fundamental level, that the law cannot and will not protect them. That is, at its heart, a breakdown of the rule of law. And if they're correct - if the law can't and won't- then who is it for and what is it doing?

This is why Twitter should grow a backbone and throw Alex Jones and other conspiracy theorists off their platform. Fundamentally, Alex Jones has a right to speak his abhorrent views (within some boundaries) but Twitter doesn't have any requirement to help spread them. Baked Alaska and Milo are two excellent examples of prominent right wing nuts whose visibility dropped to zero when Twitter banned them, and I really believe the same thing would happen to Jones if they dropped the hammer on him.

But they need all the eyeballs they can get to make their shareholders happy, so they'll never do it. The sooner Twitter implodes the better.
posted by Fidel Cashflow at 8:42 AM on August 20, 2018


Processing that out a bit more: The assumption that "I'm interested in finding similar interests" drives more activity than "I'm interested in picking a fight" seems to be demonstrably false as a general design principle for the internet, and was long before the current generation of mass social media exploded. It's not the case that assholes outnumber doves, but, given a commons, assholes will take much more than their share absent moderation or structural rate-limiting tools.
posted by GenderNullPointerException at 8:42 AM on August 20, 2018 [4 favorites]


Re: "The Internet is Not a Public Park" I've heard first-hand stories about how public parks in the midwest have historically been (and some still are) unsafe for black families due to harassment and threats from white patrons. Tolerating harassment is an easy way for people in power to say "not me" when it comes to discrimination.
posted by GenderNullPointerException at 9:03 AM on August 20, 2018 [6 favorites]


Big Tech can ban these people, and they will create another venue, and the act of organizing will make that sect stronger, not weaker.

Theoretically, the most active and vicious members of the Nazi-esque crowd could get thrown off Twitter, and gather their skills and make Their Very Own Nazi Place Online, and own their servers so no advertisers or social pressure could pester them. They could even use open-source tools to make TVONPO.

But I don't see a Nazi version of AO3 showing up anytime soon - because they are, for the most part, selfish, lazy people who can't cooperate toward a goal if it requires more than shouting. (They're too angry at the idea of the "undeserving" benefiting from their work, to cooperate toward a goal where the benefits are shared.) They want the inertia of shared hate to reshape the country without having to do the work of building a community and maintaining the infrastructure that allows it to grow. They don't want to make a place for their message and those who are interested in it; they want a megaphone to blast their message at the rest of us.

If they're shut out of Youtube and Facebook and Twitter, blocked from distributing content on Apple and Spotify, can't register Wordpress sites for their content - they'll shriek about oppression; they won't get together and make themselves a website. They don't want a community of their own; they want a privileged spot in our communities.

Banning them doesn't make them stronger because they don't actually have a "them;" it just looks like it because we have echo-chamber social platforms.
posted by ErisLordFreedom at 9:03 AM on August 20, 2018 [12 favorites]


"Power, money, persuasion, supplication, persecution--these can lift at a colossal humbug--push it a little--weaken it a little, century by century, but only laughter can blow it to rags and atoms at a blast. Against the assault of laughter nothing can stand."
Mark Twain

And this applies not just to Jones. If we were able to just laugh and ridicule Trump -- and Trump absolutely is laughable, and Trump absolutely is ridiculous -- it would send him into cardiac arrest inside a week.
EDIT: This is proving to be the case with religion, also.
posted by dancestoblue at 9:28 AM on August 20, 2018 [1 favorite]


The slippery slope argument against deplatforming is usually either naive or baldly, deeply, cynically self-serving. The quote from Reason (heh) in the OP is the latter:
"It's implausible to imagine a future in which liberal activists don't demand that right-of-center groups be de-platformed."
There the slippery slope argument has the side effect of aligning supposedly Reason-able conservatives with the worst of their number, because in the imagined future, should the slope be as slippery as claimed, there is no room even for, uh, reason. The argument is that drawing the line anywhere is equivalent to outlawing speech, but there's a whole lot of room between those two that gets compressed in the service of that argument. If I were a Reason reader, I'd be a little alarmed to find them aligning me with people like Jones, but then that whole sort of alignment is part of what belatedly bounced me out of the GOP a few years ago.

The naive liberal (or libertarian) version of the argument is some version of "well, this is bad, but if they ban this speech aren't they going to come for us next?" But again, deplatforming on social media isn't the same as banning speech, and liberal speech hasn't actually been banned (yet).

But speaking of being baldly, cynically self-serving, here's @jack:
Accounts like Jones' can often sensationalize issues and spread unsubstantiated rumors, so it’s critical journalists document, validate, and refute such information directly so people can form their own opinions. This is what serves the public conversation best.
On that subject, here's Jonathan Swift, writing in 1710:
Besides, as the vilest Writer has his Readers, so the greatest Liar has his Believers; and it often happens, that if a Lie be believ’d only for an Hour, it has done its Work, and there is no farther occasion for it. Falsehood flies, and the Truth comes limping after it; so that when Men come to be undeceiv’d, it is too late; the Jest is over, and the Tale has had its Effect…
(You may have heard a line about the truth putting on its shoes, but Swift's seems to have been the first construction of the idea).

Consider also the asymmetric advantage of bullshit, quoting Julian Sanchez:
The rebuttal, by contrast, may require explaining a whole series of preliminary concepts before it’s really possible to explain why the talking point is wrong. So the setup is “snappy, intuitively appealing argument without obvious problems” vs. “rebuttal I probably don’t have time to read, let alone analyze closely.”
Arguing against deplatforming is, in effect, arguing (perhaps explicitly as @jack did) that the asymmetric relationship between distortion (or malicious fiction, in Jones' case) and fact is acceptable, and even desirable, and I find that idea ridiculous. Fact checking is hard enough. Countering bullshit is harder, and it often encounters people's lack of attention (TLDR, haha) and comes too late to undo the damage that has already been done. The amount of energy required results in fatigue and burnout, and with brigades and abuse that Twitter doesn't do anything effective to stop, I think there's a decreasing pool of people who are willing to put in the effort. And per Swift, that effort might not be worth it anyway.

I don't think Twitter has the right leadership to take a stand (I find their continued lack of action to be a stand, and to be a stand on the wrong side, at that), and as I commented in the open thread I quit last week. It felt overdue.
posted by fedward at 9:31 AM on August 20, 2018 [17 favorites]


Here's the thing: The Fairness Doctrine, as you describe it, scares the shit out of me. Why? Look who's in charge, that's why.

yeah, I guess that was my implication in asking the question. "who exactly does the regulating?"

fash removal (welcome as it would be) on Twitter is committing the error of letting @jack decide what is acceptable. That has already failed. He has already failed. He has no idea what dangerous threats look like. An idiot can figure out how to game the flagging system, and they frequently do. @jack is too rich and too stupid to care.

exactly. And even just taking a random look at my own particular online network of friends, relatives, co-workers, friends-of-friends of the above, etc. (most of whom I'd describe as at least progressive-friendly) tells me that were it up to us to keep a lid on things via the Fairness Doctrine, we'd inevitably get it wrong (ie: start shutting down voices that we just don't like for whatever reason), because the problem, as I see it, isn't that there's been a change in human nature of late, but in how we humans are communicating.

Which again gets back to my initial comment:

The big social media companies have a shit ton of power right now, but seem to utterly lack a plan for seeing that it's used responsibly.

Are they just being reckless and irresponsible in adhering to the profit motive above all other directives (ie: delivering dividends to their shareholders)? Yes. Absolutely. Except it's not just that. It's not just the greed driving the thing. It's the thing itself, the nature of the media in question (the algorithms etc) and what they do when set free to roam. The old genie out of the bottle analogy is strong here. Except it's plural, I think. We set them loose a long time ago but seem unwilling (unable?) to accept that, as magical entities do when set free, they've changed actual reality ...

Which gets us back to good ole Alex Jones. I suppose my first encounter with him came via Waking Life, a movie I very much liked (and probably still do, it's been a while). Way back then (2001, before 9/11), he was just a ranting voice in the wilderness, and not even wrong really, just a little hyperbolic, and who really cared anyway? He had no power. But now it seems that he does. Is he crazier now than then? Probably. Power will do that to a man. But who gave him the power? The genies, I think.

We need to reconcile these fucking genies.
posted by philip-random at 9:46 AM on August 20, 2018


> I think that his influence is less change in society and more a combination of him moderating somewhat and the legitimization of people like Limbaugh, Hannity, and Beck...

And you know who legitimized Limbaugh in the '90s and early 2000s? David Letterman and Jay Leno, whose jokes about Monica Lewinski and Janet Reno's face and "government waste" and cannabis were basically indistinguishable from right wing AM radio.
posted by smelendez at 9:46 AM on August 20, 2018 [4 favorites]


Of course they are media companies. A platform is like AWS... we built some infrastructure and you can send messages around or whatever you want, but it's all encrypted and we really have no clue what you're up to. As soon as you start knowing or caring what goes on, or monetizing it or serving targeted ads, you're not a platform and you should be on the hook for what goes on.
posted by freecellwizard at 9:52 AM on August 20, 2018 [7 favorites]


If I were a Reason reader, I'd be a little alarmed to find them aligning me with people like Jones

honestly they've always been like that
posted by halation at 9:54 AM on August 20, 2018 [3 favorites]


Reason heavily backed Gamergate. They’ve always been Alt-Right, before it was even named.
posted by Artw at 10:15 AM on August 20, 2018 [3 favorites]


See, I thought the joke I was making was obvious. Oh well.
posted by fedward at 10:25 AM on August 20, 2018 [1 favorite]


I’m a bit torn about this topic, because while I believe at the government level having some laws like Germany’s would be good and useful, I also see the right wing in power literally going a step further and erasing histories from public school programs, or redefining history in a way that makes it seem less repugnant, so I’m not sure what safeguards need to be in place to prevent that from happening. Texas is a prime example of this, where some schools get textbooks that define the Civil War as being about states’ rights, and others get books saying it was about slavery. The Civil War was about slavery, that’s the objective reality of it, and the states’ rights argument was merely a cover for the Southern states’ claimed right to own human beings as property, because they didn’t actually think of them as human beings. Here’s a link to a story about this.
In 2010 the Texas State Board of Education adopted new, more conservative learning standards.

Among the changes — how to teach the cause of the Civil War.

One side of the debate: Republican board member Patricia Hardy said, "States' rights were the real issues behind the Civil War. Slavery was an after issue."

On the other side: Lawrence Allen, a Democrat on the board: "Slavery and states' rights."

Ultimately the state voted to soften slavery's role, among other controversial decisions, and these standards became the outline for publishers to sell books to the Texas market — the second-largest in the country.
At the end of the article you find out that they don’t mention Jim Crow at all. So now, not only are kids’ parents telling them bullshit at home, they’re going to school and having that bullshit fed to them even more.

When I think about conservatives’ claims about censorship, I always route them through my head as a projection of what they specifically want to do to liberals and other people they consider inferior. When conservatives get banned off Twitter or have their Confederate traitor monuments taken down and scream about censorship or erasing history, what they’re really saying is that this is what they want to do to us. And look, they’re already doing that. They’re able to harass and threaten liberal voices and force them off of platforms they actively use, thus erasing their speech. They’re already actively changing school textbooks to teach their tilted side of history so that their version of a reality that did not happen becomes official.

The fact that we have to depend on @jack to take a stand to ban the shitheads is deeply troubling. We shouldn’t have to sit here and beg and grovel to one asshole to kick off people from a platform that so many people use. It’s ridiculous.

All speech isn’t equal. Hate speech is not free speech. The mechanics of using language to infest the consensus reality with the malignant conservative one should be destroyed. Some of these people are so far down the drain that the real debate here is more of a metaphysical or metaphilosophical one: if you have so many people who believe in a reality that did not happen, how do you extract them from that reality? And how do we prevent their diseased reality from further infecting our own? The newly government-sanctioned dissolution of truth is making everything worse day by day. If Republicans have their way we may have a full generation of kids growing up without knowing what climate change is, for example.
posted by gucci mane at 10:46 AM on August 20, 2018 [4 favorites]


We are not going to get any relief, ultimately, until we solve the structural problems that allow these people to flourish with impunity.

I think once you reach the point of having to call the cops about death threats there have already been multiple structural failures. Law enforcement attention is a limited resource and, while it might work if all that needed to be policed was an occasional extreme case (and the cops actually did so in good faith), it can't deal with a tsunami.

I recently rewatched the "Black White Supremacist" skit from Chappelle's Show. Noteworthy that in order to attend a white-power gathering to spread and reaffirm their convictions, the characters had to go out of their way to some backwater shack. These days, they can just talk to each other on Twitter. (And the title character makes a point of saying that he's written six books, "but they only published four of 'em.")
posted by praemunire at 10:56 AM on August 20, 2018 [3 favorites]


In the ’80s and ’90s, white supremacists typically met up at rallies and gun events. Law enforcement’s attention to Randy Weaver initially came from an informant who met him at an Aryan Nations rally, for example. There’s a documentary on Netflix that mentions some of these guys and the types of events and places they hung out at. There’s footage of Aryan compounds in the woods of Idaho and such, and they also talked to each other through newsletters. They seemed like a disparate, barely organized grouping of different white supremacist ideologies, such as Christian Identity sects, Posse Comitatus, sovereign citizens, etc., but were still exceptionally violent and able to pull off terrorist attacks. I’m not an expert in this stuff of course, so correct me if I’m wrong.
posted by gucci mane at 11:09 AM on August 20, 2018


And it's only political in that the right wing has adopted social media harassment and threats as a form of political action. Largely apolitical harassment and threats should get moderated, and often are moderated without controversy.
posted by GenderNullPointerException at 11:32 AM on August 20, 2018 [4 favorites]


With Gamergate being a test run for that, one which could very easily have been acted upon, but Twitter in particular chose not to, opening the doors.
posted by Artw at 11:40 AM on August 20, 2018 [7 favorites]


The naive liberal (or libertarian) version of the argument is some version of "well, this is bad, but if they ban this speech aren't they going to come for us next?"

I just want to point out here that the actual situation on social media is "first they came for the BLM activists". None of these platforms have one whit of a problem banning political speech.
posted by tobascodagama at 11:45 AM on August 20, 2018 [10 favorites]


(Also, Swift was to some extent paraphrasing Virgil, though placing Truth in direct opposition to Falsehood seems to be a legitimate innovation on his part. In Virgil, Rumor's rapid growth is contrasted with Mercury's journey, which does indeed begin with him putting his shoes on, but they're not working in opposition to one another. Off-topic, but I thought some folks might find it interesting.)
posted by tobascodagama at 11:59 AM on August 20, 2018 [5 favorites]


I just want to point out here that the actual situation on social media is "first they came for the BLM activists". None of these platforms have one whit of a problem banning political speech.

Yeah, I'll be convinced that the "reluctant" Alex Jones defenders are actually just passionate about free speech when they spare a single solitary mention for anything that's been done to BLM, or trans people, or protesters in the literal public square (J20 defendants, come on down!), etc. Until then it's just a reprise of the Bell Curve devotees who aren't really racist, they're just confronting an uncomfortable truth because of their dedication to science!
posted by Holy Zarquon's Singing Fish at 2:45 PM on August 20, 2018 [8 favorites]


"Yeah, I'll be convinced that the "reluctant" Alex Jones defenders are actually just passionate about free speech when they spare a single solitary mention for anything that's been done to BLM, or trans people, or protesters in the literal public square (J20 defendants, come on down!), etc. Until then it's just a reprise of the Bell Curve devotees who aren't really racist, they're just confronting an uncomfortable truth because of their dedication to science!"

I'm so reluctant I haven't even defended him. Fuck Alex Jones. I am wary of the 'public sphere' on the internet being controlled by a tiny handful of companies, all of which have involvement with the government (or more precisely, every government they operate in). When the government censors (in the US), one can go to the courts to try to overturn the censorship; there isn't any recourse when these private companies remove a person from their service.

These companies remove folks from their platform all the time. Sometimes it's for clear rules violations that make sense to anybody. Other times it's for opaque reasons that aren't articulated by the companies. All of their terms of service are written so they have the discretion to kick anyone off their platforms anytime they want. Often the folks kicked off these platforms have politics that align with mine (very left).

It seems to me that this is all a symptom of the centralization of the internet; as long as the vast majority of people get their content filtered by a small number of tech companies, those companies will hold unreasonable levels of power. That power will be used in ways that we (regardless of political perspective) don't agree with, and there is little recourse when this happens.

Certainly most of the right's outrage on this topic doesn't come from a place of genuine concern for the freedom of speech; they will call out any platform that doesn't hug Nazis for 'silencing the right', but then try to silence professors whose courses include facts that are uncomfortable to them.
posted by el io at 3:10 PM on August 20, 2018


I am wary of the 'public sphere' on the internet being controlled by a tiny handful of companies, all of which have involvement with the government (or more precisely, every government they operate in). When the government censors (in the US), one can go to the courts to try to overturn the censorship; there isn't any recourse when these private companies remove a person from their service.

Completely agree with this. Unfortunately, in the moment we have to deal with what we have. Perhaps this will provide some momentum for change.
posted by praemunire at 3:48 PM on August 20, 2018 [4 favorites]


just taking a random look at my own particular online network of friends, relatives, co-workers, friends-of-friends of the above etc ... tells me that were it up to us to keep a lid on things via the Fairness Doctrine, we'd inevitably get it wrong

We would get it wrong, but as wrong as now? News that is factually correct, balanced, and honest (with as little agenda as possible) is the bare minimum requirement for society sharing the same reality.
posted by xammerboy at 7:11 PM on August 20, 2018 [1 favorite]


So, open acknowledgement; I work for a social media company.

That said:

"8 in 10 Americans agree social media companies should be subject to same rules & regs as newspapers and TV which are responsible for the content they publish."

Taking that away from the larger players: should Metafilter be held to the same rules and regs as newspapers and TV? If someone posts a question to Ask MeFi, should 100% of questions go through a live human moderator before they're allowed to be published?

That might be a boost for content quality... but would destroy the internet.

Consider that YouTube gets something like 500 hours of content uploaded every minute. You'd need almost a hundred thousand censors working 8-hour days without breaks just to have every hour of it watched by a single person.
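(To make the back-of-the-envelope arithmetic explicit, here is a minimal sketch in Python; the 500-hours-per-minute figure and the 8-hour reviewing shift are just the assumptions stated above, not official numbers.)

    # Rough check of the "almost a hundred thousand censors" claim.
    # Assumptions taken from the comment above, not from any official source:
    #   - 500 hours of video uploaded per minute
    #   - each reviewer watches content for 8 hours per working day
    upload_hours_per_day = 500 * 60 * 24       # 720,000 hours of new video per day
    reviewer_hours_per_day = 8
    reviewers_needed = upload_hours_per_day / reviewer_hours_per_day
    print(reviewers_needed)                    # 90000.0, i.e. roughly a hundred thousand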

The world is big. The rules for newspapers, which have dozens of writers, can't possibly work exactly the same way for the large websites, which have *billions* of writers.

So what's the middle ground?
posted by talldean at 10:00 PM on August 20, 2018 [1 favorite]


I suppose we're looking at a complaint-driven future.
that sounds like fun.
posted by philip-random at 10:22 PM on August 20, 2018 [2 favorites]


So what's the middle ground?

The middle ground is that perhaps we need to rethink the blanket indemnification that we've given online services. The reason people say that social media needs to be held to the same standards as other media, even if that doesn't make sense, is that right now we don't hold them to any standard at all. That's something we need to at least have a discussion about.
posted by NoxAeternum at 3:48 AM on August 21, 2018 [7 favorites]


> We are not going to get any relief, ultimately, until we solve the structural problems that allow these people to flourish with impunity... That is, at its heart, a breakdown of the rule of law. And if they're correct - if the law can't and won't - then who is it for and what is it doing?

Six Theses about Contemporary Populism
posted by kliuless at 5:01 AM on August 21, 2018


Consider that YouTube gets something like 500 hours of content uploaded every minute. You'd need almost a hundred thousand censors working 8-hour days without breaks just to have every hour of it watched by a single person.

back in the day when I was helping a tech company with its documentation and some sort of social media was one of the projects, I recall a discussion where "policing" came up. The feeling was that not everything that got posted needed to be reviewed, but stuff that was getting action would, and this would happen in two basic ways (a rough sketch follows below):

- complaints that would trigger an effective review process
- anything reaching a certain threshold of activity would trigger an alert

So yeah, ten or fifty people might see that neo-Nazi porn without the company being aware it existed, but not a hundred, or certainly not a thousand.
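(A minimal sketch in Python of that kind of complaint-and-threshold triage; the threshold values, field names, and function here are illustrative assumptions, not any real platform's system.)

    # Hypothetical triage: nothing is reviewed up front, but a complaint or a
    # rising view count pushes an item into a queue for human review.
    COMPLAINT_THRESHOLD = 1       # any complaint triggers a review (assumed)
    VIEW_ALERT_THRESHOLD = 1000   # "certainly not a thousand" cutoff (assumed)

    review_queue = []

    def record_activity(item, new_views=0, new_complaints=0):
        item["views"] = item.get("views", 0) + new_views
        item["complaints"] = item.get("complaints", 0) + new_complaints
        if not item.get("flagged") and (
            item["complaints"] >= COMPLAINT_THRESHOLD
            or item["views"] >= VIEW_ALERT_THRESHOLD
        ):
            item["flagged"] = True
            review_queue.append(item)

    # fifty views goes unnoticed; a single complaint queues the post for a human
    post = {"id": "abc123"}
    record_activity(post, new_views=50)
    record_activity(post, new_complaints=1)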

I wonder what percentage of all that YouTube content gets in excess of one hundred (or perhaps a thousand) views.
posted by philip-random at 8:50 AM on August 21, 2018 [4 favorites]


Consider that YouTube gets something like 500 hours of content uploaded every minute. You'd need almost a hundred thousand censors working 8-hour days without breaks just to have every hour of it watched by a single person.

You know, I used to watch pirated TV shows. They were often hosted by places like YouTube and Vimeo and other video hosting services. But around five to ten years ago, the networks started suing the video hosting places for copyright infringement, and suddenly it became massively, massively harder to find that sort of thing - certainly not with an easy search on the open web.

Any time people claim that there's no possible way YouTube or similar sites could police their content, I think of the content they police all the time - copyright infringement stuff - and how responsive they are to that. It makes me feel like it absolutely is not a problem of scope, but a problem of interest.
posted by corb at 8:54 AM on August 21, 2018 [19 favorites]


See also: Nazi bans in areas where banning Nazis is a matter of legal liability, which have as far as I am aware not destroyed the concept of social media discourse in Germany.
posted by Holy Zarquon's Singing Fish at 8:59 AM on August 21, 2018 [8 favorites]


Then there’s the dumb shit about being left leaning, which... where? When???

Put your right shoulder and your right foot against a wall at the same time and lift your left foot. What happens? (Try it if you don't know)

Jack's just saying that going further right isn't possible, or that it would involve illegal acts.
posted by rhizome at 10:26 AM on August 21, 2018 [2 favorites]


Consider that YouTube gets something like 500 hours of content uploaded every minute. You'd need almost a hundred thousand censors working 8-hour days without breaks just to have every hour of it watched by a single person.

The ability of people to upload 500 hours of content a minute was a conscious choice by YouTube/Google, who at the same time failed to account for the negative consequences of providing such a massive platform with no ramp-up and little thought of moderation, until the externalities were exploding in their faces.
posted by krinklyfig at 6:55 PM on August 21, 2018 [3 favorites]


Social Media Companies Aren't Liberal or Conservative - "They're capitalist. And their real biases are against labor costs and controversy."

Zeynep Tufekci: How social media took us from Tahrir Square to Donald Trump - "To understand how digital technologies went from instruments for spreading democracy to weapons for attacking it, you have to look beyond the technologies themselves."

Facebook is rating the trustworthiness of its users on a scale from zero to 1 - "The previously unreported ratings system, which Facebook has developed over the last year, shows that the fight against the gaming of tech systems has evolved to include measuring the credibility of users to help identify malicious actors."

A message to my doomed colleagues in the American media - "Congratulations, US media! You've just covered your first press conference of an authoritarian leader with a massive ego and a deep disdain for your trade and everything you hold dear. We in Russia have been doing it for 12 years now—with a short hiatus when our leader wasn't technically our leader—so quite a few things during Donald Trump's press conference rang a bell... [Falsehood comes faster than you can report it, let alone debunk it.]"
posted by kliuless at 1:30 AM on August 22, 2018 [4 favorites]




The more you ignore Alex Jones the funnier he'll get. This is the paradox of America in 2018.

He is Schrödinger's comedy. He's only funny when someone is trying to look at him, but nobody has any business looking at him. So he should just wind up screaming into the darkness, and that's the funniest thing that could happen. Just being a terrified, panicked little man who we all know is saying something without really caring about the weird racist shit we know he's saying.
posted by East14thTaco at 9:20 AM on August 22, 2018 [1 favorite]


So, this ReCode interview with Senator Ron Wyden about Jones and social media is funny, because Wyden is part of the reason we're in this mess, as illustrated by this statement:
And I said, “I don’t know everything about it,” but I said, “Nobody’s going to invest in social media.” So we came up with an approach, Section 230, that was about creating a shield, so as to not have these early entrepreneurs, you know, clobbered by frivolous stuff. But also a sword to deal with irresponsible conduct. And we can stop there because part of the reason the companies are in so much trouble is that they haven’t been using the sword.
Congratulations, Ron - you've just demonstrated why blanket indemnity is a shitty idea! They didn't use the "sword" because you made it so they didn't have to.
posted by NoxAeternum at 9:31 AM on August 22, 2018 [2 favorites]


90% of people should be banned from using metaphors, I swear to God.
posted by tobascodagama at 9:59 AM on August 22, 2018 [4 favorites]


The more that I think about that interview, the less I find it funny, and the more I find it enraging. For over two decades, we have had it pushed on us that the internet needed to have blanket indemnity, because otherwise it would "kill innovation": the legal risk would be so great that nobody would invest in it. (Looking at what's happening with the recreational marijuana industries in CO, WA, OR, and CA, this is clearly false - you have an industry that is still illegal at the federal level, and yet people are wanting in because they can see the potential.) And even as we've seen all the harms caused because the law indemnified clear bad actors from any sort of liability that might have served to stop them, this message of "blanket indemnification is necessary for the internet to work" kept getting pushed.

And now, we've finally reached a point where it's clear that no, blanket indemnification of the internet is a bad idea that enables bad actors, just like it's been in every other field where we've tried it. And yet Wyden argues that the answer is to get the companies to listen to their better angels and do the moderation that they've never had any desire to do, because people like Wyden made it so they didn't have to.

I don't want any more lectures from Wyden. I want a fucking apology. I want him to acknowledge that he is, in fact, a part of why we're in this fucking mess with his pushing for blanket indemnification, and that he will work to fix things by revising the law to pull back the blanket and actually make these companies accountable for the harms they enable.
posted by NoxAeternum at 11:38 PM on August 22, 2018 [2 favorites]


And in further evidence disproving the "if you strike Alex Jones down, he will become more powerful than you can imagine" theory, InfoWars is struggling to recapture viewers after their YouTube ban:
Two weeks later, though, the Infowars app is set to slip out of the top 30 news apps, and Infowars is nowhere near replacing its lost YouTube viewership.

Infowars currently hosts its videos on Real.Video, a niche video hosting site that promises that content on the platform is “protected under free speech” and prominently features other channels promoting militias or dubious nutrition ideas. Infowars videos on Real.Video regularly receive only a few hundred or thousand views.

By comparison, Infowars posts on YouTube regularly received at least five figures in terms of viewership. Infowars videos on YouTube earned more than 500,000 views a day on average, according to social-media analytics site SocialBlade, while Infowars’ main YouTube channel received more than 17 million views in the 30 days before its ban.
Deplatforming hate works.
posted by NoxAeternum at 7:31 AM on August 24, 2018 [11 favorites]


Public Attitudes Toward Technology Companies - "Tech companies are the Mainstream Media now, with all the power and responsibility that goes along with that."

also btw on tech's 'political monoculture':
I think most of conservative anxiety over being excluded from respectable debate is due to the fact that the three big pillars of Reagan-era conservatism - Christian conservatism, laissez-faire economics, and muscular interventionism - all failed simultaneously in the 2000s.

The Iraq War debacle, the Great Recession, and the gay marriage defeat, all within the space of a decade, left conservatives with no leg to stand on except good ol' racism.

And racism has, since the 70s, been excluded from the bounds of respectable elite discourse in America.

Conservatives working at Facebook will naturally feel uncomfortable saying America should deregulate the banks, or legislate a traditional definition of marriage, or invade the Middle East - not because these ideas are suppressed, but because in 2018 they just sound goofy.

The real driver of the right, the source of all the dynamism and passion on both the national/political and the intellectual level, is now racial exclusion - the idea that nonwhite immigrants won't adopt American values, that diversity is a threat to economic efficiency, etc.

A conservative working at Facebook will tend to spend a lot of time reading about and discussing racially exclusionary ideas - on the internet, with friends, etc.

But he can't discuss those ideas at work, since explicit racism is still (mostly) out of bounds.

So basically, conservatives at tech companies and other elite intellectual spaces feel under siege. They're reading and talking about all these racist ideas outside of work, but they can't bring it to work. Taboos they wouldn't have minded 10 years ago now bind them constantly.

Of course, your response to this may be "Who cares, racism is bad, deal with it."

And that response would be 100% correct.

I am not super sympathetic to conservatives at this point. Just trying to explain where I think they're coming from, and why.
posted by kliuless at 3:45 AM on August 30, 2018 [5 favorites]


I'm sure everyone will be completely shocked to learn that Jack Dorsey personally overruled Twitter staff to keep Alex Jones on the platform (Gizmodo quoting the WSJ):
Last month, after Twitter’s controversial decision to allow far-right conspiracy theorist Alex Jones to remain on its platform, Mr. Dorsey told one person that he had overruled a decision by his staff to kick Mr. Jones off, according to a person familiar with the discussion. Twitter disputes that account and says Mr. Dorsey wasn’t involved in those discussions.

Twitter’s initial inaction on Mr. Jones, after several other major tech companies banned or limited his content, drew fierce backlash from the public and Twitter’s own employees, some of whom tweeted in protest.

A similar chain of events unfolded in November 2016, when the firm’s trust and safety team kicked alt-right provocateur Richard Spencer off the platform, saying he was operating too many accounts. Mr. Dorsey, who wasn’t involved in the initial discussions, told his team that Mr. Spencer should be allowed to keep one account and stay on the site, according to a person directly involved in the discussions.
Twitter, of course, has denied this in a statement.
posted by fedward at 6:53 AM on September 4, 2018 [1 favorite]


Maybe worthy of an FPP, but I don't want to make one about it, and this thread seems related enough: How autocratic governments use Facebook against their own citizens
posted by tobascodagama at 9:10 AM on September 5, 2018


Gizmodo has a good wrap-up of Jack Dorsey's appearance before the House Energy & Commerce Committee:
Not once did Dorsey, who adopted a notably humble tone throughout his appearances before both the Energy & Commerce Committee and the Senate Intelligence Committee earlier on Wednesday, try to deflect or diminish the blame. Instead, he repeatedly accepted it in full, calmly acknowledging where and when Twitter had fumbled. In contrast to Facebook CEO Mark Zuckerberg, who unmistakably failed to impress Washington with his we-must-do-better approach this year, Dorsey’s admission that Twitter’s problems were immeasurably large and difficult to conquer left sparse real estate for lawmakers to pile on.

Whereas Zuckerberg almost always seemed to be drawing from prepared responses drilled into his head by a room full of high-paid crisis professionals, Dorsey was clearly in command of his own assessments, casually delivering answers to pointed questions, as if he’d come prepared to educate Congress rather than defend himself against it.
And in other news, Pew has some new research about Facebook:
Just over half of Facebook users ages 18 and older (54%) say they have adjusted their privacy settings in the past 12 months, according to a new Pew Research Center survey. Around four-in-ten (42%) say they have taken a break from checking the platform for a period of several weeks or more, while around a quarter (26%) say they have deleted the Facebook app from their cellphone. All told, some 74% of Facebook users say they have taken at least one of these three actions in the past year.
[…]
There are, however, age differences in the share of Facebook users who have recently taken some of these actions. Most notably, 44% of younger users (those ages 18 to 29) say they have deleted the Facebook app from their phone in the past year, nearly four times the share of users ages 65 and older (12%) who have done so. Similarly, older users are much less likely to say they have adjusted their Facebook privacy settings in the past 12 months: Only a third of Facebook users 65 and older have done this, compared with 64% of younger users.
When I deleted the Facebook app from my phone years ago (thus years before the nightmare of the 2016 campaign) I mostly did it because it used a lot of battery power trying to load updates in the background. Little did I know then that I was starting a Facebook diet that eventually would result in me completely deleting my account. I thought I was just preserving battery power on my phone.
posted by fedward at 6:46 AM on September 6, 2018 [2 favorites]




And all of the other Subreddits associated with it, apparently. Bewildered and enraged refugees wander the barren plains of the internets, and Reddit's redoubt of Q mockery, Qult_Headquarters, is a rich broth of schadenfreude and laughter.
posted by jrochest at 7:06 PM on September 13, 2018 [1 favorite]


Banning the related subreddits that users are flocking to is definitely a new move for Reddit, and definitely one in the right direction.
posted by tobascodagama at 9:52 AM on September 14, 2018 [1 favorite]


There is an underlying element of necessity in the DMCA's safe harbor provisions. Without it, Metafilter could be held liable for copyright infringement or criminal terroristic threatening due to a user's post, even if it was moderated within a reasonably short period.

There is certainly an argument to be made that the safe harbor clause is broader than it needs to be, but in its complete absence, sites like Metafilter simply couldn't be hosted in the US or run by US nationals without opening themselves up to essentially unlimited liability, short of draconian and preemptive moderation. It exists because of previous case law that threatened to foreclose the possibility of entire classes of communication existing openly, not because of some grand conspiracy to enable people to be shitty to each other without consequences.

Let's make changes, but let's not lose sight of why the thing we want to change exists in the first place.

The marijuana situation isn't a good analog at the moment, although it might be if Sessions eventually gets his way. People are willing to take the risk because the feds have so far been loath to enforce the law against those following state law, with few enough exceptions (and zero so far that target third-party investors despite their legal liability) that people are successfully making use of the "I won't be the unlucky one" rationalization that we humans are so good at employing.

On the contrary, in the years leading up to the passage of safe harbor there was a long record of action being taken against BBS operators and others for user posted content even when active steps were being taken to dissuade the activities that got them in trouble.
posted by wierdo at 9:52 PM on September 16, 2018




This thread has been archived and is closed to new comments