Internet-enabled authoritarianism
December 20, 2018 7:07 PM

Machine Politics: The rise of the internet and a new age of authoritarianism. "By justifying the belief that for-profit systems are the best way to improve public life, it has helped turn the expression of individual experience into raw material that can be mined, processed, and sold."
One of the deepest ironies of our current situation is that the modes of communication that enable today’s authoritarians were first dreamed up to defeat them. The same technologies that were meant to level the political playing field have brought troll farms and Russian bots to corrupt our elections. The same platforms of self-expression that we thought would let us empathize with one another and build a more harmonious society have been co-opted by figures such as Milo Yiannopoulos and, for that matter, Donald Trump, to turn white supremacy into a topic of dinner-table conversation. And the same networked methods of organizing that so many thought would bring down malevolent states have not only failed to do so—think of the Arab Spring—but have instead empowered autocrats to more closely monitor protest and dissent.

If we’re going to resist the rise of despotism, we need to understand how this happened and why we didn’t see it coming. We especially need to grapple with the fact that today’s right wing has taken advantage of a decades-long liberal effort to decentralize our media. That effort began at the start of the Second World War, came down to us through the counterculture of the 1960s, and flourishes today in the high-tech hothouse of Silicon Valley. It is animated by a deep faith that when engineering replaces politics, the alienation of mass society and the threat of totalitarianism will melt away. As Trump fumes on Twitter, and Facebook posts are linked to genocide in Myanmar, we are beginning to see just how misplaced that faith has been. Even as they grant us the power to communicate with others around the globe, our social-media networks have spawned a new form of authoritarianism.
posted by homunculus (15 comments total) 34 users marked this as a favorite
 
The author, Stanford Professor Fred Turner, is 'proudly awaiting trolls' in response to his piece. So far it doesn't look like any have accepted his challenge, at least not on Twitter.
posted by homunculus at 7:15 PM on December 20, 2018


Relevant post: (dystopic) utopianism...
posted by homunculus at 7:17 PM on December 20, 2018


Why troll yet another piece in the growing oeuvre of blaming hate on those who were fighting it? The problem is as it has always been - we treat hate as acceptable discourse. It wasn't the technology that caused this - it was the people who run the technology allowing hate into discourse in the name of "free speech".
posted by NoxAeternum at 7:33 PM on December 20, 2018 [20 favorites]


Even further than that, I think that a major PR advantage in the hands of authoritarians is democracy repeatedly making a bad name for itself, usually in a manner integrally related to racism: the American slave empire, Weimar Germany's democratic elevation of the Nazis, U.S. use of nuclear weapons on non-white civilians, and now the American system installing Donald Trump as its leader (on top of everything else, accompanied by an overt threat to carry out the second wartime use of nuclear weapons in human history, again by the U.S., again against non-white civilians.)

Now that many authoritarian societies have achieved some proficiency in attaining and maintaining the economic prosperity previously associated with industrialized democracies, I can imagine democracy not necessarily seeming like such a great alternative when viewed from the outside.

Tangentially related: Singapore's ChannelNewsAsia has an interesting series running right now called Deciphering Indonesia in which a recurring theme is how the ostensibly secular, extremely multi-ethnic, majority Muslim country has been trending towards social and religious conservatism.
posted by XMLicious at 8:07 PM on December 20, 2018


It wasn't the technology that caused this - it was the people who run the technology allowing hate into discourse in the name of "free speech".

The article, pretty specifically, makes the case that putting the blame at the feet of technology was the problem - the prescription to emphasise individuality and alternative news sources as the way to prevent a fascist uprising came about because American scholars laid the blame for fascism on mass media, not on the corruption of the Weimar Republic.

And the article goes on to say that white supremacists and fascists have worked out how to exploit this, by couching their hate as simply speaking their truth. The article even cites a video where a woman "comes out" as a conservative! They do it because it works, because it's tremendously effective optics. "Hate" sounds like an angry protest more than it does a man in a suit calmly explaining how Jewish people are trying to drive whites extinct. The latter sounds like he's just speaking his truth, and the prescribed answer to speech you don't like is supposed to be more speech. It elevates individual thought over liberal inclusion, because of a decades-long programme to convince Americans that individual thought would be enough to counter fascism.

It is a good article, and you should read it.

(Liberal inclusion, uh, isn't it either, chief, in the parlance of the times. As Brexit has shown, when it's impractical for fascists to organise, there are still lots of ways they can splinter the current order. There was an argument in the Guardian recently that big spikes in the perception of corruption and drops in trust in institutions might be one of the warning signs, and while it was annoyingly focused on just the British experience, South Korea and Hungary both had big spikes in their perception of corruption just before their authoritarian takeovers. I think there's something to it.)
posted by Merus at 3:52 AM on December 21, 2018 [12 favorites]


A very useful companion article, illustrating some more of Prof. Fred Turner’s research, is E. Zuckerman (Medium, 04.02.16): Fred Turner: The link from anti-fascist art to the “historical problem” of Facebook (also published here).
posted by progosk at 6:36 AM on December 21, 2018 [1 favorite]


They do it because it works, because it's tremendously effective optics.

It's "tremendously effective optics" because we've created a culture in which espousing hatred will be defended by a chorus of useful idiots singing hosannas to "free speech". We have literally enshrined the defense of a Nazi's "right" to engage in what is fundamentally an act of terrorism as a symbol of how "free" we are as a society.

"Hate" sounds like an angry protest more than it does a man in a suit calmly explaining how Jewish people are trying to drive whites extinct.

This sort of sentiment is the core of the problem, not individual thought. Too often, we grant an argument validity not on its content but on how it is presented, allowing the most vile arguments to be given credence because they were said in an "appropriate" manner by someone with the proper "authority". William Buckley was routinely placed in "opposition" to the John Birch Society even though there was little space between their views and his, in large part because he was able to portray himself as an intellectual - and, in being one, was able to legitimize those views.

And yes, I did read the piece, and it came across as almost understanding the problem, but ultimately missing it because the author has a theory he wants to hold onto despite it not really working. The portion of the left that sees government as totalitarian has always been small (if one with outsize influence) - for the most part, the goal of much of the left was to reform governmental institutions so they would serve everyone. Furthermore, disillusionment with government structures on the right tends to track with those reforms beginning to be implemented, as the right sees its position of privilege begin to erode. The most telling part of the article is where Turner talks about the communes reflecting cultural mores - he almost gets that the problems are cultural, but then turns around and blames the problem on a lack of structure rather than the communes not resisting the larger cultural forces they existed in. There was also his pointing to open source as a potential solution while ignoring the massive problems open source has with misogyny and exclusion.

Ultimately, the problem is what it has always been - we are unwilling to face our own bigotries head on, and call out hate for what it is.
posted by NoxAeternum at 8:25 AM on December 21, 2018 [1 favorite]


It wasn't the technology that caused this - it was the people who run the technology allowing hate into discourse in the name of "free speech".

The internet doesn't usher in a dystopian hellscape, people usher in a dystopian hellscape.
posted by grumpybear69 at 8:55 AM on December 21, 2018 [1 favorite]


Oh my god the woman "switching sides" to conservative on video isn't the woman I was thinking of. And there was another woman who went from fake lefty to fake centrist (she's dating a white supremacist now, so yeah). Those are just the ones that come right to mind.

Just wanted to say this is a grift trope at this point, and I have not yet rtfa.
posted by Yowser at 9:25 AM on December 21, 2018 [2 favorites]


he almost gets that the problems are cultural, but then turns around and blames the problem on a lack of structure rather than the communes not resisting the larger cultural forces they existed in

I think both of you are right and wrong - the communes couldn't resist the larger cultural forces they existed in because of that lack of structure, but that lack of structure was a deliberate cultural choice made by the communes. In trying to build less oppressive communities, they built communities with no immune system, and when they got sick, they died.

A lot of the internet communities these hippies made were preyed upon in the exact same manner - Communitree, in the 70s, was a failed precursor to the WELL, where the community was expected to police itself and thus had no defences against disruptive or puerile members. We saw the same thing, later, at Occupy Wall Street, where the larger culture fought very hard against the movement, but what ultimately killed it were loudmouths hijacking the general assemblies and making them unpleasant or uncomfortable for everyone else. I suspect that a lot of the "capitalist" subversion of communist revolutions in the 20th century was also, in truth, a matter of countries with no immune system dying from the first attack.

Ultimately, the problem is what it has always been - we are unwilling to face our own bigotries head on, and call out hate for what it is.

The problem with calling out is that it looks more like hate than the thing you're calling out, which is why it's so insidious. Maybe it's my bias towards too much context, but I think the approach ContraPoints or Innuendo Studios takes is much more effective: dig into the reasoning behind it, and expose it, so that no matter which way they twist they can't escape the critique. Also very easy to pair with ridiculous skits so you sound like you know how to party.
posted by Merus at 5:07 PM on December 21, 2018 [2 favorites]


In trying to build less oppressive communities, they built communities with no immune system, and when they got sick, they died.

The thing is that while they may have talked about building less oppressive communities, there was a contingent that was less troubled by the structure of society than by their place in it (a criticism that has also been leveled at Silicon Valley). The lack of an "immune system" wasn't accidental - it was there to allow power to be claimed.

The problem with calling out is that it looks more like hate than the thing you're calling out, which is why it's so insidious.

Which, as I stated above, is a root problem (and not just here - one of the things that came out with the spotlight on Larry Nassar was how much he relied on expert power and our societal unwillingness to believe women to cover up his abuse). We as a society are too quick to use civility as a mark of legitimacy, which is what monsters like Spencer and Miller rely on. Yes, ContraPoints may have found a strategy that works in this environment - but we should not ignore that the strategy exists because we treat civility the way we do.
posted by NoxAeternum at 11:38 AM on December 22, 2018


@kimmaicutler: “The Facebook product team was going to introduce features that would reduce polarization and disinformation but then Joel Kaplan from the Brett Kavanaugh hearing barged in to complain about 'conservative bias.'”
posted by homunculus at 11:15 AM on December 23, 2018


On Facebook's "moderation" practices, M. Fisher (NYT): Inside Facebook’s Secret Rulebook for Global Political Speech:
In the absence of governments or international bodies that can set standards, Facebook is experimenting on its own.
The company never set out to play this role, but in an effort to control problems of its own creation, it has quietly become, with a speed that makes even employees uncomfortable, what is arguably one of the world’s most powerful political regulators. “A lot of this would be a lot easier if there were authoritative third parties that had the answer,” said Brian Fishman, a counterterrorism expert who works with Facebook. “Sometimes these things explode really fast,” Mr. Fishman said, “and we have to figure out what our reaction’s going to be, and we don’t have time for the U.N.”
[…]
Countries where Facebook faces government pressure seem to be better covered than those where it does not. Facebook blocks dozens of far-right groups in Germany, where the authorities scrutinize the social network, but only one in neighboring Austria. The list includes a growing number of groups with one foot in the political mainstream, like the far-right Golden Dawn, which holds seats in the Greek and European Union parliaments. For a tech company to draw these lines is “extremely problematic,” said Jonas Kaiser, a Harvard University expert on online extremism. “It puts social networks in the position to make judgment calls that are traditionally the job of the courts.” [...]
Facebook says moderators are given ample time to review posts and don’t have quotas. Moderators say they face pressure to review about a thousand pieces of content per day. They have eight to 10 seconds for each post, longer for videos. The moderators describe feeling in over their heads. For some, pay is tied to speed and accuracy. Many last only a few exhausting months. Front-line moderators have few mechanisms for alerting Facebook to new threats or holes in the rules — and little incentive to try, one said. […]
But at company headquarters, the most fundamental questions of all remain unanswered: What sorts of content lead directly to violence? When does the platform exacerbate social tensions? Rosa Birch, who leads an internal crisis team, said she and her colleagues had been posing these questions for years. They are making progress, she said, but will probably never have definitive answers. But without a full understanding of the platform’s impact, most policies are just ad hoc responses to problems as they emerge. Employees make a tweak, wait to see what happens, then tweak again — as if repairing an airplane midflight. In the meantime, the company continues to expand its reach to more users in more countries.
posted by progosk at 2:28 AM on December 29, 2018 [1 favorite]




This thread has been archived and is closed to new comments