How Hate Groups Forced Online Platforms to Reveal Their True Nature
August 21, 2017 7:14 AM

What gave these trolls power on platforms wasn’t just their willingness to act in bad faith and to break the rules and norms of their environment. It was their understanding that the rules and norms of platforms were self-serving and cynical in the first place. (SLNYT)
posted by Panthalassa (36 comments total) 26 users marked this as a favorite
 
sadly, twitter revealed itself to be a pro white patriarchy platform a long time ago
posted by entropicamericana at 7:48 AM on August 21 [8 favorites]




Ouroboros time.
posted by bobloblaw at 8:16 AM on August 21


The two problems that I see over and over again are a) a lot of platforms are run by cyberlibertarian types who think banning Nazis is worse than hosting Nazis and b) most customer-facing Silicon Valley companies (and social media firms are almost universally part of that culture) are based on outsourcing the work and expense to third parties. They don't want to moderate their content because that costs money and time they'd rather not spend.

Reddit's probably the worst (aside from the out-and-out bad actors like Gab or 4chan): their whole moderation model is based on volunteers who are forbidden to receive any compensation whatsoever, and their founders and executives were good buddies with folks like ViolentAcrez and actively encouraged the posting of horrible garbage. That combination guarantees virtually free rein for fascist fuckery.
posted by Pope Guilty at 8:33 AM on August 21 [24 favorites]


The "legal talisman" idea suggests the obverse of sovereign citizens.
posted by PMdixon at 8:34 AM on August 21 [1 favorite]


Tangentially related to the subject matter of this post, I just want to take a moment to say "Thank you!" to the mods who help make Metafilter a good and safe online space for discussion on a daily basis.
posted by Lafe at 8:51 AM on August 21 [79 favorites]


This seems to be an argument for complete freedom of expression on these platforms -- or am I reading it wrong? Is the author actually arguing for a more truthful explanation of the purpose of the platforms -- to build certain kinds of regulated communities?

To me, the more egregious exploitation of platforms is that Twitter and Facebook enabled the spread of malicious Russian propaganda on their sites. And that the Trump campaign spent millions in FB ads to spread fake news stories with the goal of discouraging left-leaning voters from going to the polls. In fact, according to this interview with Theresa Wong, who worked on Trump's digital team, Facebook actually sent people to the Trump campaign to help out.

Should platforms willingly abet government attacks on other nations? Should platforms enforce a certain kind of discourse? Should they be on the side of democracy against authoritarianism? What responsibility do these platforms have to civil society, if any? And how do they fulfill this responsibility?
posted by touchstone033 at 8:51 AM on August 21 [6 favorites]


Is the author actually arguing for a more truthful explanation of the purpose of the platforms

In part he's pointing out that their commitment to freedom of speech was always somewhat cynical because it was the cheap and convenient option for the technolibertarians who create these platforms.
posted by fatbird at 9:00 AM on August 21 [10 favorites]


In part he's pointing out that their commitment to freedom of speech was always somewhat cynical because it was the cheap and convenient option for the technolibertarians who create these platforms.

It turns out that free speech absolutism is a really good excuse to do nothing.
posted by NoxAeternum at 9:05 AM on August 21 [12 favorites]


I didn't know that AirBnb had removed people for attending the Charlottesville rally. That one actually does make me a little uncomfortable, since it's essentially a hotel service. Like I totally get why you'd want to prevent hosts from having to rent to horrible racists, but I would be deeply uncomfortable if, say, the Marriott decided to ban protesters for any other event from renting rooms.
The problem with the pseudo-democratic aspects of the internet is I think not just that they're secretly authoritarian, but also that they are very much unregulated by actual democracies, so they're essentially creating a set of private fiefdoms. Which has advantages and disadvantages.
posted by corb at 9:15 AM on August 21 [6 favorites]


I felt this article was well written and leading up to an interesting point, and then it just sort of ended without having much of one. I think there are plenty of interesting conversations to have on this subject, but I don't feel like this one really added much to the discourse.

"Their persecution narrative, which is the most useful narrative they have, and one that will help spread their cause beyond the fringes, was written for them years ago by the same companies that helped give them a voice."

I don't disagree with this, but it is hardly a triumph for these groups. An independent online community platform that allows their hate speech, where they can share this "we're being oppressed" narrative, is definitely a thing, but losing the megaphone and ostensible legitimacy that comes with being hosted by Facebook, reddit and Twitter is on a completely different scale.

How many billion active users do these sites have? They went from shouting through a megaphone in a city centre to whispering in a corner of an abandoned building.

Besides, these people are claiming oppression on any platform that will have them while they are literally embedded in the highest position of power the country has to offer.
posted by slimepuppy at 9:19 AM on August 21 [3 favorites]


The thing is that we don't have to accept false equivalencies. It's entirely okay to treat fascists differently from everybody else. It's entirely okay to treat "everybody should be treated well" differently from "everybody like me should dominate and kill everybody else".
posted by Pope Guilty at 9:21 AM on August 21 [36 favorites]


The statement from the head of CloudFlare was really disgusting, like he felt ashamed of saying "no, we're not going to provide services to white supremacists."
posted by NoxAeternum at 9:29 AM on August 21 [6 favorites]


I didn't know that AirBnb had removed people for attending the Charlottesville rally. That one actually does make me a little uncomfortable, since it's essentially a hotel service. Like I totally get why you'd want to prevent hosts from having to rent to horrible racists, but I would be deeply uncomfortable if, say, the Marriott decided to ban protesters for any other event from renting rooms.

I know where you are coming from, corb, but I am not at all uncomfortable with AirBnb removing people for attending the Charlottesville rally. I think it is acceptable and necessary to make a distinction between neo-Nazi/white supremacist rallies and other protests. No, I would not feel comfortable with hotels banning guests who were attending [most] other protests, but we can and should be comfortable with the generally accepted principle that neo-Nazi rallies are different and should be condemned. No business should be under an obligation to support their activities. I just don't believe there is a slippery slope here.
posted by hurdy gurdy girl at 9:31 AM on August 21 [18 favorites]


he's ashamed of being a modern day robber baron.
posted by Annika Cicada at 9:40 AM on August 21 [3 favorites]


AirBnb already has well-established problems with racism, precisely because of the way it tries to pretend it's not actually a hotel service. (And this applies to Uber as well, of course.) As long as a host isn't stupid enough to make any racist remarks while canceling somebody's reservation, unlike this woman, they'll probably get away with it. Even then, it was evidently the state of California, not AirBnb itself, that took action against the host.

So, the idea that banning Nazis from the service would be a slippery slope kind of ignores the fact that the system is already being abused.
posted by tobascodagama at 9:47 AM on August 21 [18 favorites]


It's perfectly reasonable to be uncomfortable that the organizations that enabled the movement are the ones trying to police it now. The CloudFlare guy is right. He's doing the right thing and he knows it, but he at least recognizes that he's not qualified to be making these types of decisions. This one was easy. Not all of them will be, though, and if I had to come up with a list of types of people who shouldn't be making public policy, Silicon Valley entrepreneurs would be pretty high up there. The fact that the CloudFlare guy has some self-awareness about it makes him more qualified than most.

Sadly, half-assed libertarian solutions are the most effective option right now, barring very specific, very carefully crafted local laws. Passionate rants and catchy tweets and slogans are great in their own contexts, but public policy has to be precise, narrow, and very carefully phrased so its interpretation doesn't depend on people's individual "common sense" bubbles.
posted by ernielundquist at 9:50 AM on August 21 [15 favorites]


The statement from the head of CloudFlare was really disgusting, like he felt ashamed of saying "no, we're not going to provide services to white supremacists."

It read to me more like he's ashamed of the fact that he, a private citizen, has the unlimited and unreviewable power to remove content from the internet, simply because he doesn't like it. I certainly think he made the right call here, but that's a very dangerous situation, and he's right to be uneasy about it.
posted by Zonker at 9:53 AM on August 21 [23 favorites]


(Or, on failure to preview, what ernielundquist said.)
posted by Zonker at 9:58 AM on August 21


simply because he doesn't like it

This here is a large part of the problem when talking about hate speech - that opposition gets framed as a matter of dislike, disgust, or unpopularity (and that last one is especially odious because it's often an out-and-out lie as well.) This serves to delegitimize opposition to hate speech, because it frames that opposition as being based on feelings and popularity, and not the fact that hate speech attacks people, makes them fear for their safety, and pushes their voices out of the forum.

So yes, we need to push back when that sort of framing is used. And when you use the proper framing, his argument makes much less sense.
posted by NoxAeternum at 10:18 AM on August 21 [29 favorites]


Cloudflare's position is special because they have an extreme content neutrality policy; they provide their services to spammers and denial of service attackers, and they've provided service to ISIS (don't know if they still do).
posted by Monday, stony Monday at 10:18 AM on August 21 [2 favorites]


I have a good friend who rents space in her home on AirBnB. She gets glowing reviews from her guests, who seem to really enjoy being welcomed into her house, sharing home-cooked meals, and meeting her friends and neighbors. My friend is also Jewish. I can't imagine the discomfort and stress for her if her guests turned out to be anti-Semitic white-supremacists who had come to town with trouble in mind.

So... While I do have some reservations about AirBnB's behavior in many circumstances and don't want to see it become generally acceptable to deny lodging to people based solely on their political opinions, the Charlottesville marchers (a) were not merely engaging in political speech, and (b) potentially represented a threat to hosts. AirBnB could have left it up to the hosts to cancel reservations, but who would want to volunteer to be singled out when a group of violent Nazis is coming to town? I am not happy about the situation in any way, but I think that in this particular case AirBnB acted in a fairly sensible manner.
posted by Nerd of the North at 10:20 AM on August 21 [4 favorites]


Cloudflare also likes to handle complaints by passing them directly, without redaction, to the customer being complained about. Including any and all personal information about the person who filed the complaint.
posted by Pope Guilty at 10:22 AM on August 21 [10 favorites]


I didn't know that AirBnb had removed people for attending the Charlottesville rally. That one actually does make me a little uncomfortable, since it's essentially a hotel service. Like I totally get why you'd want to prevent hosts from having to rent to horrible racists, but I would be deeply uncomfortable if, say, the Marriott decided to ban protesters for any other event from renting rooms.

I get that the slippery slope is where every single discussion about freedoms is destined to go, it would just be nice if we could collectively agree to add "except for nazi fascists" to all of them freedom statements. Freedom of speech except for nazi fascists. Freedom to bear arms except for nazi fascists.

It seems to me very reasonable to limit the freedoms of nazi fascists given we have historical evidence of where this is going, not to mention the words of hate that these people are spewing from their mouths today. We know this kind of hate speech goes from words to actual violence and murder as evidenced by what's already happening.

If we could just agree that fascist nazis are a special class who deserve to have their freedoms desperately held in check, then maybe a bunch of pain and suffering could be avoided and we could still have the argument about freedom a mile down the road. Humans are complex, wondrous creatures and we can handle something other than binary yes/no ethics.
posted by notorious medium at 10:23 AM on August 21 [11 favorites]


NoxAeternum, I agree with you that there's a genuine distinction between opposing hate speech and suppressing speech because you dislike it. But the CEO of CloudFlare isn't constrained by that distinction. He can (in effect) suppress any content at all, on nothing more than his own whim, and there's (essentially) nothing that anyone can do about it. He used his powers for good in this instance, but there's no guarantee they'll always be used that way.
posted by Zonker at 10:25 AM on August 21


He used his powers for good in this instance, but there's no guarantee they'll always be used that way.

This is true of many, many things in the world that can be abused. So it's not a good argument, but instead an attempt to absolve oneself of responsibility by trying to wash one's hands of it all, instead of acknowledging that the power exists, and as such must be wielded properly. "But this could be used for evil" is not an excuse for inaction in the face of evil.
posted by NoxAeternum at 10:45 AM on August 21 [14 favorites]


I get that the slippery slope is where every single discussion about freedoms is destined to go, it would just be nice if we could collectively agree to add "except for nazi fascists" to all of them freedom statements.

More to the point, just because free speech absolutists are primed to fling themselves down the slippery slope doesn't oblige us to follow. In fact, we need to point out that not only is their position fallacious, but it's enabling genuine harm.
posted by NoxAeternum at 10:48 AM on August 21 [7 favorites]


Cloudflare's position is special because they have an extreme content neutrality policy; they provide their services to spammers and denial of service attackers, and they've provided service to ISIS (don't know if they still do).

This doesn't make CloudFlare's position special, just really shitty. Supporting people who are breaking the Internet and providing material support to terrorists doesn't suddenly become acceptable because you've decided to have an "extreme content neutrality policy".
posted by NoxAeternum at 11:06 AM on August 21 [11 favorites]


I find this all so discouraging that if somebody told me there was a direct line between "ate my balls" memes and the rise of fascism in America, I would be inclined to believe them.
posted by maxsparber at 11:08 AM on August 21 [4 favorites]


Free speech is one of those concepts that gets argued about an awful lot without anyone really understanding it. Unlike pornography, we don't know it when we see it.
posted by tommasz at 11:19 AM on August 21 [2 favorites]


Facebook, reddit and Twitter is on a completely different scale.
How many billion active users do these sites have?


I listen to a lot of podcasts. And on one of 'em at one point:

20% of the "internet users" have a twitter account. 73% "of the internet" (so I mentally rounded to 75%) are on Facebook. 50% of Facebook users log in at least once a week.

reddit - the mother of a child posted her outrage about the sexual abuse of her under-five-year-old daughter by the father. 0 comments in one group, 2 in another, and 7 in the third and final group. How far a post spreads all depends on where one posts it.

So, yeah - having the FacePage and the twitt-ing about gets you onto a place where a large hunk of the Internet ends up. And at one time you could narrow your twitter advertising => phishing down to the user you wanted, once you gathered enough intel (per yet another podcast), and it would cost you a buck or two. (Versus Facebook saying to me: for $10, have your post boosted to up to 9 more people. Sometimes it was $10 to get the post in front of 4 more people.) But unless you have an audience there, it is shouting into the void, just like putting up a web page at a bare IP address.

Followers, and your message to them, matter. On Facebook, a person with close to 5,000 followers, who has their own IMDB page where said person is wonderful and lists as credits the time they were in the audience for a TV show, used some tool and got 13 people to post "reviews" of the place they had just quit while claiming they were fired, two of them being the dad and step-mom. One had an oddly specific complaint and said the fired person was great and everyone else sucked. And there was no reaction from the 5,000 followers other than 'the next job will be better' and one 'come work with me at (shipping company)'.

The next post had a heavily edited video claiming how this person was put upon by management, and was a call to action for 'fellow social justice warriors'. Over an $8 item and under $30 in cash. That got the ex-employer 1-star reviews for days, until reviews were shut off, with gloating about taking the 4.6-star rating down to 1.43. Pointing out that the "phone" they were handed was a BlackBerry with a keyboard in 2016, and that it sure looked like the 2008 one said person owned as shown in a YouTube vid, did get 20 people to delete their reviews of the 180 or so 1-star reviews left. But in the 200+ comments about how much they sucked, there was an attempt to do the same review-smash on Google+ and Yelp!. The result of the roving band being asked to go elsewhere? One negative review on each site, with only Yelp! doing something once a complaint was filed. They are mentioned one time on reddit. A takeaway here? Facebook is pretty darn "sticky," and 'social activism' doesn't extend to actually leaving Facebook.

whispering in a corner of an abandoned building.

Yes. This.

How many people willingly go to Stormfront? The last time I ended up there was on an image-match search in a murder investigation I was never paid for. Now, I would have found the image on 4chan eventually, but for the sake of the investigation I'm glad the search engines index the crap in these "abandoned buildings": I was able to cut off investigation efforts down that blind alley in 4-5 hours. It took longer to image the microSD card than to do the searching. And "we" learned what kind of people were being accused of murder.
posted by rough ashlar at 11:21 AM on August 21


Freedom to bear arms except for nazi fascists.

2-3 weeks ago, the label in the mass media for these very same people was "neo-Nazi" or, heck, "Republicans".

Did they change in less than a month, or did the, dare I say, branding change to an unacceptable brand? Because if it's a branding thing, then Edward Bernays had things right about labels and branding back when his book was called Propaganda, versus today's version being called Public Relations.
posted by rough ashlar at 11:29 AM on August 21 [1 favorite]


I have no problem stipulating that hate speech is bad and has no place in civil discourse. The ultimate problem with the Internet is that the whole concept of a "marketplace of ideas" completely falls apart. In old timey times you'd have to dig through weird magazine ad sections to figure out how to subscribe to an overpriced, mimeographed neo-Nazi newsletter or dial into a crappy white supremacist BBS with only two phone lines. Now we can program a bot to do the work of a million white supremacists on any social media platform you desire. With targeting you can figure out how Aryan you wanna be; maybe Aunt Nancy in Minnesota gets some dog whistling alt-right-lite while Cousin Billy Bob who rolls coal and has a Confederate flag as his avatar gets the full Monty. And bots don't have feelings, so you can just vomit up a bunch of racist garbage into any old twitter feed and prance away psychologically unharmed by any debate or exchange of ideas.

This is why the SCOTUS stuff I talked about in the EFF thread pisses me off. The SCOTUS judges are too out of touch to fully understand that computers are making mass-produced, automated speech on social media platforms. They're cheap, they don't get tired, and they don't get hurt. I truly don't think that this is what was intended by the Founding Fathers or any of the philosophers they relied on in forming their political thinking.
posted by xyzzy at 11:31 AM on August 21 [13 favorites]


It's really simple if you consider content as distinct from form. If the content of the speech is intolerant, it's by definition not intolerant to have a problem with that speech, and it should be obvious that protecting that speech promotes intolerance. How making those kinds of common-sense distinctions became such a hopeless muddle is what gets me. Is it Silicon Valley contempt for the value of content internalized as a social/cultural value, or what?
posted by saulgoodman at 12:01 PM on August 21 [3 favorites]


Are we really that upset that it took a massive outpouring of outrage to kick bad actors off online platforms? Maybe it's a good thing that the self-serving libertarianism espoused by these companies keeps the exiles limited to those committing 'an indefensible crime against humanity'?

Do we want the threshold to be as low as a few thousand identical angry emails from a template form on the Family Research Council site?
posted by MiltonRandKalman at 12:55 PM on August 21


Do we want the threshold to be as low as a few thousand identical angry emails from a template form on the Family Research Council site?

Yes, because it's not like we're capable of analyzing a situation and determining the difference between actual opposition and a blastfax campaign.

Don't become infatuated with bright lines because they mean less work.
posted by NoxAeternum at 1:06 PM on August 21 [5 favorites]




This thread has been archived and is closed to new comments