Antisocial Media
March 13, 2018 4:01 AM

Reddit and the struggle to detoxify the internet. How do we fix life online without limiting free speech? By Andrew Marantz for The New Yorker.
posted by valkane (101 comments total) 46 users marked this as a favorite
 
How do we fix life online

Moderation. It’s simple. Just not easy. But it is achievable when you have the resources of Reddit.

without limiting free speech?

This isn’t about free speech. Nazis can get their own fucking blogs. Reddit is not obliged to give them a platform.
posted by His thoughts were red thoughts at 4:27 AM on March 13 [155 favorites]


“I consider myself a troll at heart,” [Reddit CEO Steve Huffman] said later. “My political views might not be exactly what you’d predict,” he said. “I’m a gun owner, for example. And I don’t care all that much about politics, compared to other things.”

"Radical centrist" technolibertarian? Did I get that right? Pretty sure I did. That said,

“Does free speech mean literally anyone can say anything at any time?” [Reddit general counsel] Tidwell continued. “Or is it actually more conducive to the free exchange of ideas if we create a platform where women and people of color can say what they want without thousands of people screaming, ‘Fuck you, light yourself on fire, I know where you live’? If your entire answer to that very difficult question is ‘Free speech,’ then, I’m sorry, that tells me that you’re not really paying attention.”

And they banned /r/UncensoredNews a day or two ago, which is encouraging. I still don't agree with the idea that keeping (the obviously-in-violation) /r/The_Donald around as an outlet for bad people's worst impulses is helpful, but there actually has been slow progress of late. /r/AgainstHateSubreddits keeps track of a lot of the worst stuff (which appears everywhere), and I'm still surprised when admins act on it.
posted by uncleozzy at 4:56 AM on March 13 [32 favorites]


Just a friendly reminder that there is precisely zero difference between pretending to be a [racist/misogynist/pedophile/white supremacist/etc] and actually being a [racist/misogynist/pedophile/white supremacist/etc].

If your sense of humor includes the idea that it's fun to troll people by pretending to be one of the above, then you are actually one of the above.
posted by JohnFromGR at 5:23 AM on March 13 [122 favorites]


Yep, a website that's open to the public is not the same as a public space. I can protest in the street most of the time but I probably can't protest inside of a museum. The answer is pretty simple: ban people freely and let them and the community at large know why you're banning them. If they don't like it they can fuck off to some other website.
posted by runcibleshaw at 5:27 AM on March 13 [4 favorites]


“Does free speech mean literally anyone can say anything at any time?” [Reddit general counsel] Tidwell continued. “Or is it actually more conducive to the free exchange of ideas if we …"
I don't know about anyone else, but I'm thoroughly sick of this particular little rhetorical trick being used time and time again to frame discussion.

"Free speech" is most definitely not the same thing as "the free exchange of ideas", and conflating the two leads to the current state of things, with trolls and Nazis and other sadfucks demanding "free speech" in the name of "the free exchange of ideas" while deliberately acting to turn every arena of and opportunity for public debate into a fucking cesspit like the worst of 4chan & Reddit.
posted by Pinback at 5:30 AM on March 13 [46 favorites]


The characterisation of moderation as fascism is something I could also do without.
“You can’t be a racist pederast and doxx and threaten people here, please leave”

“OMG IT’S LIKE STALINIST NORTH KOREA UR OPRRESSING ME WHAT HAPPENED TO FREEEEEDOOOM?!?!!!1”
posted by His thoughts were red thoughts at 5:37 AM on March 13 [17 favorites]


"Free speech" is most definitely not the same thing as "the free exchange of ideas", and conflating the two leads to the current state of things, with trolls and Nazis and other sadfucks demanding "free speech" in the name of "the free exchange of ideas" while deliberately acting to turn every arena of and opportunity for public debate into a fucking cesspit like the worst of 4chan & Reddit.

Ironically, The_Donald are infamous for banning people for even a hint of criticism of Trump.
posted by jaduncan at 5:39 AM on March 13 [27 favorites]


Somehow democracy has (arguably) survived up until now without anybody having the freedom to go into a McDonald's and hang up a 'meat is murder' banner, or a calorie chart, or whatever else you might like to display in there. McDonald's doesn't have to let you do that, and there's no reason Reddit has to let you be an asshole, either. "The free exchange of ideas"? I'll bite, what ideas are Nazis supposed to have?
posted by Sing Or Swim at 5:47 AM on March 13 [7 favorites]


(Uh, to clarify my poor choice of words, I wasn't suggesting you'd be an asshole if you were to hang up a 'meat is murder' banner in a McDonald's... just illustrating the point that your God-Given Right To Free Speech doesn't follow you into most private establishments. I'll see myself out...)
posted by Sing Or Swim at 5:52 AM on March 13 [8 favorites]


“I fucked up,” Huffman wrote in an apology the following week. “More than anything, I want Reddit to heal, and I want our country to heal.” Implicit in his apology was a set of questions, perhaps the central questions facing anyone who worries about the current state of civic discourse. Is it possible to facilitate a space for open dialogue without also facilitating hoaxes, harassment, and threats of violence?

OK, so, if you want an infected wound to heal, you need to get the pus out first, before the wound seals or you will develop a much more serious condition. In a similar way, for an online community (much less a country) to “heal,” things like harassment and threats of violence need to be addressed first or there will be festering and later eruptions.
posted by GenjiandProust at 6:00 AM on March 13 [13 favorites]


Reddit is still the place where it takes minutes to ban someone for replying to fascist violence with "bash the fash" but months to remove entire subreddits devoted to beating women.
posted by zombieflanders at 6:08 AM on March 13 [80 favorites]


The question is not free speech or not free speech. The question is whether you want billionaires to set the terms of permissible discourse. Progressives who answer that latter question "yes" are, in my opinion, insanely short-sighted, generalizing to all points in the future the very narrow fact pattern that a fair number of American billionaires prefer Hillary Clinton and Chuck Schumer to Donald Trump and Paul Ryan, and don't think about how a Saudi or Indonesian billionaire might play it, or how even the American billionaires they trust would react to (say) Bernie Sanders compared to Ted Cruz.
posted by MattD at 6:10 AM on March 13 [9 favorites]


Just a friendly reminder that there is precisely zero difference between pretending to be a [racist/misogynist/pedophile/white supremacist/etc] and actually being a [racist/misogynist/pedophile/white supremacist/etc].

If your sense of humor includes the idea that it's fun to troll people by pretending to be one of the above, then you are actually one of the above.


“We are what we pretend to be, so we must be careful about what we pretend to be.”
- Kurt Vonnegut
posted by leotrotsky at 6:11 AM on March 13 [38 favorites]


Also, this article soft-pedals the hell out of criticizing spez et al. The author spent over 1000 words talking about "Place," which is pretty much gone from the collective Reddit consciousness, and less than 100 on the report that showed clear evidence that banning hate subs and users was effective.
posted by zombieflanders at 6:14 AM on March 13 [19 favorites]


Also, this article soft-pedals the hell out of criticizing spez et al. The author spent over 1000 words talking about "Place," which is pretty much gone from the collective Reddit consciousness, and less than 100 on the report that showed clear evidence that banning hate subs and users was effective.

I wonder, to what extent do PR agencies play a role in placing articles at places like the New Yorker? I know they definitely do that at less prestigious rags, but how often are flacks taking writers from the Atlantic or the New Yorker to lunch and soft pitching story ideas?
posted by leotrotsky at 6:18 AM on March 13 [1 favorite]


The question is not free speech or not free speech. The question is whether you want billionaires to set the terms of permissible discourse.

"Permissible discourse" in this case being death and rape threats, speech condoning violence towards marginalized people, calls for murdering opponents of authoritarianism, doxxing and other invasions of privacy, attacks on other internet communities, and encouraging the rise of white supremacy.

Progressives who answer that latter question "yes" are, in my opinion, insanely short-sighted, generalizing to all points in the future the very narrow fact pattern that a fair number of American billionaires prefer Hillary Clinton and Chuck Schumer to Donald Trump and Paul Ryan, and don't think about how a Saudi or Indonesian billionaire might play it, or how even the American billionaires they trust would react to (say) Bernie Sanders compared to Ted Cruz.

I mean, this was already a strawman argument, but the extremely disingenuous "progressives: evil, stupid, or both?" framing is just the dingleberry on top of this particular shit sundae.
posted by zombieflanders at 6:18 AM on March 13 [51 favorites]


By the way, it's not progressives who are the driving force behind limiting free speech: Everything we think about the political correctness debate is wrong
By rhetorically lumping in instances of rare, fairly extreme behavior with much more common behaviors under the broad heading of “political correctness,” it is easy to paint an alarming picture of the hecklers as a leading edge of an increasingly authoritarian political culture.

The fact that there does not appear to be any such trend — and that public desire to stymie free expression is concentrated in the working class and targeted primarily at Muslims — ought to prompt a reevaluation of the significance of on-campus dustups and perhaps greater attention to the specific contexts in which they arise.

Conversely, a clearer and more specific account of what's wrong with heckler’s veto tactics — rather than broad-brush efforts to castigate them as emblematic of a broad social crisis — might be more effective at actually persuading people not to engage in them.

If nothing else, it would be useful for writers to do a better job of distinguishing between how life feels when you participate in unmoderated online exchanges — where being on the wrong end of pile-ons can certainly create the subjective impression that vicious mobs are constantly trying to shut down anything they find disagreeable — from what we actually see in the data, which is a public that is increasingly supportive of free expression, with liberals and college graduates being especially supportive.
posted by zombieflanders at 6:35 AM on March 13 [40 favorites]


If nothing else, it would be useful for writers to do a better job of distinguishing between how life feels when you participate in unmoderated online exchanges — where being on the wrong end of pile-ons can certainly create the subjective impression that vicious mobs are constantly trying to shut down anything they find disagreeable — from what we actually see in the data, which is a public that is increasingly supportive of free expression, with liberals and college graduates being especially supportive.

yeah, it's now a journalistic job requirement to be on twitter for hours every day. i can't help but think that it's giving many journalists a very warped view of the world.
posted by vogon_poet at 6:39 AM on March 13 [15 favorites]


zombieflanders said what I was about to say way better than I could so I'm just going to stand here tapping my nose and pointing at that comment.
posted by EmpressCallipygos at 6:51 AM on March 13 [6 favorites]


yeah, it's now a journalistic job requirement to be on twitter for hours every day. i can't help but think that it's giving many journalists a very warped view of the world.

Many forms of internet discourse follow a Zipf's law distribution where a small minority of stakeholders create a majority of the messages. Combine that with the news media's bias for "if it bleeds it leads" and you end up with a very distorted view of what's going on.
posted by GenderNullPointerException at 6:57 AM on March 13 [19 favorites]
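[The Zipf's-law claim above, that a small minority of posters generates the bulk of the messages, can be made concrete with a toy calculation. The numbers below are purely illustrative and not drawn from any real Reddit data:]

```python
# Sketch of a Zipf (power-law) rank distribution of posting activity:
# the user at rank r posts in proportion to 1/r.  Hypothetical figures,
# chosen only to illustrate the shape of the distribution.
n_users = 10_000
weights = [1.0 / r for r in range(1, n_users + 1)]
total = sum(weights)

# Fraction of all messages produced by the most active 1% of users.
top_1_percent = sum(weights[: n_users // 100]) / total
print(f"top 1% of users produce {top_1_percent:.0%} of all messages")
```

[Even with this mild exponent, the top 1% of users account for roughly half of all messages, which is why a vocal fringe can so easily dominate the apparent tone of a platform.]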


Reddit’s Financial Ties To Jared Kushner’s Family Under Scrutiny Amid Inaction Against The_Donald Hate Speech

As the New Yorker piece points out, The_Donald is actually not particularly popular or active compared to other large sub-reddits. But when I click "Edit Subscriptions," I get a list of major sub-reddits ostensibly listed by popularity. The_Donald is listed third, putting it smack in the middle of my screen, but its subscriber count is 23rd of the 24 listed sub-reddits and 1/10th that of its close neighbors in the list. Most of the 24 results on the next page are also larger than The_Donald. /r/AdviceAnimals is over 8x the size of The_Donald, but it's listed 42nd to The_Donald's 3rd.

This unnatural prominence has been a feature of The_Donald for as long as I can remember.

I find it impossible to believe its placement there is organic. I don't find it credible that the people running Reddit really walk the walk of hands-off-freedom and neutrality they do so much public hand-wringing over.
posted by Western Infidels at 7:02 AM on March 13 [48 favorites]


I stopped visiting Reddit (and similar sites, like Digg/Fark/Hubski/&c.) after the "let's find the Boston bomber" debacle in 2013. If anything, my life has improved from that decision.

The question everyone needs to ask themselves is what unique value Reddit adds for them. For me, it was nil - the content I went there for (investigative journalism, some niche hobbies) are better catered to by dedicated websites and niche forums.
posted by svenkatesh at 7:05 AM on March 13 [2 favorites]


spez is a total naif at best.

I will say I haven't seen a T_D post float anywhere near the top of the front page in a while, which is nice. And there are plenty of subreddits that seem pretty well monitored. There's just always that toxicity lurking just below the surface.
posted by aspersioncast at 7:05 AM on March 13 [2 favorites]


Reddit: (Yes, it gets worse.)
posted by suetanvil at 7:13 AM on March 13 [11 favorites]


I'm also interested why no one is following up with (former) members of Reddit's moderation team. In a recent thread, spez/Huffman claimed that "[b]anning [The_Donald] probably won't accomplish what you want" and a former moderator responded (emphasis in original):
Hi Spez,

I was a moderator around Reddit for a number of years, and I found that the admins nearly always chose a policy of inaction on potentially controversial problems like this. It's second from the bottom on my big list of complaints about dealing with the admins. And you know what? It nearly always blows up into a big disaster that is ten times harder to control. I can name a number of examples from old Reddit history that you might remember as well. Here is my comment from when /r/FatPeopleHate was banned, and it's pretty much exactly what we're dealing with today:
The admins have made some serious missteps. First, they should have been addressing shit like this years ago when Reddit first got big enough to start brigading. They let hate subs grow and didn't even make public comments on it. I still remember that when Violentacrez got doxxed, the mods started a ban boycott of gawker sites. Yishan (CEO at the time) then came into the mod subreddit (which is private) and asked us not to do it because it made bad press for Reddit. They didn't even have the guts to make that statement publicly, much less tell off Gawker. Getting the admins to do anything even remotely controversial has been a constant problem.

They were lenient on issues of harassment and brigading because they didn't want to take a controversial stance, and now it has blown up in their faces. And what's more, the Admins themselves have encouraged the exact same behavior by urging people to contact congress on Net Neutrality and all this stuff. They let a minor cut turn into a big infection that went septic, and now they are frantically guzzling penicillin hoping that they can control the damage.

Another huge misstep was the tone and writing of the announcement. They should have very clearly defined harassment as outside contact with specific 'targets' and cooperation of the subreddit's moderators. It was phrased in such a vague way that, in tandem with this post, people were able to frame this as an attack on ideas instead of behavior. They needed to clarify that mocking someone isn't harassment; actually hunting down and contacting the person is. That's why /r/cringe, and even all the racist subs are still allowed. They're despicable, but they aren't actively going after anyone.

In my opinion, they should have presented clear evidence of such harassment from the subreddits that were banned and said "This is exactly what will get you banned in the future." /r/PCMasterRace was banned for a short time because the mods there were encouraging witch hunts of /r/gaming, and the admins provided clear proof of what had happened. The mods then cleaned up their shit, and the harassment stopped and everything went back to normal. That is how it should work: if an active mod team agrees to crack down on any instances of harassment or witch hunting, then the community can stay.
/r/The_Donald has committed blatant violations of pretty much every Reddit-wide rule. And you all refuse to act for one simple reason: you're afraid of how it looks. You're worried that the headline will be "Reddit takes political stance and bans Donald Trump supporters." Which is obviously not the case, since the ban would be for brigading, racism, sexism, etc. But you're worried that you can't control the narrative.

So please realize that this never works. What has always happened in the past is that your policy of inaction lets the problem grow and grow and grow until there is a mountain of evidence that somehow catches the eye of someone in the media, and they publish something damaging about Reddit that eventually spurs you all to do something. But by then it is too late and you've allowed that sort of content to proliferate throughout the site. And it becomes public and you're unable to control the narrative anyway, which is why Reddit was associated with pedophilia for so long after CNN interviewed the founder of /r/Jailbait. Remember that one?

I'm begging you, just once: please enforce your rules as they are written and regardless of how some people might try to interpret it. And when you do enforce those rules, provide a statement that clearly describes the violations and why that enforcement action is being taken. That is the only way you'll ever control the narrative. You can either do it now, or you can do it when it blows up in your face.
posted by zombieflanders at 7:17 AM on March 13 [50 favorites]


I actually found the "Place" bit at the end a good summary of what was going on. They released this thing, and just pray like heck that there wouldn't be swastikas all over it (because they knew that was a very strong possibility!), and hide in their hoodies when they start popping up, with some mild angst that they might get some bad press, as opposed to, y'know, having some plans for dealing with any toxic stuff that might show up.
posted by damayanti at 7:22 AM on March 13 [2 favorites]


We've been talking about this stuff for nearly 30 years, and I fundamentally dispute the notion that common-sense moderation rules on spamming, harassment, personal attacks on private participants, and low-effort shitpost responses constitute a "free speech" issue. One is entitled to express a political opinion. One isn't entitled to shout it a hundred times a day in a public space or in someone's personal mailbox.
posted by GenderNullPointerException at 8:11 AM on March 13 [19 favorites]


I'm also interested why no one is following up with (former) members of Reddit's moderation team.

Minor point, but this is not a thing. There is no moderation team. Each subreddit has its own set of moderators, and there is no explicit mechanism coordinating between them. There are or were informal organizations of moderators (e.g. /r/modtalk).
posted by Jpfed at 8:17 AM on March 13 [1 favorite]


I use Reddit for specific hobby-focused discussion which I don't get elsewhere - mainly a couple of text-only discussion subs around makeup and skincare where people are free to discuss products, techniques and consumer practices without someone popping up to tell us that hey, ladies, we like you much better without all that makeup anyway. (It happens here, and it happens even in specialist spaces generally - each week the Guardian has to delete several such comments under the beauty column, which is far from a puff piece. Let's not go into the assumption that all make-up fans are hetero cis women...) Even there, the structure of the site can almost invite pile-ons - someone says something unpopular, the downvotes arrive, and their comment gets hidden. Great when there are trolls, but proper moderation generally should be taking care of that.

I have occasionally looked at other subs, and it wasn't a pleasant experience. I like reading other points of view, but I don't want to debate politics, say, or feminism with someone who expects me to explain 101-level subjects to them in the name of argument. (some of this may be because the site skews young - LifeProTips illustrates this well.) I don't want to have to provide a tl;dr on a text based site. I definitely don't want to read the relationships section, good god, not after someone argued with me that hymens were not a 'definitive proof' of virginity as they can grow back. I was shocked that places like FatPeopleHate and CoonTown were allowed to stay as long as they did - not because they were just offensive, but wasn't that an embarrassment to have sitting there on your site?

Also, I will say that Reddit seems to be open to trans and gender issues if you go to the right places -or, perhaps, avoid the right places - in contrast with some women's sites like Mumsnet, which have been overrun by 'gender critical' people and generally ruined my enjoyment of everything else. And in some ways, it reminds me of Yahoo Groups of old - little niche communities about succulents or fountain pens or specific TV shows.
posted by mippy at 8:48 AM on March 13 [8 favorites]


Minor point, but this is not a thing. There is no moderation team.

The subreddit moderators do the vast bulk of the moderation work (mostly unpaid), but Reddit itself does have a team of admins that have access to all the same tools and then some. They don't have a dedicated site-wide "moderation team", but they do have a site-wide team whose members all possess moderator powers. In light of which, the absence of a dedicated team for site-wide moderation comes off more as an abdication of responsibility than anything else.
posted by tobascodagama at 8:59 AM on March 13 [7 favorites]


The Reddit forums that are strictly topic-focused with careful, strict moderation are good to great; see /r/spacex or /r/math. Even fanboi snark that's on topic and probably correct (in a mean way) can get moderated for calmness and politeness. But as we have learned following the metameta grey discussions here, moderation is hard.

When they finally bring out a "Forum Mod Barbie," she'll have the catchphrase "moderation is hard."
posted by sammyo at 9:05 AM on March 13 [3 favorites]


Possibly neither here nor there, but I think Reddit's upcoming redesign will have, predictably, significantly negative impacts on its desktop traffic and subsequently drop its influence across the net. I've been on Reddit for ~12 years, and after switching to the new alpha build my usage has dropped tremendously--almost to zero. Frankly, at this point, I think that's for the best. (And perhaps it will emerge from the flames a la Digg.)
posted by matrixclown at 9:44 AM on March 13


me: googles 'reddit alpha'

google: would you like to join reddit's pick-up artistry subs

me: nevermind about all of this I guess
posted by runt at 9:55 AM on March 13 [11 favorites]


a little discussion on reddit regarding the ban of the uncensorednews subreddit yesterday
posted by exogenous at 9:59 AM on March 13 [1 favorite]


how often are flacks taking writers from the Atlantic or the New Yorker to lunch and soft pitching story ideas?

Rather less often than just sending fully publication-ready copy straight to their inboxes, I would have thought.
posted by flabdablet at 10:23 AM on March 13 [5 favorites]


“I consider myself a troll at heart,” he said later. “Making people bristle, being a little outrageous in order to add some spice to life—I get that. I’ve done that.”

You know, I really do not get this mentality. What value does it bring to offend people for the sake of offense?
posted by NoxAeternum at 10:42 AM on March 13 [29 favorites]


I don't know. My dad liked teasing people. I like teasing people. He mostly only did it to the people who he loved or was close to, and I am the same in that regard. My dad passed before people outside of college tended to use the internet, and after a few years online I ended up feeling that there wasn't very much difference between teasing and trolling online, so I cut it out mostly. At the same time, I think that teasing and pushing people a bit is a thing that is fairly standard for a certain segment of humanity.

Also, as far as offending people, when you live in a complacent close-minded bigoted town, there is some appeal in offending people by acting different. It allows one to frame their rejection as something that they intentionally evoked and feel as if they are the ones doing the rejection, not the ones being rejected, even though they would have been rejected no matter what. That can be empowering. It can also lead to being a contrarian dick if taken too far in petty ways; and, online, many people somehow seem to think that espousing mainstream and bigoted views makes them the outlier, when it really doesn't. So, it is another thing that doesn't translate well to the online world.
posted by bootlegpop at 10:54 AM on March 13 [8 favorites]


You know, I really do not get this mentality. What value does it bring to offend people for the sake of offense?

You and me both, but apparently lots of people do see it as good fun. I don't understand. It just seems like another form of lying to me. (Or worse, you're not lying.) I'm one of those thin-skinned only-children though, so what do I know?

(One of the worst fights my now-husband, then-boyfriend and I had was over his desire to do some very mild trolling on a listserv that I was on and he as a result knew people on through me. He wanted to sign up under a fake name and persona that he knew would rile up people who I considered friends. Not, like, in a Nazi way, but in a very specific-to-the-subculture way that didn't entail hate speech of any sort. But I flipped out on him. To me, that was lying and worse, it was lying to my friends and I could not for the life of me figure out why it was funny. Like, it was funny imagining what they'd say if a real actual person with those real actual attributes showed up on the listserv, but the reason it was funny was because they weren't actually having those reactions, we were just imagining them having those reactions. I was so upset by the idea, I locked myself in the bathroom to cry. So yeah. I extremely do not get this mentality.)
posted by soren_lorensen at 11:02 AM on March 13 [7 favorites]


You know, I really do not get this mentality. What value does it bring to offend people for the sake of offense?

trolling is a subversion of social norms that pushes others into emotional dysregulation (especially those who have sustained real trauma). this makes certain socially underdeveloped folks feel powerful because a) they're still operating on the '90s pseudoscience bullshit that psychological manipulation = power over others and b) they understand social interaction to be a zero-sum game, something that sites like reddit (and MeFi, to a lesser extent) reinforce with 'upvotes' and 'favorites.' and one of the easiest ploys to win 'upvotes' and 'favorites' is to push your 'opponent' into emotional dysregulation, which is considered a weakness because it displays vulnerability: something a normal person would respond to with 'oh shit, sorry, I didn't know you felt that strongly about it' but that looks to trolls like a touchdown or something

it's a totally fucked up, anti-social approach to basic communication, rife with abuse, power-play, and toxicity, which is not coincidentally a description applicable to a lot of the communities on the internet
posted by runt at 11:03 AM on March 13 [31 favorites]


There is only one way out of this: DO NOT FEED THE TROLLS.

Seriously. Walk away. Do not expose yourself to forums/discussions that are questionable or toxic.

Lots of people suck. The internet has given them a megaphone. Do not stand in front of them and give them an audience.

Walk away. The alternative is that you get a nanny who gets to decide what, that day, is considered offensive.
posted by tgrundke at 11:07 AM on March 13


There is only one way out of this: DO NOT FEED THE TROLLS.

We tried this. It doesn't work. Pretending that they don't exist doesn't actually make them go away. And if they're targeting you, then you don't get the luxury of pretending.

The alternative is that you get a nanny who gets to decide what, that day, is considered offensive.

I'm going to repeat something I said in another thread - if you're complaining about outrage, you don't actually give a shit about free speech. People voicing their outrage at something is just as much free speech as the provocative comment in the first place. If your reaction to outrage is to dismiss it, then you need to rethink your priors.
posted by NoxAeternum at 11:14 AM on March 13 [40 favorites]


I think that there is a place for teasing, pranks, and hoaxes, personally. I just think that the internet has stopped being a great place for it. There is kind of a before-4chan and after-4chan divide in that regard, and if that divide wasn't obvious enough to some people when it happened and they still thought everyone was just funning around, then the nazification of vast swaths of people who used to get portrayed as great wits and Loki types should have made it obvious; anyone who didn't get it after that point seems to be either acting in bad faith or daft. The first opinion expressed above regarding teasing, etc. might have something to do with the fact that I belong to the tail end of Gen X, a generation where being earnest wasn't valued as much, and also the fact that I got online in the era of the BBS and then the early internet, where the idea of doing anything under your real name was out of bounds and being a little bit full of shit wasn't necessarily always a bad thing.
posted by bootlegpop at 11:15 AM on March 13 [2 favorites]


“Don’t feed the trolls” does not WORK.
We tried ignoring them and letting them fuck off to the dark recesses of the internet, and that’s how we got the alt-right.
We CANNOT ignore the trolls. We must fight them.
posted by Homo neanderthalensis at 11:16 AM on March 13 [17 favorites]


The first opinion expressed above might have something to do with the fact that I belong to the tail end of gen-x, a generation where being earnest wasn't valued as much, and also the fact that I got online in the era of the BBS and then the early internet, where the idea of doing anything under your real name was out of bounds and being a little bit full of shit wasn't necessarily always a bad thing.

No, it probably has more to do with never really being the target of teasing, pranks, and hoaxes, the sort that is meant to harm and other you. Having been there, my tolerance for the three has greatly diminished.
posted by NoxAeternum at 11:20 AM on March 13 [11 favorites]


I've definitely been the target of people who teased and pranked for the purposes of harming me, and they accomplished that feat many times. I'm pretty sure that I was teased with the intent to harm me at least 150 days a year between the age of 4 and 14. In fact, I'm pretty sure that there were at least 2000 instances of teasing done offline that made me feel worse than anything done online. That is probably because I was of a generation that lived mostly offline, I imagine.

I still don't think that the prankster figure is one that is always evil or negative, no matter how much of a beating the figure's reputation has taken over the last decade due to the numpties and edgelords who have invoked her and misused her for their evil agendas.
posted by bootlegpop at 11:26 AM on March 13 [7 favorites]


I think that there is a place for teasing, pranks, and hoaxes, personally.

I agree. The benign form of these is called shitposting, and most people don't have much of a problem with it. There was a time when it would have been lumped in with trolling, but personally I'm kind of glad we have the trolling/shitposting distinction now.
posted by tobascodagama at 11:27 AM on March 13 [10 favorites]


Thanks, Huffman, for bringing up trolling and re-framing the whole discussion as "oh, those wry pranksters, always pushing boundaries," even on MeFi. Beautiful damage control, great job.

It's so frustrating seeing this from people who should know how this works, even journalists who cover online culture. Folks on t_d, other hate subreddits, and imageboards don't just post to be provocative. It's fun for them when they can push someone's buttons, but that's maybe fourth on the list of priorities after red-pilling/radicalization of potentially-sympathetic visitors, self-amusement, and hounding anyone they don't like. Frankly we should have removed "troll" from our regular vocabulary years ago; the whole concept in 2018 is good for nothing but covering for neo-nazis and radical misogynists.
posted by skymt at 11:27 AM on March 13 [14 favorites]


shitposting, and most people don't have much of a problem with it

(citation absent).

You are wrong.
posted by Dashy at 11:30 AM on March 13 [5 favorites]


Shitposting is actually one of the only ways to troll me. I'm pretty verbose and I try to put some thought into what I say, so when someone picks a fight with me, and then responds to a response that I might have spent 10-20 minutes writing with tldr or a line of bs, I do actually occasionally want to reach through the screen and...

Trolls exist. They weren't ever very funny. What most negative people are doing on Reddit isn't trolling. Espousing an ideology that believes in genocide isn't trolling. Getting people to think that you are espousing an ideology that believes in genocide to get one over on them isn't something that can be hidden behind the banner of "I was just trolling." There may have been a few people who made points (in lieu of just griefing) while using methods called trolling a very very long time ago online, but the trolling that people do now is low-effort, low-value, and very infrequently funny.
posted by bootlegpop at 11:37 AM on March 13 [6 favorites]


“I consider myself a troll at heart,” he said later. “Making people bristle, being a little outrageous in order to add some spice to life—I get that. I’ve done that.”

Well, to take the spice analogy a bit further: (1) If everyone adds a little spice, the dish becomes inedible. (2) Also, one grows tolerant of spice if one eats it every day, so you begin adding more as time goes on.
posted by FJT at 11:37 AM on March 13 [6 favorites]


Regarding the actual article: it says something that the article never directly says, which is that they had to have some special day to delete racist and bestiality forums. Like, that shit isn't just deleted as soon as they become aware of it? Like, bestiality forums are on the edge enough that the company has to make a big deal of it? Maybe they arranged that day/ceremony to literalize and formalize, for the reporter to see, something that is usually more ad hoc; but if so, they weren't bright from a PR standpoint, and the fact that they felt the need also says something pretty negative about them. So whatever the case is, they look like morons and collaborators.
posted by bootlegpop at 11:43 AM on March 13 [6 favorites]


As my friend who used to work at Twitter said, the best thing about this article is that it shows a little of what life's like inside the Trust and Safety team. That's something I think very few people understand.
I understood why other companies had been reluctant to let me see something like this. Never again would I be able to read a lofty phrase about a social-media company’s shift in policy—“open and connected,” or “encouraging meaningful interactions”—without imagining a group of people sitting around a conference room, eating free snacks and making fallible decisions. Social networks, no matter how big they get or how familiar they seem, are not ineluctable forces but experimental technologies built by human beings.
posted by Nelson at 12:02 PM on March 13 [3 favorites]


seems like it would be healthy to define shitposting and trolling and such in a conversation about it since there appears to be some confusion

the shitposting that i'm used to, for example, are people posting really intentionally shitty memes or Twitter statuses that are sublimely, absurdly ignorant or low-effort. dril, for example, is probably the best, most consistent voice in this with such status updates as 'IF THE ZOO BANS ME FOR HOLLERING AT THE ANIMALS I WILL FACE GOD AND WALK BACKWARDS INTO HELL' or 'who the fuck is scraeming 'LOG OFF' at my house. show yourself, coward. i will never log off'

trolling is someone being an asshole and then taking it to the worst extreme for yuks. for example, asking you to put in most of the labor in an argument (eg. cite your sources) without offering much in return, intentionally diminishing someone's earnestness about a certain topic, mocking different assertions people make (I do this sometimes and it is very mean), and then there's that racist, sexist, ableist bullshit that gets under people's skin

there's a performative aspect to writing on the internet that I think naturally deindividualizes conversations which leads to both of the above. everywhere you comment, you know you're going to be read by more than just the person you're talking to. it's a show, a demonstration of your rhetorical and logical fortitude. in this sense, shitposting is anti-humor standup for a subversive audience in the same way that trolling is Bill O'Reilly railing at a high schooler on his show because it makes his viewers happy that yet another snowflake is put down. the former is intended to entertain, to make people happy, the latter is about power and control where the fail state is demonstrating any kind of emotional vulnerability since 'rational, emotion-free discourse' is the assumed win state for any discussion

it's kind of like we're all on our soapboxes, shouting as loudly as we can in a dark cavern
posted by runt at 12:13 PM on March 13 [10 favorites]


Classic trolling was a form of bad-faith argumentation where someone presented a controversial opinion or statement for the sake of producing reactions. At least originally, the troll dropped some hints or exaggerations to indicate that it wasn't really serious, although with Poe's law that's less reliable these days. One usenet example involved a screed about declawing to a cat group that generated hundreds of responses.

My working definition of shitposting is a form of short, low-effort aggression that treats the subject of discussion as ridiculous and wastes bandwidth.
posted by GenderNullPointerException at 12:36 PM on March 13 [2 favorites]


there's "trolling" like your grandpa gluing coins to the floor, and there's "trolling" that's purely mean or deeply anti-social behavior. People who are deep into the latter like to use the lack of clarity of the term to make it sound like they're just doing the former.
posted by atoxyl at 12:42 PM on March 13 [23 favorites]


You know, I really do not get this mentality. What value does it bring to offend people for the sake of offense?

If it's purely a case of offense for the sake of offense, yeah, that's trolling by definition.

But for me, it gets more complicated when we're dealing with something like a work of art that intends to be provocative by design. I mean, that's what a lot of satire is, isn't it? Swift wrote A Modest Proposal intending to provoke certain sensibilities. Which, if you were on the receiving end, must've felt offensive, and he intended that. But it was a hell of a worthy piece.
posted by philip-random at 12:48 PM on March 13 [1 favorite]


I don't think equating trolling with teasing or being snarky is correct. But it's particularly unhelpful to assume that teasing or being snarky is unobjectionable. I mean, they can be unobjectionable, but only under certain conditions: basically, that their intent and effect is to make people happy rather than harm them. The umbrella of "teasing" covers a huge amount of territory, from the gentle, appreciated tweaking of a friend's habits to comments and behaviors that are sadistic and deeply harmful. And snarky, joking comments can amuse--or, when not understood to be jokes, cause lasting damage.

Trolling comments and behavior are almost always examples of nonconsensual sadism. They may amuse people, because, sadly, lots of people enjoy punching-down humor and "practical jokes" like seeing someone bite unsuspectingly into a literal shit sandwich. But that's a social problem we really need to address, not evidence we should all just accept trolling because free speech means we somehow are obliged to tolerate cruelty.
posted by DrMew at 12:49 PM on March 13 [11 favorites]


huh. I guess my exposure to tumblr and insta memes ascribing shitposts to hilariously absurd shitty comments and memes is very different than shitposting as it's used in SA/4chan/reddit. makes sense given the presentation - a site dedicated to images vs one devoted to dialogue means the tools in use will be wielded to different effect. the former for irony (lol I made a shitty thing, everyone look and laugh with me) and the latter for mockery (lol you made a shitty thing, everyone look and laugh at this idiot)
posted by runt at 12:49 PM on March 13 [5 favorites]


and there's no reason Reddit has to let you be an asshole, either. "The free exchange of ideas"?


The reason Reddit lets people be assholes has zero to do with whether corporations can ban speech or whether it is legal to moderate with a heavy hand or anything to do with that. The reason Reddit cannot effectively moderate on issues like that is because Reddit's only selling point, as far as I can see, is that it is effectively 'the old internet'. A barrier to getting in, you can't read many forums without being a member, upvotes/downvotes, very few barriers to being as provocative as you want...it is essentially as close as you can get to a time travel device to 1998. If Reddit moderated more aggressively, it would have no business model.

Which maybe we might desire, but I'm not sure how much we can expect them to do it.

At the same time, I also am finding myself far less tolerant of pranking and shock humor now that we've seen it creating actual Nazis who aren't in on the joke. My spouse also used to delight in making, say, flat-earth posts on the internet, or ones declaring Bernie Sanders to be a chthonic horror, with citations, or what have you, and it all seemed like harmless fun, until I saw people starting to get influenced and act in the real world based on this stuff. It's fun and games if it's surrealist performance art, it's less fun if someone is going to take it as actual news.
posted by corb at 12:53 PM on March 13 [8 favorites]


reddit is nothing like 1998 on usenet, trust me

and trolling? really artful trolling was never about just teasing people or annoying them - it was about getting them to reveal aspects of themselves or their views that would otherwise stay hidden - and some things like racism were just plain off-limits - anyone who did that crap became a target

i don't call what goes on today trolling - it's just vile people posting vile things and being fascist assholes - when weev got on slashdot with the GNAA (if you don't know what this stands for, it's really just as well), i knew the assholes had taken "trolling" over - actually, they just plain killed it

as far as the reddit admins are concerned - they're doing the bare minimum they can to make it look like they're trying to do something about it - notice how the benefit of the doubt always goes towards the assholes when it comes to free speech unless the public reaction is too strong
posted by pyramid termite at 1:30 PM on March 13 [4 favorites]


But for me, it gets more complicated when we're dealing with something like a work of art that intends to be provocative by design. I mean, that's what a lot of satire is, isn't it? Swift wrote A Modest Proposal intending to provoke certain sensibilities. Which, if you were on the receiving end, must've felt offensive, and he intended that. But it was a hell of a worthy piece.

Swift, like most satirists, was not being offensive for the sake of offense. A Modest Proposal was written about an actual issue, and was meant to be offensive in order to get people thinking. This is different from what Huffman was discussing, where he was generating outrage for no real purpose.
posted by NoxAeternum at 1:49 PM on March 13 [7 favorites]


urbandictionary.com's definition of trolling is fairly consistent with mine insofar as it tries to keep value judgment out of it (though "victim" is a pretty loaded choice of words).

The most essential part of trolling is convincing your victim to either a) truly believe in what you are saying, no matter how outrageous, or b) follow your malicious instructions, given under the guise of help.
Trolling requires deceiving; any trolling that doesn't involve deceiving someone isn't trolling at all; it's just stupid. As such, your victim must not know that you are trolling; if he does, you are an unsuccessful troll.


By this definition, Swift was trolling (ie: deliberately deceiving). What made his troll worthy (for lack of a better word) was, as NoxAeternum just put it, its intention to provoke thinking about a particular issue.

sorry if this is coming across as pedantic or whatever; I'm mostly just trying to get this all straight in my own head
posted by philip-random at 2:08 PM on March 13 [1 favorite]


Yah, but attribution of an intention to do social good is 20/20 hindsight about a piece of art, especially the contextualization of what qualifies as art as a consequence of historical-political dominance. It's circular logic to say this particular work was valid art because of its goals and its results, because leftists and progressives and liberals stand to be beneficiaries of Swift's speech act.

What I'm saying is, hate sectors are odious and I'm glad that I don't have to deal with those specific people in my life; I don't have to deal with much explicit racism even as my environment is permeated by its legacy and by Northern (i.e., Canadian) forms of implicit racism. But I don't buy the argument that progressives are in a privileged position to know and distinguish which art is socially acceptable, or to frame the discourse as to what even falls under the category of recognized art. That's the kind of logic that racist, pro-science modernists used.
posted by polymodus at 2:19 PM on March 13


polymodus, I do pretty much agree with you, which is why the value judgment free urbandictionary definition of trolling works for me (ie: whether "good" or "bad" in intent, it's a tactic that intends deception and likely provocation).
posted by philip-random at 2:31 PM on March 13


Yah but attribution of an intention to do social good is a 20-20 hindsight about a piece of art, especially the contextualization of what qualifies as art as a consequence of historical-political dominance. It's circular logic to say this particular work was valid art because of its goals and its results, because leftists and progressives and liberals stand to be beneficiaries of Swift's speech act.

Huh? All it takes is looking at Swift's work and career to see what he was aiming at - the man was not subtle. Satirists do not write for the ages - they write for the here and now, and it doesn't take hindsight to see their message.

But I don't buy the argument that progressives are in a privileged position to know and distinguish which art is socially acceptable or frame the discourse as what even falls under the category as recognized art.

Good thing nobody is saying that, then. What is being pointed out is that it's not nearly as hard to distinguish satire from trolling as people are making it out to be, because good satirists aren't exactly coy about their message.
posted by NoxAeternum at 2:38 PM on March 13 [6 favorites]


My spouse also used to delight in making, say, flat-earth posts on the internet, or ones declaring Bernie Sanders to be a chthonic horror, with citations, or what have you, and it all seemed like harmless fun, until I saw people starting to get influenced and act in the real world based on this stuff. It's fun and games if it's surrealist performance art, it's less fun if someone is going to take it as actual news.

Isn't that the essence of the supreme troll though? To play your con so straight that most people can't even tell if you're being serious? I'm sure flat-earth trolls would love nothing more than to get coverage in the NYTimes or instigate fist fights over flat vs round.
posted by laptolain at 3:20 PM on March 13


While trolling may have originally implied highly skilled satire or roleplay back in the usenet days, since the development of html-based BBS systems it's devolved into being argumentative, often in offensive ways, for the sake of argument. If confronted, the troll can duck and cover with, "hey, I was just kidding."
posted by GenderNullPointerException at 5:02 PM on March 13


I am a member of dozens of excellent subs, many of which are the only forums of their kind. Moderators do have to obtain permission from the company before they can create a subreddit or take over one that's inactive. Once approved, a moderator has the power to actively direct or passively observe. Users can and do gather based on shared sensibilities regarding common courtesy. The digital version of the Protestant Reformation is happening every minute on that site, and that's what Place illustrated. The beauty of the platform is its dynamic and unpredictable nature.

Personally I struggle to understand how banning subreddits for anything other than extreme doxxing/stalking, material threats, or other actual crimes is any different than the silliness of bleeping out certain profanities in radio or television. Censorship doesn't take away the uncomfortable things, people, and ideas in our lives --- it just sends them underground. I'd rather know where these people are and that federal authorities do too, if it becomes necessary. Why push them into the dark web where they are harder to keep tabs on?

To take an example I feel qualified on as a woman with a BMI in the 50s: r/fatpeoplehate. I was subscribed to it before it was banned. It was enlightening to see what someone might say or think about me if the social contract wasn't in play, for anthropological, existential, psychiatric, even utilitarian reasons. Sometimes, I had to turn away if I got the feels too swirled up from viewing... but overall, it's another life experience confirming that there's knowledge to be had even from the fingertips of those who would consider themselves my enemies. Furthermore, aggressive commentary might lead to a cogent discussion if one has the confidence to stand up and debate. Although Metafilter doesn't care for this, 101 level explanations are exactly what many people need to get started on authentic compassionate ideologies.

We can't equate Reddit's underbelly with someone walking into a McDonald's and hanging up a subversive poster or even with hate group members gathering IRL in public spaces. Why? Because it's a completely digital medium involving no physical proximity, and anyone who doesn't like something he or she sees can just turn it off. Or, if outrage serves, plant a subreddit that's exactly the opposite of the one that offends and watch it grow. It seems like a lot of Americans are beginning to forget that free speech isn't just for "good" people.
posted by dissolvedgirl22 at 5:20 PM on March 13 [3 favorites]


The lack of context, misrepresentation of the core concepts at play here, and the display of privilege contained in just four paragraphs are astounding. Not to mention the ignorance of who, exactly, supports freedom of expression. Hint: as I linked to above, it's not the people you're defending.
posted by zombieflanders at 5:28 PM on March 13 [7 favorites]


Oh, and by the way? This right here is some bullshit:

I'd rather know where these people are and that federal authorities do too, if it becomes necessary. Why push them into the dark web where they are harder to keep tabs on?

You should read up on what they do when it's out in the open. Here's a good starter article: FBI's 'Gamergate' file says prosecutors didn't charge men who sent death threats to female video game fans — even when suspects confessed. That piece describes federal authorities dismissing videotaped confessions of doxxing and threats. A white supremacist walked into a pizza parlor just a couple minutes away from where I live and shot up the place because both Reddit and the authorities did nothing.

Your faith is misplaced, and your appalling willingness to let others suffer and die--and make no mistake, people are being murdered because of Gamergate and The_Donald--in the name of some sort of twisted definition of the right to free speech just because you aren't personally harmed is precisely the problem.
posted by zombieflanders at 5:37 PM on March 13 [25 favorites]


Because it's a completely digital medium involving no physical proximity, and anyone who doesn't like something he or she sees can just turn it off.

This is nonsense. The internet is real life. You can't turn off doxxing. You can't turn off targeted harassment. You can't turn off the SWAT team that's been sent to your house.

What you can do if you run the platform is turn off the place where assholes congregate to organise and plan these things.

One more time for the cheap seats: THIS ISN'T ABOUT PEOPLE BEING OFFENDED.
posted by His thoughts were red thoughts at 6:18 PM on March 13 [26 favorites]


Because it's a completely digital medium involving no physical proximity, and anyone who doesn't like something he or she sees can just turn it off.

Uh. Ok. And what happens, after they've turned off, when someone decides to track them down and create physical proximity?
posted by PMdixon at 6:25 PM on March 13 [3 favorites]


I wouldn't be against reddit users and subreddits being heavily moderated/curated, but I don't think it will really solve anything. Many communities are already moving to Discord. Moderation won't do anything to stop people being swatted on Twitch.

I might be accused of moving the goalposts but really they're going to move on their own anyway. We can take the sensible steps on controlling sites like reddit or even shut the whole thing down but as mentioned above "The internet is real life." All of these problems are people problems and human nature problems. Reddit is a convenient scapegoat for today but in many ways it's already outdated and superseded as a social platform.

I'm sure for saying this I'll be hit with the "that's great that you're worried about human nature, but for now let's ban the nazis" rhetoric, but so be it. I don't see why we can't try to address the systemic problem rather than just play a losing game of whac-a-mole.
posted by laptolain at 6:47 PM on March 13


Am I the only one here who, when hearing people mention "McDonald's" and "social media," thinks about the mass of angry fans that nearly rioted at various McDonald's locations last year because of a shortage of Szechuan Sauce?

I mention it also because I don't think McDonald's could just turn that whole thing OFF either.
posted by FJT at 6:56 PM on March 13 [4 favorites]


I'm sure for saying this I'll be hit with the "that's great that you're worried about human nature, but for now let's ban the nazis" rhetoric, but so be it. I don't see why we can't try to address the systemic problem rather than just play a losing game of whac-a-mole.

Is there any reason why you believe this to be a zero-sum game? I don't think asking Reddit to amp up the regulations on their site requires everyone to abandon the continuing efforts to address the very systemic problem you are concerned about.

And even better: if Reddit is more diligent, it will perhaps curtail the spread of the systemic problem as well.
posted by EmpressCallipygos at 6:56 PM on March 13 [8 favorites]


I don't see why we can't try to address the systemic problem rather than just play a losing game of whac-a-mole.

1) Your assumption that no one is doing both is odd, to say the least.
2) Platforms like Discord are continually cracking down, and are arguably doing more than Reddit has ever done in far less time.
3) The idea that moderation won't do anything to stop people being swatted on Twitch seems to be premised on the idea that they're doing the best they can. They are not. They've spent more time going after women for apparently being barely-clothed temptresses than they have going after the men threatening to rape and kill them.
posted by zombieflanders at 7:01 PM on March 13 [10 favorites]


Reddit is a convenient scapegoat for today but in many ways it's already outdated and superseded as a social platform.

Okay, so if Reddit is now uncool and only for olds, then why are some people digging in so hard against even the tiniest and smallest of measures to curb what is largely agreed upon to be terrible behavior?

Are you implying that they're just throwing a tantrum over nothing, then? Hmm.
posted by FJT at 7:06 PM on March 13 [5 favorites]


If you're worried that you might sound like you're concern trolling, then maybe don't push back on real evidence with nebulous "I think..." statements that come off as just being contrarian for contrarianism's sake. There's no reason to play devil's advocate here, or really in any discussion having to do with bigoted violence.
posted by zombieflanders at 7:17 PM on March 13 [3 favorites]


I don't see why we can't try to address the systemic problem rather than just play a losing game of whac-a-mole.

That would be because the systemic problem is rooted in the invention of a whac-a-mole machine that we cannot now uninvent, only modify.
posted by flabdablet at 7:26 PM on March 13 [2 favorites]


> Censorship doesn't take away the uncomfortable things, people, and ideas in our lives --- it just sends them underground. I'd rather know where these people are and that federal authorities do too,

Nearly a century of Jim Crow is what we got in the US when we allowed this kind of bullshit to be socially and politically normalized. You think law enforcement didn't know who did the lynchings and the bombings?

These are not "uncomfortable things." Violent acts that result from the normalization of racist and misogynist and homophobic rhetoric are not like some mysterious thing that we can't possibly expect or predict. History is full of examples, and many of them are within living memory of lots of us. Including you.

> To take an example I feel qualified on as a woman with a BMI in the 50s: r/fatpeoplehate. I was subscribed to it before it was banned. It was enlightening to see what someone might say or think about me if the social contract wasn't in play, for anthropological, existential, psychiatric, even utilitarian reasons.

Congratulations on never knowing this before you subscribed? Speaking as a brown dyke in her 50s, I do not need to go to a special website space to discover how and why people think I'm gross and terrible and shouldn't exist. It was normal to hear this when I was growing up and it's still normal in way too many places now for people to say this shit out loud. Maybe I should tell my friends who have been gay-bashed about the value of going to a space like that to find out what people REALLY think of them when the social contract isn't in play.
posted by rtha at 7:34 PM on March 13 [21 favorites]


I wouldn't be against reddit users and subreddits being heavily moderated/curated, but I don't think it will really solve anything. Many communities are already moving to Discord. Moderation won't do anything to stop people being swatted on Twitch.

I don't know what this comment means. Reddit and Discord do different things. Why would I go to Discord or Twitch for the things I go to Reddit for?

And well-moderated subreddits are indisputably better in the vast majority of cases but... is that even the main topic here, so much as what the rules should be for what gets to be a subreddit at all?
posted by atoxyl at 8:10 PM on March 13


Something else that I think factors into this is that the people who run sites like Reddit and Youtube and Twitter (all of which allow large-scale recruiting and radicalizing hate engines that are clearly in constant violation of their stated terms of service) are much less interested in free speech or whatever their current excuse is than they are in not paying people. Moderation is a time-expensive process, and training people to be good at it costs even more. The size of the teams that would be required is considerable, because all of these services have made growth their one and only priority without any consideration for how to make their size sustainable or manageable. And that means paying a lot of people. (Reddit has volunteer moderators per subreddit, as has been mentioned many times, but those people aren't making sure that posters follow the site rules, only the subreddit rules, which leads to the obvious problems we're having here.)

It is absolutely not the case that these services have to be overrun by bigots and ethnonationalists, and it is absolutely not the case that they are so overrun because of any strong moral principles possessed by the site operators. They could be moderated. The bullshit couldn't be completely prevented from appearing, but it could be responded to and prevented from creating a huge community that deceives, recruits, and radicalizes. But in order to do that, the site operators would have to want to do it, and they'd have to pay to get it done. They want the service to grow while putting as little into it as possible and shoving as much in their pockets as they can, and they don't care about anything that doesn't affect those numbers. If somehow tomorrow morning kicking out the bigots became the more profitable option, even including the cost of staffing up for it and everything, by tomorrow evening they'd all have strong, public feelings about not endorsing and strengthening that kind of content anymore.

Basically, I think this is far less about concerns about limiting free speech than the article's framing suggests. For the people actually making the decisions, it's just about money. (That said, there is an obvious problem with Twitter's very inadequate moderation staff having a clear pro-alt-Right bias, but I think that's largely a symptom of it being a small enough team that alt-Righters are able to make up a significant percentage of it.)
posted by IAmUnaware at 10:33 PM on March 13 [4 favorites]


Because it's a completely digital medium involving no physical proximity, and anyone who doesn't like something he or she sees can just turn it off.

Back in the real world, people are influenced by the digital words and images on the internet, and then they go and elect fucking Trump. I don’t want to turn off what *I* see, I want to turn off the lies that are being fed to millions of gullible voters, and the hate groups recruiting alienated teenagers into their white supremacist militia. Vacuous nonsense about just turning it off is demonstrating willful ignorance at this point.
posted by the agents of KAOS at 11:52 PM on March 13 [15 favorites]


We can't equate Reddit's underbelly with someone walking into a McDonald's and hanging up a subversive poster or even with hate group members gathering IRL in public spaces. Why? Because it's a completely digital medium involving no physical proximity, and anyone who doesn't like something he or she sees can just turn it off.

I think this is a popular misconception that's antiquated and lacking insight. I think the internet impacts individual behavior and expectations, both offline and on. we are constantly battered and tossed around by trends, memes, newsbites, and the whole lot, and as much as we don't like to admit it, that shapes how we think and thus the actions we take. how do you install your toilet paper now? how did you find out about Rick Astley? how do you feel about Shiba Inus?

but there's a behavioral component too. different forums have their different social codes that are enforced or not by their members and their moderators. the conversation we have on MeFi is very different from a conversation you'd have in the FYAD subforum on Something Awful, and even more distinct from what you'd see on 4chan's /b/ or an alt-right Discord chat. where you choose to spend most of your time determines what expectations, ideologies, and social norms you have, especially the longer you hang out in that particular space

the Isla Vista mass shooting is an extreme example of this - Elliot Rodger was a fucked up kid but he learned where to direct his attention because of the extremely misogynistic, white supremacist internet space that he was an active participant in. those forums, the private spaces where grotesque ideologies are shared and developed, they become spaces of radicalization much like what you would attribute to a terrorist cell, spitting out the same kind of violent extremism but with a more socially normative white Western ideology

and the folks who worship Rodger in those Incel forums? they might not be making the news but their 'real' behavior, backed by their extreme ideology, has changed profoundly, more than their lack of insight allows them to recognize. the way they treat people of color or women in their lives, how open they are with their hate, how safe they feel knowing that there are spaces for them to exist as fucked up radical extremists - these are support networks for misogyny and white supremacy that allow their users to act on their strong beliefs that were once-upon-a-time loosely held

we need to take social interaction online more seriously. we need to be more intentional about what we engage in, the ideas we spit out, the norms we may be enforcing, whether we like it or not. we need to stop imagining it as a mythic land where the points don't matter and the rules are all made up because the points do matter to people, whether or not they choose to admit it, and the rules allow certain anti-social tendencies to flourish, both online and off
posted by runt at 7:31 AM on March 14 [18 favorites]


Platforms like Discord are continually cracking down

Can you say more about Discord? Curious about Slack too. Both are a newish form of social media in that they are semi-private, invite only. It's hard for a random outsider to see what's going on inside a Slack or Discord, you have to join first. Neither company thought of themselves as social media much at all at first, although Discord now does.

With Discord I fear they began on a path of thinking they didn't have to worry about moderation at all. Let the hate groups hate in their semi-privacy. Which causes all sorts of problems, as discussed here. Discord had a particularly bad problem last year when it turned out the Charlottesville rally Nazis used Discord to help plan a murder. The company swiftly responded to that and banned that group after the fact. Discord has since built up a trust and safety team and policy that seems like the right idea. I'm wondering how well it's working in practice now.
posted by Nelson at 8:08 AM on March 14


Julia Alexander: Discord is purging alt-right, white nationalist and hateful servers

That article notes that Discord is actively working with the SPLC, which isn't something I've seen Reddit doing. They also apparently don't have a CEO who thinks being a troll is cool, so there's that.
posted by zombieflanders at 8:17 AM on March 14 [14 favorites]


Not sure why many of you didn't seem interested in reading my entire comment where I specifically mentioned that doxxing, threats, and crimes are serious things and should be moderated as well as policed. Not sure what deficits in law enforcement regarding said incidents have to do with whether or not we should pursue shutting everyone up we feel needs to be shut up before they actually do anything illegal. Not sure why even when I bare my throat to Metafilter, opening up on my greatest flaws and vulnerabilities to prove that my views are real and not trolling/contrarian in intent, after nearly a decade of supportive membership in this site, I am still attacked, mocked, treated as scum. Things have gotten worse than I thought here. The solution to anti-intellectual elitism and authoritarian fantasy in this country is not more of the same from the far left.

I'll see myself out, as they say.
posted by dissolvedgirl22 at 8:29 AM on March 14 [1 favorite]


> attacked, mocked, treated as scum

Your views and assertions have been challenged.

You willingly go to sites where you can read how people insult and degrade you (people like you, anyway) for how you are, not what you say. And you call the response here anti-intellectual and authoritarian?
posted by rtha at 9:12 AM on March 14 [11 favorites]


Here's the thing - when you call hate speech an "uncomfortable thing", I consider that to be arguing in bad faith. I am completely done with the constant diminishing of the actual damage and pain that hate speech does to its targets, done through euphemisms, and so I am now pushing back on it.

You want to make the argument that hate speech is the price of free speech, then make that argument openly. Don't hide it behind antiseptic words like "uncomfortable", "distasteful", "disliked", "unpopular", and so on.
posted by NoxAeternum at 9:20 AM on March 14 [17 favorites]


Not sure what deficits in law enforcement regarding said incidents have to do with whether or not we should pursue shutting everyone up we feel needs to be shut up before they actually do anything illegal.

Yeah, see, once it gets to the point it's illegal, it's already too late. This bullshit Orwellian pre-crime narrative enables the behavior to escalate to the point where people are killed. Also, the First Amendment only guarantees that you can't prevent someone from saying something awful in publicly-owned venues. It doesn't mean that private entities are legally obligated to give violent bigots a platform, it doesn't mean that it's "censorship" to prevent violent bigots from advocating for violence and bigotry, and it sure as hell doesn't mean that anyone that is the target of harassment should just suck it up because it's vitally important that violent bigots be allowed to create spaces that enable violence and bigotry.

BTW, I specifically quoted you saying that we should let the groups thrive in the open so that law enforcement could keep tabs on them. My response addressed that with hard evidence from multiple cases that doing so didn't help. Your ignorance is not our fault.

Not sure why even when I bare my throat to Metafilter, opening up on my greatest flaws and vulnerabilities to prove that my views are real and not trolling/contrarian in intent, after nearly a decade of supportive membership in this site, I am still attacked, mocked, treated as scum. Things have gotten worse than I thought here. The solution to anti-intellectual elitism and authoritarian fantasy in this country is not more of the same from the far left.

You came into a discussion with an uninformed, specious argument based on your personal experience and moral code and used it as an excuse to allow horrible behavior to continue, for no other reason than it wasn't already illegal. At the same time, you discounted anyone's experiences but your own as moral failings and attacks on free speech. You didn't even stop to think that there were people in this thread that have been the targets of attacks before telling them that the real problem was that they were cowards (and apparently leftist fascists) that don't care about free expression. You made claims that were already undercut before you typed them out and didn't even bother to see if the solutions proposed to address the problems were even working before attacking them as useless or even harmful.

The idea that we're the ones being anti-intellectual, elitist, and supporting authoritarianism is laughable.
posted by zombieflanders at 9:27 AM on March 14 [10 favorites]


[Folks are mostly already doing this but friendly reminder - Let's keep the discussion focused on reddit/socialmedia policies, etc, and not make it personal about any one person in this thread.]
posted by LobsterMitten (staff) at 9:56 AM on March 14


More than anything, these stories illustrate to me how there's no "responsible adult" at a lot of these companies. They were founded by kids, who had never been in positions of responsibility before, and grew up in an emerging technology where there weren't really older companies with models of responsibility that were common in the sector.

I think they choose inaction so often because they don't want to make hard or unpopular decisions, or to have to defend them. They want someone else to make them, or for circumstances to make the decision for them. It's easier to respond to something that happened than it is to make an affirmative choice in advance.

And like high school students, they find it very easy to respond to direct incidents/attacks, but very difficult to lay out and enforce a moral code when being pressured by others who say their moral code is dumb/prudish/unfair/whatever. So they go along to get along, convinced that the adults in the room will step in if things start to get out of hand.

But they are the adults in the room, and they're adults who are simply totally unwilling to make moral claims and take responsibility for their communities. "We just manage the space, we're not responsible for how people use it." The buck stops nowhere, because they don't understand how to take responsibility for a company or community. They're much more comfortable being the emergency response team trying to triage disasters ("we're doing our best!") than making hard choices and enforcing standards.
posted by Eyebrows McGee at 1:47 PM on March 14 [12 favorites]


In part it's a design issue. The people who built these systems ignored all of the prior work on the effects of scaling these networks. And so, they maximized a mushy ontology based on "tagging," which is barely an ontology and unworkable if no one agrees about the meaning of the tags in question, maximized the number of users, created a system where advertising stakeholders get access to everyone, and maximized the volume of text over quality. And this happened, in part, out of a belief that more messages -> more drama -> more clicks -> more advertising revenue.
posted by GenderNullPointerException at 2:17 PM on March 14 [3 favorites]


And like high school students, they find it very easy to respond to direct incidents/attacks, but very difficult to lay out and enforce a moral code when being pressured by others who say their moral code is dumb/prudish/unfair/whatever. So they go along to get along, convinced that the adults in the room will step in if things start to get out of hand.

I think that part of the problem is that the adults hadn't really thought about the ramifications either. And that puts further pressure on developing a moral code - look at how readily people will argue that hate speech is the price of free speech, that curation is censorship, that outrage is thought control. Those concepts didn't happen in a vacuum either.

This isn't just structural, but societal. We live in a culture where the right to commit an act of intimidation and terror is lionized as the demonstration of how our society is free. It took me a while to realize how fucked up that is. There are a lot of people who haven't ever thought about it.
posted by NoxAeternum at 2:30 PM on March 14 [10 favorites]


This is not really about individual behaviour mostly these days, being an asshole on the internet or taking personal pleasure from annoying other people. This is an implicitly or explicitly organized political project, one that aims to marginalize, mute, terrorize and remove its political opponents' voices from the web and to dominate it by shouting and threatening everybody else out. It is many-faceted and present in most major social media services, and it is widely used around the world by far-right fringe groups (or, alas, formerly fringe groups) to dominate discussions, spread propaganda, taint or scare their opponents into silence, and appear far more popular than they actually are. And they are sometimes connected with the threat or the actuality of physical violence. These are NOT your father's Usenet trolls anymore, they are a monster of a media strategy of the far right. And reddit is among its largest nesting grounds. So this is not mostly about individual freedom of speech, it is about containing a violent orchestrated or emergent campaign of political and social intimidation.
posted by talos at 6:15 AM on March 15 [7 favorites]


So this is not about individual freedom of speech mostly, it is about containing a violent orchestrated or emergent campaign of political and social intimidation.

Yes - this is a terrorist campaign. There's a reason we charge Klansmen burning crosses with more than arson.
posted by PMdixon at 9:27 AM on March 15 [5 favorites]


> We live in a culture where the right to commit an act of intimidation and terror is lionized as the demonstration of how our society is free.

For years I admired the Skokie case as proof of US dedication to freedom of speech. Then one day it clicked how utterly fucked it was that a court basically said: "hey, if these very fine people want to parade swastikas through a town that just so happens to be filled with holocaust survivors, who are we to stop them?"

(simplified I know)
posted by postcommunism at 5:08 PM on March 15 [4 favorites]


Yep, that's the exact realization I had as well. It also caused me to lose a good deal of respect for the ACLU, especially when I read their hagiography of the case and saw how they repeatedly refused to acknowledge exactly what they defended.
posted by NoxAeternum at 5:22 PM on March 15




And just imagine what Reddit would become if they got rid of that one percent.
posted by NoxAeternum at 7:13 AM on March 21 [1 favorite]


Next on the banhammer: communities trading questionable or outright illegal items.
posted by NoxAeternum at 3:55 PM on March 21




This thread has been archived and is closed to new comments