Does Transparency in Moderation Really Matter?
November 12, 2019 2:57 PM

Shagun Jhaver writes about his research into the effect of moderator deletion commentary on Reddit users. Using a sample of 32 million Reddit posts he characterizes the removal explanations that are provided to Redditors and links them to measures of subsequent user behaviors — including future post submissions and future post removals. posted by pharm (32 comments total) 19 users marked this as a favorite
 
The biggest problem I see with studying this is how different subreddit communities can be. Some subs are fundamentally against teaching or coaching better behavior from users. In subreddits that are essentially based around groups of assholes banding together to tear down something or someone, moderator action seems more likely to be about keeping out dissent -- at the very least, they aren't concerned with nurturing future contributors. It doesn't take investment or effort to find people willing to spew shit and move along.

Anyway, it would be good to mark which subreddit communities the removals sampled were coming from.
posted by GoblinHoney at 3:21 PM on November 12, 2019 [4 favorites]


...our results suggest that taking an educational, rather than a punitive, approach to content moderation can improve community outcomes...
Seems right, regardless of subreddit or other moderated online community.
posted by PhineasGage at 3:30 PM on November 12, 2019 [3 favorites]


AskWomen is a particularly egregious example of this. So many comment removals due to "derailing," or what the mods interpret as "derailing," anyway.
posted by Delia at 3:33 PM on November 12, 2019


I moderate a couple of large-ish Reddit communities, one with about 175k subscribers and another with about 50k. I've only had time to skim this so far, but it's interesting to me. My moderation style depends a lot on the type of community, but I've had to become much more selective about when I provide explanations in both of them, just in the interests of time and sanity.

One of the challenges is the structure of Reddit. Most people aren't consistently active in all of the subs that they subscribe to. Most removals happen because people aren't paying attention - they might not even be reading the comments.

Another challenge is the emotional toll of engaging with toxic users and people who just want to argue every single decision. It might be ideal to have a public record of the reasons for a removal or banning; it's not feasible all of the time. And often, when the response is "suck my dick" you wonder why you wasted any consideration on the person anyway.
posted by Kutsuwamushi at 4:40 PM on November 12, 2019 [7 favorites]


I can say that providing explanations for 100% of the removals to reduce future removals by 20% wouldn't be a decrease in my workload, though...
posted by Kutsuwamushi at 4:42 PM on November 12, 2019 [4 favorites]


So many comment removals due to "derailing,"

Wasn't the whole point of threaded comments that you could iterate every fucking permutation of a tired in-joke, like so many typewriter monkeys, and derail away to your nested content?
posted by Freelance Demiurge at 4:51 PM on November 12, 2019 [13 favorites]


>> I've had to become much more selective about when I provide explanations in both of them, just in the interests of time and sanity.

Kutsuwamushi: I think this is the core question about the meaning of this paper. Do we see these helpful responses to removal reasons because they change things? Or because moderators get good at intervening only in the cases where they sense it's going to go well? That's where causal research could be especially helpful.
posted by honest knave at 5:11 PM on November 12, 2019


I didn’t even know Reddit deleted comments other than spam.
I have had zero comments deleted on Reddit, vs dozens of comments deleted here, often my best work. Mundane factual comments like this one seem safe enough, it’s the actually insightful or funny ones that get censored.
posted by w0mbat at 5:36 PM on November 12, 2019 [12 favorites]


I wish that metafilter could discuss reddit objectively for once.

I’ve been reading reddit for 13 years, but last year I rediscovered it through the new app, and came to realize that it is literally ‘the best of the web’.

Of course, I don’t hang around toxic places, and usually I don’t read the comments, and naturally I don’t engage with people there, and especially not with idiots. Still, the sheer volume of information and stimulation you can find there is out of this world. I recently signed up for 500 new subreddits or more, and I could stay on all day long, just finding out new stuff, if I chose to.

Reddit has also changed a lot in the last few years: there are many more ‘promoted’ context ads (which you can mute), and the general discussions were toned down and matured: I think that a lot of the stupid conversations disappeared. On the other hand, many threads have 1000’s of inane comments, which only makes one wonder: how many millions of people sit all day and talk to each other over there?
posted by growabrain at 5:55 PM on November 12, 2019 [21 favorites]


I didn’t even know Reddit deleted comments other than spam.

Reddit is not a single forum with a single set of moderators. Reddit is a platform for hosting many different forums, all of which have their own rules and moderators. Posts and comments being deleted is extremely common, but how often and why depends on where you're posting. Check out r/askhistorians if you want an example of a subreddit that is extremely strictly moderated.
posted by Kutsuwamushi at 6:14 PM on November 12, 2019 [18 favorites]


I found AskHistorians interesting; I never had a comment deleted, but I made few. As long as the comment provides citations for the information you contribute, deletion seems unlikely, but it has to add to the answer, and answering or adding information is all comments are for. Asking a question, rather than answering, can be more difficult. As to moderation, history being the theme, there is, so to say, a built-in mechanism for it: "Deleted: no citation. Deleted: does not answer question," etc. Behavior removals seemed rarer than on most subreddits, but their mods are pretty good.
posted by clavdivs at 6:18 PM on November 12, 2019 [1 favorite]


clavdivs, I'm glad your comments on /r/askhistorians have gone well, but remember that as a Roman emperor you've got a leg up on the rest of us in terms of first-hand historical knowledge.
posted by a snickering nuthatch at 6:40 PM on November 12, 2019 [42 favorites]


r/legaladvice is also strictly moderated. That's a good thing, except every freaking deletion is accompanied by a blurb about the reasoning for it. Maybe these can be filtered on a PC via Reddit Enhancement Suite or something, but on my phone it seems impossible to get rid of them, and the threads are therefore often difficult to read.

That being said, I love Reddit almost as much as I love Metafilter, and I actively read comments; it's not hard to avoid Reddit's toxic forums and it's not hard to back out of useless comments either.
posted by lhauser at 7:01 PM on November 12, 2019 [1 favorite]


I love/hate AskHistorians... some of my favourite Reddit is reading the good stuff there, but that is coupled with 90% rate of clicking through to "20+ comments" on an interesting question to see nothing but the auto-mod message and everything deleted.
posted by Meatbomb at 7:06 PM on November 12, 2019 [7 favorites]


I can say that providing explanations for 100% of the removals to reduce future removals by 20% wouldn't be a decrease in my workload, though...

I have found, in general, a lot of people willing to give outside-the-box advice that, upon examination, boils down to promising better outcomes in return for investing more resources in a problem.
posted by mark k at 7:22 PM on November 12, 2019 [6 favorites]


I heard that the reason askhistorians is so heavily modded is that they get spammed by holocaust deniers, and they (rightfully) have zero tolerance for that, so they end up deleting everything related to it.
posted by Homo neanderthalensis at 7:54 PM on November 12, 2019 [6 favorites]


That's one of the reasons they delete comments but not the only (or even most frequent) one. Most comments that are posted to r/AskHistorians are removed because they aren't in-depth, comprehensive answers. They want to be a place people can go to get reliable information about history, so they have high standards.

The flow of low-effort junk answers is never-ending, though.

And tbh this is probably the reason the majority of comments are removed on any academic subreddit that has enforced quality standards. It sure is on the one I mod.
posted by Kutsuwamushi at 8:13 PM on November 12, 2019 [3 favorites]


our results suggest that taking an educational, rather than a punitive, approach to content moderation can improve community outcomes

Deletion alone is called "punitive" but deletion plus explanation is not? Interesting study but I feel like rather than this contrast, it shows that punishment plus instruction was better at securing compliance than punishment alone.
posted by save alive nothing that breatheth at 8:49 PM on November 12, 2019 [2 favorites]


r/legaladvice is also strictly moderated. That's a good thing

Unfortunately, it appears that some of those mods are cops who are known to delete comments and nuke threads that are critical of cops.
posted by J.K. Seazer at 10:06 PM on November 12, 2019 [6 favorites]


Reddit: some of the mods are cops
posted by eustatic at 11:15 PM on November 12, 2019 [11 favorites]


It seems to me that there are different kinds of rules/explanations-- some are very objective, like "you posted a video and there's a rule against posting videos", while something like "don't insult other posters" leaves more room for argument.
posted by Nancy Lebovitz at 3:05 AM on November 13, 2019


Useful for my future study, Can Anything Meaningful Be Learned from Reddit?
posted by Miko at 5:44 AM on November 13, 2019 [1 favorite]


oh shit he was a TA for one of my AI classes last year! I had no idea he was working on such cool stuff, or that such socially relevant research even happens here at GT. For undergrads at least, the CS curriculum feels like a pipeline to big tech companies, so I'm always happy to hear that this sort of social computing research has a place here. Thanks so much for posting!!!

now to find a way into an interesting research lab...
posted by scruffy-looking nerfherder at 6:02 AM on November 13, 2019 [4 favorites]


Reddit has grown in traffic and mindshare to the point where it includes so many conversations and well-moderated subreddits that it could be classified as, yes, "the best of the web" – but it also includes much of the worst of the web. Not as bad as 4chan, certainly, but still really very bad. In that respect it's hard to compare it to Metafilter, any more than you'd compare Mefi to YouTube or Facebook.
posted by adrianhon at 6:04 AM on November 13, 2019 [7 favorites]


Of course, I don’t hang around toxic places, and usually I don’t read the comments, and naturally I don’t engage with people there, and especially not with idiots..

This is, once again, the "if I can't see the toxic waste, it doesn't exist" argument. Just because you close the blinds on toxic waste doesn't make it stop existing, nor does it make its harm go away. Reddit was at one point the largest white supremacist website out there just from all of the white supremacist subreddits - and it took an absurd amount of bad press for Reddit to take any action on it. Or look at "The Fappening" - Reddit refused to take any action on what was fundamentally a major release of revenge porn until it came out that the release had images of underage individuals, and thus Reddit could be held liable - at which point they nuked it from orbit.

Yes, Reddit has a lot of good stuff - but it also has a lot of harmful stuff, which the management refuses to deal with until their hand is forced.
posted by NoxAeternum at 6:32 AM on November 13, 2019 [9 favorites]


Kutsuwamushi: My moderation style depends a lot on the type of community, but I've had to become much more selective about when I provide explanations in both of them, just in the interests of time and sanity...Another challenge is the emotional toll of engaging with toxic users and people who just want to argue every single decision.

I'm one of the mods of a group that's in the ballpark of your larger group, and I don't think that I have provided a reason for deletion since the first few that I did, for that reason; the rules have been posted for a while and they're pretty straightforward, but some people will act as if that is the very hill that they will die on.
posted by Halloween Jack at 7:08 AM on November 13, 2019 [1 favorite]


It’s probably worth noting that (if I read the paper correctly) this research applies to reddit post submissions, not reddit comments.

I’d love to see some follow-up research on whether the same effect is seen for comment deletion, but I don’t know whether that would be possible given the data available.
posted by pharm at 7:24 AM on November 13, 2019 [1 favorite]


Yeah, hey folks, from TFP, this is strictly about submissions, with an N of nearly 80 million, so I'm not sure it matters that it's not separated out by subreddit.
posted by Lutoslawski at 8:52 AM on November 13, 2019 [1 favorite]


If you can do it with bots, great. But if it's a human moderator, the emotional toll of creating the comments could be heavy. To a certain type of griefer, reasons given are an excuse to argue further.

I have been a mod. Sometimes you have to throw up your hands, say "Reasons are for reasonable people", and stop trying to explain.
posted by elizilla at 9:13 AM on November 13, 2019


Reddit can be wonderful. Last summer in Paris, I bought an antique Japanese print from a dealer who spoke no English, and I speak no French. So other than a penciled date on the paper the print was mounted on, I had zero info beyond "this is pretty to me."

I posted links to photos of it on a subreddit with as much detail as I could give. Within 12 hours, a Redditor had identified the print, down to the book it was originally from and the page number, along with a link to the catalogued, documented print (from 1858) at a Boston museum with the exact year! This person even elaborated on details in the piece, told me about an odd-looking thing on the floor (it was a musical instrument), and what the figure in the print was doing!

Reddit can be a toxic hellhole, but it also has a lot of very positive aspects.

edit: Subreddit was r/WhatIsThisPainting
posted by SoberHighland at 10:29 AM on November 13, 2019 [3 favorites]


Reddit has also changed a lot in the last few years: there are many more ‘promoted’ context ads (which you can mute), and the general discussions were toned down and matured: I think that a lot of the stupid conversations disappeared.

This is a Twilight Zone comment for me to read. While it certainly is possible to unsub from all of the defaults and go about creating your own pleasant bubble experience, on its face, what you're suggesting is absurd. Reddit, as it grows, can only draw in more and more of the unsavoury types who flood subs and comment sections with absolute hogwash. Also, at this time Reddit seems to be drawing in a great deal of teenagers going through a lot of the same cycles of identity-tied contrarianism, edginess, and immaturity I'm sure many of us recognize, or went through ourselves, growing up in online forums.

I don't do a lot to curate myself a nice reddit; I'm not sure I could. Even the, like, NSFW subs just around for someone to post erotic photos or cosplay of themselves get nasty in the comments, not in the perverted way you might expect but just meanness, toward themselves or the person they are there to ogle. Browsing /r/popular is basically a guaranteed way to get knee deep in toxic waste, and I say that even after blocking/ignoring some subs I know fundamentally could never produce healthy content in any shape or fashion, yet which still consistently float up to the annals of common feeds.

Refocusing back to the article (post submissions, not comments, again), I would still assert that the specific subs and mods scraped play a role in whether or not the hypothesis is relevant. There are definitely well-moderated subs with focuses and communities that lend themselves to it. There are others where any and all moderation is seen and treated as a hostile act, or where moderator action seems mostly arbitrary, or worse. Ultimately I suppose the point is the same, though: in subs with mods and members who give a shit, it seems the extra work of explaining submission removal reasons can result in a mild decrease in repeat offenders. I would think this most relevant in subs with a relatively small number of regular submitters, rather than hordes of folks constantly submitting low-effort content that may or may not be allowed at any given moment in the sub.
posted by GoblinHoney at 10:35 AM on November 13, 2019


What I find most interesting in this research is that canned responses produced about the same results as more individually written ones. Maybe I find it interesting because it fits with my anecdotal experience, which is basically:

- If people are genuinely making mistakes, and you inform them what the mistake was, some people will argue, some will bail, and some will (whether initially upset or not) learn and keep going.

- If you briefly explain deletions of content (submissions in this case) linked to clear guidelines, it gives everyone else some understanding of what happened and they can continue to participate with more confidence. I read this article yesterday and I'm not sure I'm clear on whether they looked at the tendency to post in aggregate or with specific users. So it might be that the reasons make other people feel comfortable to keep submitting, or even to submit more, as opposed to being about that one specific user.
posted by warriorqueen at 8:58 AM on November 14, 2019




This thread has been archived and is closed to new comments