Comment Moderator: Most Important Job in the World
March 4, 2019 9:53 PM

Last week, YouTube did something unprecedented. Awash in criticism over the discovery of a network of child predators using the platform’s comment sections to share timestamps and screenshots of underage users from implicitly sexual angles, the company disabled comments on almost all videos featuring minors. Only a small number of channels featuring minors would be able to stay monetized — as long as they “actively moderate their comments.” The decision, made by a company that has long stressed the importance of algorithms, seems a tacit acknowledgement that human moderation is currently the best solution for policing harmful content.

Moderating content and comments is one of the most vital responsibilities on the internet. It’s where free speech, community interests, censorship, harassment, spam, and overt criminality all butt up against each other. It has to account for a wide variety of always-evolving cultural norms and acceptable behaviors. As someone who has done the job, I can tell you that it can be a grim and disturbing task. And yet the big tech platforms seem to place little value on it: The pay is poor, workers are often contractors, and it’s frequently described as something that’s best left to the machines.

...Older online communities like Slashdot, MetaFilter, or even Fark proved that human moderation alongside a clear set of user guidelines can work. Sure, the scale was much different than the platforms we have now, but the method was relatively effective. Right now, we don’t have enough human beings looking at what is being put on the internet, and the few that are don’t have the resources or support to do it well.
posted by Bella Donna (28 comments total) 25 users marked this as a favorite
 
The comments on that article are... mostly spam.
posted by monospace at 10:21 PM on March 4, 2019 [5 favorites]


So here's something horrible: Gab browser extension puts a far-right comments section on every site. I noticed that the only two non-spam comments mentioned this and did a quick google. Wow.
posted by Homo neanderthalensis at 10:55 PM on March 4, 2019 [13 favorites]


If I could be paid a liveable wage and work from home I’d love to be a comment moderator for something like YouTube.
posted by gucci mane at 10:57 PM on March 4, 2019 [1 favorite]


Maybe you're tougher than I am, but I feel like you should at least hold out for some really good health insurance as well:
For instance, last week, The Verge published an explosive look inside the facilities of Cognizant, a Facebook contractor that currently oversees some of the platform’s content moderation efforts. In the story, employees who requested anonymity for fear of losing their jobs described the emotional and psychological trauma of their work. Some smoked weed during breaks to calm their nerves. Others described being radicalized by the very content they were charged with policing. Most made just $28,000 a year.
posted by en forme de poire at 11:10 PM on March 4, 2019 [18 favorites]


So here's something horrible: Gab browser extension puts a far-right comments section on every site.

Maybe I'm missing something, but this seems far preferable to having their hate speech visible to everyone who visits an ordinary site - now it's only visible to other Gab users. And they already have plenty of other platforms to chat and conspire with each other, so it's not like being able to comment to each other in a sidebar is any worse than commenting on a chat board with a link to the article they're talking about.
posted by Umami Dearest at 11:28 PM on March 4, 2019 [9 favorites]


Well, the Trump subreddit is very vigilantly moderated. Over time, ideologies change. God help us if Facebook and YouTube execs turn right wing and heavily moderate all activity on their sites to align with their views...
posted by xdvesper at 12:35 AM on March 5, 2019


Engadget:
Now, Gab has come back online and has found a new hosting provider in Epik. According to a blog post published on November 3rd, Epik CEO Robert Monster spoke out against the idea of "digital censorship"...
The 2019 writers are just phoning it in.
posted by flabdablet at 12:50 AM on March 5, 2019 [13 favorites]


"..Only a small number of channels featuring minors would be able to stay monetized — as long as they “actively moderate their comments.”

Interesting.

This imperative thing that must be done.


Youtube relinquishes all responsibility by just ... stopping all comments and then puts the onus on those (presumably innocent) content creators who still want to communicate with their fans/patrons/friends for moderating their own comments.

That's not a huge overhead at all.

Nope.

Especially as there is no clear definition of what 'active' moderation is.
posted by Faintdreams at 3:02 AM on March 5, 2019 [9 favorites]


Another rash decision that will further alienate the platform from the thousands of creators who are the very soul of Youtube. Google now seems to be in panic mode, overreacting to any potential outcry by making up draconian policies that harm small and large channels.

Most channels should just disable the Youtube comments and embrace the Patreon+Discord first model that has proven to be both more profitable and wholesome from a community POV. At least until Google realizes that driving engagement away from their platform probably wasn't the best idea and decides that they should start charging money for hosting videos.
posted by Foci for Analysis at 3:25 AM on March 5, 2019


Maybe the whole "host content by creators for free" model is broken from the start.
posted by timdiggerm at 4:08 AM on March 5, 2019 [11 favorites]


Youtube relinquishes all responsibility by just ... stopping all comments and then puts the onus on those (presumably innocent) content creators who still want to communicate with their fans/patrons/friends for moderating their own comments.

That's not a huge overhead at all.


It's worse than that. This is another example of YouTube being broken by design (or more that the design has refused to evolve.) For those who don't know, YouTube has an archaic design where channels are locked to users, a holdover from their earliest days. The result is that if you want to do anything with a channel - upload video, moderate comments, etc. - you have to give the keys to the castle to do so. (Dan Olsen talks about this some in his YouTube Heroes video, as being part of why there was a backlash from creators over this.) It's worth noting that none of the other major players, like Twitch, do this - they keep channels and accounts at least somewhat separate, which is why Twitch streamers can and do have moderation staff.

The result is that while many YouTube channels would love to have moderation (and would in fact spend the money themselves for it!), they don't because of that structure. It is yet another example of how Youtube's problems are based in Youtube's actual design.
posted by NoxAeternum at 4:16 AM on March 5, 2019 [21 favorites]
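A minimal sketch in Python of the structural difference NoxAeternum describes above; the class and field names are illustrative assumptions, not YouTube's or Twitch's actual data models:

from dataclasses import dataclass, field

@dataclass
class LockedChannel:
    """YouTube-style coupling: the channel is the account, so anyone who
    moderates comments holds the same credentials that can upload or
    delete videos (the "keys to the castle" problem)."""
    owner_credentials: str

@dataclass
class RoleBasedChannel:
    """Twitch-style separation: the channel is distinct from accounts, so
    the owner can grant a limited moderator role without sharing logins."""
    owner: str
    moderators: set = field(default_factory=set)

    def add_moderator(self, account: str) -> None:
        # Grants comment/chat moderation rights only; no upload or delete access.
        self.moderators.add(account)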


There is enough memetic force out in the aether these days that I do not think you have to be on youtube to make it big. The more we get people back to small-scale, community- or union-owned type stuff, the better.

What kind of % does Patreon skim?
posted by Meatbomb at 4:31 AM on March 5, 2019


> What kind of % does Patreon skim?

5%, according to the recent news about Facebook coming out with a 'Patreon Killer' that steals 30%.
posted by I-Write-Essays at 5:58 AM on March 5, 2019 [3 favorites]


5%, according to the recent news about Facebook coming out with a 'Patreon Killer' that steals 30%.

Technically 10%, but half of that is merchant fees for the transactions.
posted by NoxAeternum at 6:16 AM on March 5, 2019 [3 favorites]


Would that make Facebook's number 35%, or did it include the merchant fees?
posted by I-Write-Essays at 6:32 AM on March 5, 2019
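For anyone checking the arithmetic in the exchange above, here is a back-of-the-envelope sketch assuming Patreon's 5% platform fee plus roughly 5% in payment-processing fees (the processing rate is an approximation) and a flat 30% cut for the reported Facebook tool:

def net_payout(pledge, platform_fee, processing_fee=0.0):
    # Amount the creator keeps after platform and processing fees.
    return pledge * (1 - platform_fee - processing_fee)

pledge = 100.00
patreon = net_payout(pledge, platform_fee=0.05, processing_fee=0.05)  # roughly $90
facebook = net_payout(pledge, platform_fee=0.30)  # $70, if processing is inside the 30%

print(f"Patreon:  ${patreon:.2f} kept per ${pledge:.2f} pledged")
print(f"Facebook: ${facebook:.2f} kept per ${pledge:.2f} pledged")

If Facebook's 30% excluded merchant fees, the effective cut would indeed land closer to the 35% figure asked about above.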


wait so, there's three things here

a network of child predators using the platform’s comment sections to share timestamps and screenshots of underage users from implicitly sexual angles


What the holy fuck? Why is google's answer to these bad actors to disable ALL COMMENTS on potential videos? It's like, a mall's answer to shoplifting is to shut down the stores that have shoplift-able items!

the company disabled comments on almost all videos featuring minors.

How do they know which is which? Is it like... any video with a crowd, any video with an image? What about cartoon kids? This reminds me of the Slate Star Codex conversation last week, where SSC wrote that eventually repugnant actors will insert their bad actions through coded speech into every conversation space, making moderation virtually impossible.


Only a small number of channels featuring minors would be able to stay monetized
That's very different than disabling comments. Are they also demonetizing channels that feature minors? See also my mall example above.
posted by rebent at 6:47 AM on March 5, 2019 [6 favorites]


The result is that while many YouTube channels would love to have moderation (and would in fact spend the money themselves for it!), they don't because of that structure. It is yet another example of how Youtube's problems are based in Youtube's actual design.

Hell, most of them have fans that could be trusted to do it for free. I didn't realize their moderation tools make Reddit look modern.
posted by zabuni at 6:53 AM on March 5, 2019


What the holy fuck? Why is google's answer to these bad actors to disable ALL COMMENTS on potential videos? It's like, a mall's answer to shoplifting is to shut down the stores that have shoplift-able items!

Because it's easy. Again, the moderation issue is a structural one (channels would happily self-moderate if they could, but cannot due to YouTube's archaic design), and turning off comments is easier than uncoupling channels and accounts.

Of course, it's also done nothing to fix the actual problem, because that's driven by another YouTube failure by design - the "engagement"-maximizing algorithm.
posted by NoxAeternum at 6:58 AM on March 5, 2019 [7 favorites]


Maybe I'm missing something, but this seems far preferable to having their hate speech visible to everyone who visits an ordinary site - now it's only visible to other Gab users.

On the other hand, the problem at YouTube was people using comments to exploit and victimize children. Pushing predators off to secret backchannels might make them harder to spot. "But capricorn, Gab isn't for sexual predators, it's for alt-righters! That's totally different! It's just the people that support, enable, elect, and lionize sexual predators."
posted by capricorn at 7:05 AM on March 5, 2019 [5 favorites]


Only a small number of channels featuring minors would be able to stay monetized
That's very different than disabling comments. Are they also demonetizing channels that feature minors? See also my mall example above.


I read the article and reporter Ryan Broderick never explains why he thinks that disabling comments equals demonetizing. Presumably the monetization comes from ads, which are more likely to run on channels that either have no comments or moderated comments, so I think he's got it backwards.

Anyway, "actively moderating" comments on Youtube simply means that you can't leave the door open for unmoderated comments, which really should have been the default setting from Day One. There's no rule that says you have to review your comment queue every day.
posted by Umami Dearest at 7:09 AM on March 5, 2019


The result is that while many YouTube channels would love to have moderation (and would in fact spend the money themselves for it!), they don't because of that structure. It is yet another example of how Youtube's problems are based in Youtube's actual design.

That's so odd because I watch live streams on Youtube and those have moderators (who do a great job!).
posted by Foci for Analysis at 7:14 AM on March 5, 2019


Pushing predators off to secret backchannels might make them harder to spot.

No, I think the world is also much better off if the children who look at kids' YouTube channels don't have to wade through a sea of filthy comments, and they're all hidden on some perverts' specially installed sidebar instead. Again, there are plenty of places for them to chat and conspire already; let's not let them normalize their sickness while exposing everyone else to it.
posted by Umami Dearest at 7:16 AM on March 5, 2019 [7 favorites]


might make them harder to spot.

It's also not as if having commenters who make perverted comments easier to spot has led to mass arrests or anything. If anything it's made it seem like it's acceptable behavior.
posted by Umami Dearest at 7:18 AM on March 5, 2019 [10 favorites]


That's so odd because I watch live streams on Youtube and those have moderators (who do a great job!).

That's because they're competing with Twitch there (and in fact rolled out their streaming platform after Twitch was bought by Amazon), and as such have to compete with them featurewise.

I read the article and reporter Ryan Broderick never explains why he thinks that disabling comments equals demonetizing. Presumably the monetization comes from ads, which are more likely to run on channels that either have no comments or moderated comments, so I think he's got it backwards.

For most channels, revenue is primarily driven through community engagement, which in turn is built on rapport in the comments, among other things.
posted by NoxAeternum at 7:21 AM on March 5, 2019 [2 favorites]


For most channels, revenue is primarily driven through community engagement, which in turn is built on rapport in the comments, among other things.

Adding on to this: pretty much every moderately-sized YouTuber also runs a Patreon, because Google's ad payouts are both too anemic and too capricious to really make a living on anyway. A number of slightly-larger channels have gone as far as to proactively disable their own monetization and rely entirely on Patreon/PayPal support precisely because they determined that dealing with YouTube's completely arbitrary rules about what is and is not monetisable was more trouble than it was worth for them.
posted by tobascodagama at 7:51 AM on March 5, 2019 [4 favorites]


"But capricorn, Gab isn't for sexual predators, it's for alt-righters! That's totally different! It's just the people that support, enable, elect, and lionize sexual predators."

The beauty of the Gab browser extension is that anybody with a sense of spurious grievance huge enough to make using it their only way to publish their droolings will also be witless enough to believe that their only audience is others like themselves, which will eventually turn that extension into a total bonanza for any law enforcement professional who finds themselves a terrorist suspect or two short of this week's quota.
posted by flabdablet at 7:57 AM on March 5, 2019 [7 favorites]


pushing dangerous users to platforms like gab should make it easier to find them, right? like, the FBI can create gab accounts.
posted by es_de_bah at 10:58 AM on March 5, 2019 [3 favorites]


Umami Dearest: "It's also not as if having commenters who make perverted comments easier to spot has led to mass arrests or anything. If anything it's made it seem like it's acceptable behavior."

You, flabdablet, and es_de_bah all made really good points and I agree with them, but one thing I'll note is that I think this exact thing is the actual problem: moderation of large platforms like Youtube and Twitter that is...naive in its defense of "free speech" at best, though personally I suspect it's more likely to have been infiltrated and controlled by the alt-right themselves.
posted by capricorn at 6:47 PM on March 5, 2019 [1 favorite]




This thread has been archived and is closed to new comments