Freedom of speech in the digital age - "Speech that disseminates ideas is more valuable than speech whose purpose is to intimidate others." [more inside]
"Comments on Times stories are moderated by a team of 14 people known as the community desk. Together, they review around 11,000 comments each day for the approximately 10 percent of Times articles that are open to reader comment. To help illustrate how our moderation works and how a new system might help, we have arranged for you to take a Times moderation test."
Steam's turned toxic, and Valve doesn't care. A tale of community vs. technological moderation. [more inside]
THE SECRET RULES OF THE INTERNET: The murky history of moderation, and how it’s shaping the future of free speech [Potentially NSFW - language & content]
The Guardian Investigates What Goes On "Below the Line" Comments allow readers to respond to an article instantly, asking questions, pointing out errors, giving new leads. At their best, comment threads are thoughtful, enlightening, funny: online communities where readers interact with journalists and others in ways that enrich the Guardian’s journalism. But at their worst, they are something else entirely. [more inside]
Going forward, the Guardian will no longer open comments on articles discussing sensitive topics such as "race, immigration, and Islam". Per Mary Hamilton, executive editor, the move is necessary in order to address "a change in mainstream public opinion and language that we do not wish to see reflected or supported on the site".
Say goodbye to online comments as you know them We have finally realized that the kind of person who devotes his day to arguing with strangers anonymously on the Internet is not necessarily representative of a large swath of public opinion or necessarily good at articulating anything. [more inside]
In an attempt to curb in-game harassment, online gaming communities have tried to develop a variety of workable solutions. One of the most prominent of these communities has been League of Legends (previously, previously), an extremely popular game that uses a virtual judiciary of gamers' peers, among other tactics, to identify problem players and mete out consequences. Two years ago, the tribunal drew public attention when it chose to expel a professional player from the game for a year (potentially ending his gaming career) for harassing other players. But is it working? Preliminary data indicates that the system is helping.
Writing for Agence France-Presse, Rob Lever details the struggles of major news organizations and online content aggregators to keep comment sections from devolving into ‘pie fights’ at best and hateful, abusive free-for-alls at worst. Some sites have simply eliminated comments rather than deal with the negativity. In 2014, The New York Times and The Washington Post announced that they would form a partnership, the Coral Project, aimed at creating a commenting system that “might diminish the ‘incentive to be the loudest voice’ and would foster communities of commenters[.]” [more inside]
The Rise of the Reducetarian (yup there's a word for that) Like tofurky? Thank a reducetarian. Part-time vegetarians are the ones driving vegan restaurants, vegetarian blogs and meat-free options at restaurants and grocery stores. [more inside]
After last month's vow to curb targeted harassment and make the site a safer platform for all users, the admins of Reddit began making good on that promise yesterday by banning five offensive subreddits deemed guilty of doxxing, brigading, and otherwise tormenting others, including /r/fatpeoplehate -- a militantly anti-HAES forum whose attacks had recently extended to the admins of popular image host Imgur. In reaction, the 150K subscribers of FPH and their sympathizers in other fringe subreddits went on a rampage, creating countless clones (all banned), filling the front page with hate posts, and dropping any veneer of free-speech activism to viciously attack Reddit CEO Ellen Pao personally. The dissenters advocate a mass exodus of the hate subs to Voat.co [obligatory_wonka.gif], a moderation-free clone of Reddit that has already crashed under the traffic. Ongoing coverage by the enlightened popcorn-munchers of SubredditDrama. [more inside]
"Spend enough time in any community or social circle – whether online or in person – and you’ll inevitably hear people complaining about the group being too insular, too much of a circle jerk or just plain unwilling to listen to people who disagree with them. You may especially notice this when forums have active moderation or websites and YouTube accounts turn off the ability to post comments. Now, on occasion, you will find a group or community that is unwelcoming to divergent voices… but more often than not, the problem isn’t that people are unwilling to hear an opposing opinion, but rather a case of “we don’t like assholes in the clubhouse.”" --How To Share Your Unpopular Opinion (Without Being An Asshole)
Reporting, Reviewing, and Responding to Harassment on Twitter [via mefi projects] For three weeks last November, Women, Action, and the Media (WAM!) accepted harassment reports that they escalated to Twitter, collecting data on the experience of harassment and the process of reporting it. A team of academics published a comprehensive report on what they found, with a focus on the people reporting and receiving harassment, the kinds of harassment that were reported, Twitter's response to harassment reports, the process of reviewing harassment reports, and challenges for reporting processes. [more inside]
So companies like Facebook and Twitter rely on an army of workers employed to soak up the worst of humanity in order to protect the rest of us. And there are legions of them—a vast, invisible pool of human labor. Hemanshu Nigam, the former chief security officer of MySpace who now runs online safety consultancy SSP Blue, estimates that the number of content moderators scrubbing the world’s social media sites, mobile apps, and cloud storage services runs to “well over 100,000”—that is, about twice the total head count of Google and nearly 14 times that of Facebook. The Laborers Who Keep Dick Pics and Beheadings Out of Your Facebook Feed
[W]e may not stop to think much about moderation as a form of labor that composes the Internet. But as the need to grant the audience “a voice” has become conventional wisdom, almost every media organization now needs this work done. [...] This complex tension—between voice and civility, eyeballs and deliberation—is one that future-of-news enthusiasts are good at waving away, but that comment moderators must bear. Within representative democracy, we can think of moderators’ bodies as being like that element of an electronic circuit that dissipates excess energy and allows it to function. They absorb the excess affects in a period of political dysfunction, and allow institutions to appear stable and unchallenged. Jason Wilson argues that, in the comments section, "the facade of liberal democracy only stays clean by putting young women [moderators] in hate’s way."
My elevator pitch for ending sobriety had been “moderate social drinking without ever blacking out again.”
Gawker: We want to elevate the discourse about frogs who sit like humans. No matter how you personally feel about the sites, you've got to admit that the Gawker network is big. So far in April 2012, the eight sites have attracted 1 million comments on 7,500 posts from 130,000 active commenters. But with comments described by Gawker's editor A.J. Daulerio as 'a tar pit of hell', they've decided to try to reinvent their commenting system again, including a system to allow commenters to sign in with temporary, anonymous, throwaway 'Burner' accounts. [more inside]
If your website is full of assholes, it's your fault. from Anil Dash. [more inside]
"Publishing anonymous, unvetted, and unreviewed commentary online is hugely divergent from the policies of [mainstream media] publications' print editions. It's a different kettle of fish, one that can stink for the publishers. Indeed, those publishers and their new-media managers are being reckless." [more inside]
"The amount of time it would take for the community to self-regulate -- I don't think it could sustain itself in the meantime. Anyway, I can't think of any successful online community where the nice, quiet, reasonable voices defeat the loud, angry ones on their own." —Ruling the global masses, one image at a time. The art of moderation as practiced by Heather Champ, Director of Community at Flickr. [more inside]
Warning to chatroom users after libel award for man labelled a Nazi. "Mr Keith-Smith told the Guardian that he took action after a debate about the Iraq war in 2003 on a Yahoo! message board with about 100 members turned ugly. "She was very pro-Bush. Initially, she called me lard brain and I wasn't particularly concerned about that. Then she called me a Nazi," he said."
Shut Up! No, *You* Shut Up At ETech, Clay Shirky covered patterns of community moderation during "Shut Up! No, *You* Shut Up." Notes were taken.
The name "Firebird" was chosen by Mozilla to rename their Phoenix product. However, Firebird is also the name of a popular and long-standing open-source database project -- and the Mozilla organization was clearly aware of this naming conflict before making their decision. Some feel that such an action, within the context of the open-source community, is unfair and constitutes bad etiquette, at the least. The discussion is ongoing, but LinuxWorld reports that the Mozilla organization has deleted recent message-board comments that criticized their decision.
Moderated. Posts to message boards at the BBC are editorially filtered within broadcasting guidelines. In this 'talking point' in particular, there is a sense of deep foreboding...
"Biggest flame war of all time: Danny Boy - sentimental Irish favorite, or stupid song decried by true Celts everywhere?" A link to a discussion in another forum about how one prevents the banal from driving out the profound in online public-participation forums. (Their conclusion: ruthless and efficient moderation.)