insufficient context, scale, frequency or scope
February 9, 2015 12:25 AM

"Instead, most current systems, almost without fail, do the opposite. Moderators responsible for content and complaints, regardless of gender, are making decisions based not just on the information they are reviewing, but on the way in which the information flows – linear, acontextual and isolated from other incidents. They are reliant, despite their best efforts, on technical systems that provide insufficient context, scale, frequency or scope. In addition, they lack specific training in trauma (their own or users) and in understanding gender-based violence." -- "Silicon Valley sexism: why it matters that the internet is made by men, for men", by Soraya Chemaly, The New Statesman
posted by joseph conrad is fully awesome (25 comments total) 22 users marked this as a favorite
 
A lot of fantastic links in that article, thank you for sharing it here.
posted by iamkimiam at 1:41 AM on February 9, 2015 [2 favorites]


It is a great writeup with plenty of context.

My only quibble is with the last sentence: In the meantime, however, we have lost a generation of women’s innovative potential to a fully integrated, socially cultivated, self-perpetuating misogyny all suited up in progressive ingenuity.

Nah, we haven't lost it. We're still here, we've still contributed, I can see women's influence everywhere. Blogs were originally derided as "women's journals", until men started writing them. Captioned pics of cats (I wrote the first, hi - Malo the Cat and his blog, 2002) were similarly derided until men apparently invented them. Et cetera and so forth.

Indeed, we'll be able to contribute a heck of a lot more with recognition and support sans constant, wearying harassment and derision.
posted by fraula at 2:08 AM on February 9, 2015 [28 favorites]


This reminds me of a persistent troll we had in one of the forums I moderated. I maintained a thread in the Mod Area detailing his usual appearance, his known names, and his known IPs; we'd hit the point with him where he didn't need to do anything in order to get his account deleted - he'd blown through some ten or fifteen second chances by then. There was no automatic way to set up this sort of tracking for a persistent troll, though; I had to do everything manually, and I was the only one of the handful of mods who managed the forum who did the work.

More and more evidence seems to point to a subset of humanity (5-15%) who cause problems and do so in a sustained manner. It's odd how this isn't recognized on a structural level - indeed, on a structural level there seems to be a concerted effort to wipe people's histories and give them fifteenth and twentieth chances.
posted by Deoridhe at 2:27 AM on February 9, 2015 [11 favorites]


Persistent tracking of specific problem users is an interesting problem on platforms that support some level of anonymity. I don't think it's intractable--software can analyze things like word frequency patterns, IP addresses, and post times to give pretty good guesses as to who you've seen before. Probably the most expensive part would be setting up some kind of public database of this information so that you don't have to train your particular forum to recognize some particular troll. But that database could totally be pay-to-use. Perhaps the service would be bundled with your web hosting subscription.
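A rough sketch of the word-frequency part of that guesswork, purely illustrative (the tokenization, the sample posts, and the similarity threshold are all made up here; real stylometry would be far more careful):

```python
from collections import Counter
from math import sqrt

def word_freq(text):
    """Normalized word-frequency vector for a body of posts."""
    counts = Counter(text.lower().split())
    total = sum(counts.values())
    return {word: n / total for word, n in counts.items()}

def cosine_similarity(a, b):
    """Cosine similarity between two frequency vectors, 0.0 to 1.0."""
    dot = sum(a[w] * b.get(w, 0.0) for w in a)
    norm_a = sqrt(sum(v * v for v in a.values()))
    norm_b = sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

# Compare a new account's posting history against a known troll's.
known_troll = word_freq("u mad? lol get over it lol u mad bro")
new_account = word_freq("lol u mad? get over it bro lol")
unrelated   = word_freq("the moderation queue is backed up again today")

print(cosine_similarity(known_troll, new_account))  # high: likely the same voice
print(cosine_similarity(known_troll, unrelated))    # low: no shared vocabulary
```

In practice you'd combine a signal like this with IP ranges and posting-time patterns, since vocabulary alone is easy to imitate or vary.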

Earth to Bezos...
posted by LogicalDash at 4:17 AM on February 9, 2015


A public database of trolls, based on pretty good guesses? What could possibly go wrong?
posted by Segundus at 4:38 AM on February 9, 2015 [10 favorites]


More and more evidence seems to point to a subset of humanity (5-15%) who cause problems and do so in a sustained manner.

I think that it would be useful and informative to provide a link or reference to this evidence.
posted by polymodus at 4:38 AM on February 9, 2015


I think troll-identifying software would suffer from the same problems as terrorist-identifying software - flooded by false positives.
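The arithmetic behind that worry is the base-rate problem. With made-up numbers for illustration: even a detector that is right 95% of the time produces mostly false positives when trolls are rare.

```python
# Base-rate sketch: P(actually a troll | flagged), via Bayes' rule.
# All numbers here are illustrative, not measurements.
def p_troll_given_flag(prevalence, sensitivity, false_positive_rate):
    true_flags = prevalence * sensitivity            # trolls correctly flagged
    false_flags = (1 - prevalence) * false_positive_rate  # innocents wrongly flagged
    return true_flags / (true_flags + false_flags)

# If 1% of accounts are trolls, and the detector catches 95% of them
# while wrongly flagging 5% of everyone else:
print(p_troll_given_flag(0.01, 0.95, 0.05))  # ~0.16: most flagged users are innocent
```

Only when the problem population is large (the 5-15% figure mentioned above) does the flag become trustworthy on its own; at 5% prevalence the same detector's flags are a coin flip.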

But maybe some spaces would be ok with that... if it quacks like a duck it doesn't actually matter if it's the particular duck you've seen before or not.

But I'd go with this before I started messing around with word frequency. I think the installed fonts, especially, are what uniquely identify my system.
posted by Leon at 4:39 AM on February 9, 2015


It's not like there's any objective standard by which you can distinguish the troll from the mere asshat. There are going to be judgment calls, and the incipient false positives, no matter what you do. With a public database, there is at least a record of that, which anyone can audit, and to which you can submit bug reports and appeals.
posted by LogicalDash at 4:43 AM on February 9, 2015


One thousand white male voices cry out in pain, a plaintive chorus:
"but, meritocracy!"
posted by clvrmnky at 4:50 AM on February 9, 2015 [14 favorites]


But I'd go with this before I started messing around with word frequency.

It would be trivial to set up a browser that would fall into the cracks -- esp. with that page telling you what to change.

With a public database, there is at least a record of that, which anyone can audit, and to which you can submit bug reports and appeals.

And that'll be the first thing the trolls troll.

I would suggest looking at the efforts to fight spam for both inspiration and caution -- that's something with a very similar problem set.
posted by eriko at 5:06 AM on February 9, 2015 [3 favorites]


Really excellent analysis of how this has come to be.
Guidelines speak to a salient issue, namely, many companies are spending a great deal of time employing people, most frequently women, to work on community management and customer service, divorced – functionally, spatially, culturally, hierarchically – from systems engineers and senior team management. Moderation systems are overtaxed because of inadequately informed technology tools and business cultures.

Somebody needs to send this article to Guardian, it could help them in their efforts to build an online commu -
Woops, too late!
posted by glasseyes at 5:21 AM on February 9, 2015 [1 favorite]


And that'll be the first thing the trolls troll.

Um, what of it? GamerGators are already fucking with the established reporting tools. With a central repository of troll profiles, that hassle gets "socialized"--people get employed by the repository to separate out the legit reports from the bogus ones. It's a business, you know, like Spamhaus. And like adblockers, individual site admins get to decide what to do with all this information, based on fairly good knowledge of how it's been collected and verified.
posted by LogicalDash at 5:29 AM on February 9, 2015


It would be trivial to set up a browser that would fall into the cracks -- esp. with that page telling you what to change.

Yeah... if this was a problem I had, I wouldn't be telling people I was ID'ing them by font, and I certainly wouldn't be publishing their fingerprint. I'd just be very quietly blackholing them.
posted by Leon at 5:40 AM on February 9, 2015 [1 favorite]


It's not like there's any objective standard by which you can distinguish the troll from the mere asshat.

This falls into 'feature, not bug' territory for me - I'd be perfectly happy for a system to throw out the asshats with the trolls.

When it's the quacking that's the problem, if it quacks like a duck, who cares if it's a duck?
posted by Dysk at 5:41 AM on February 9, 2015 [7 favorites]


Facebook, oddly enough, is on the right general path here, allowing users to define who their friends and contacts are, while controlling what information appears to friends of friends. It's not perfect and probably never will be, but putting the power in the user's hands to define who can contact and see their info is a good first step.
posted by Brandon Blatcher at 5:50 AM on February 9, 2015


This falls into 'feature, not bug' territory for me - I'd be perfectly happy for a system to throw out the asshats with the trolls.

When it's the quacking that's the problem, if it quacks like a duck, who cares if it's a duck?


Periodically I have been perplexed at the patience and willingness to continue giving seemingly endless second chances by moderators here. There are occasionally users who are (to my biased eyes, at least) so thoroughly disruptive that waiting until there is clear evidence of bad intentions seems like just an enabling tool, and there are a few people who appear to have learned how to skirt that line with great precision.

But I'm not in the position of having to make those decisions, nor am I in the position the article describes, choosing how to resource and direct moderation versus systems, for example. It's definitely a situation where you can get poor outcomes from what are basically a well-intentioned set of much smaller decisions.
posted by Dip Flash at 5:51 AM on February 9, 2015 [3 favorites]


So many of these problems -- separation of content/community from dev/IT and management -- are propped up entirely by the silos that exist in companies: departments that report to separate P&Ls.

And this all exists because of the need to move up some imaginary ladder, thus there must be managers, and managers must be slotted, and if managers must be slotted, then so must the people they manage.

To this way of thinking, everything is separate, and cross-functionality and collaboration are solved by open work spaces.

It would be great -- I would be relieved -- if we could pin this all on sexism. At least that's a clearly identifiable dragon that we have some hope of slaying.

But it's a "people are not very smart, don't want to have to think too hard, and like the illusion of control and predictability" problem. Else, why would we see multiple tiny iterations on the same ideas and products roll out every damn day? There's a reason why "It's Uber for [product/problem]" is a meme.
posted by gsh at 6:02 AM on February 9, 2015 [5 favorites]


However, if these systemic biases are not addressed that gender gap will continue to grow,

We are barely at the point where the biases are acknowledged as systemic. We're getting there, but jesus, it's slow.
posted by rtha at 6:14 AM on February 9, 2015 [9 favorites]


Facebook, oddly enough, is on the right general path here, allowing users to define who their friends and contacts are, while controlling what information appears to friends of friends. It's not perfect and probably never will be, but putting the power in the user's hands to define who can contact and see their info is a good first step.

Yes, but it's a bare minimum start. Finding those privacy settings is generally a pain, figuring out what they mean even more so, and staying on top of them as they change every few months (generally toward less privacy) an order of magnitude worse. And Facebook has historically been horrible at dealing with complaints about abuse.
posted by jaguar at 7:12 AM on February 9, 2015 [1 favorite]


And I would like to be able to participate online in spaces that aren't limited to just pre-vetted friends. I love Facebook, but it's kind of the internet version of the "Women are the angels of the house while men control public discourse" concept right now.
posted by jaguar at 7:15 AM on February 9, 2015 [4 favorites]


(I meant to include a via last night but was too tired - I found the New Statesman link via the Twitter feed of Wagatwe Wanjuki - check her out: @wagatwe)
posted by joseph conrad is fully awesome at 8:00 AM on February 9, 2015


Anyone else remember what it was like to work at startups or new tech companies in the nineties? The ones I worked at employed a number of women, and most of them weren't programmers. We had tech writers, UX designers, software testers, artists, program managers. Now, many of those roles were also occupied by men, but the non-dev roles in software companies back then tended to be where you'd find the women. At one point we had a tech writing team of about 10 people, and half were women.

My experience has been that the full-time permanent non-dev roles in software have been greatly reduced. With the elimination of those roles went a lot of the influential women, I think.

I'm wondering how the requirement of a C.S. degree for just about any software job these days (even most testing jobs) operates as a functional gender selector. "Women Who Code" and other groups like that are great, but not all of us are cut out for, or want to be, full-time programmers. My core talent is communication: I took up testing when I couldn't find a good tech writing gig (after years of full employment as a tech writer, BTW).

I think that over the last 20 years, we've lost all kinds of diversity in software development, not just gender.
posted by Sheydem-tants at 9:01 AM on February 9, 2015 [9 favorites]


Sheydem-tants this 1990 Harvard Business Review article explains the rationale behind what you describe:

The Core Competence of the Corporation.

The unmoved movers eat this shit up. The end point is that the visionary genius on top is the only employee your company needs. Every other job can be contracted out.
posted by bukvich at 9:11 AM on February 9, 2015 [2 favorites]


Another issue I see (that I did not notice listed in the article but may have missed): Men are often more technically savvy than women. This also empowers them to stalk and harass online with less risk of getting caught. In some mostly female environments, I am some sort of computer god and people ooh and aah. In male dominated circles, I am more frequently the tech dummy.
posted by Michele in California at 12:12 PM on February 9, 2015


Men are often more technically savvy than women.

Not sure about this sweeping statement. Not sure at all.
posted by joseph conrad is fully awesome at 7:16 AM on February 10, 2015 [4 favorites]




This thread has been archived and is closed to new comments