“I wouldn’t want my worst enemy to work there”
June 21, 2019 7:51 AM

“They just said me and [my colleague] were very meticulous and had a lot of promise to move up to the SME position,” Speagle said, referring to the subject matter experts who make $1 more per hour in exchange for answering moderators’ questions about Facebook policy. “They said Facebook is basically shoving all of their graphic violence content to us, that they didn’t want it anymore. So they had to move more people to cover it. And that’s all that we saw, every single day.” For the first time, three former Facebook moderators in North America are breaking their nondisclosure agreements and going on the record to discuss working conditions that include squalid, bed-bug-infested offices, theft and abuse by managers, and developing PTSD from watching graphic violence: BODIES IN SEATS (Content warning: This story contains descriptions of violent acts against people and animals, accounts of sexual harassment and post-traumatic stress disorder, and other potentially disturbing content.)
posted by The Whelk (74 comments total) 31 users marked this as a favorite
 
I skimmed it, up to where an incident started to disturb me. There's a sidebar, "Key Findings", that summarizes it, and yet still falls under the above content warning.
posted by ZeusHumms at 7:56 AM on June 21, 2019 [2 favorites]


I read the whole thing, and I'm sorry I did. Sorrier still for the people who have to watch all that horrific stuff, in terrible conditions, for employers who do not seem to care one bit about their physical or mental health. What a terrible, terrible thing to have to do each day.
posted by xingcat at 8:09 AM on June 21, 2019 [3 favorites]


How Facebook is using AI and Machine Learning is... not this. Somehow, not this. What was maddening was when moderators flagged content as violating FB policy but were overruled, so that the content could remain for law enforcement to act on - which never happened. People kept seeing the same content they had flagged, over and over, still on the site.

FB should be regulated. Their foray into crypto-currency, to me, speaks of anti-trust violations. They'd control the whole chain (marketing, selling, currency). What could go wrong?

Don't build machines you cannot control.
posted by zerobyproxy at 8:14 AM on June 21, 2019 [25 favorites]


If they start using “AI” for this it’ll turn super racist so fucking fast, guaranteed.
posted by Artw at 8:16 AM on June 21, 2019 [30 favorites]


More and more, this work is being offshored. Guardians of the Galaxy: the unacknowledged legislators of the online world -- Content moderators do a vital job—often for a pittance (Economist, June 15, 2019) [semi-paywalled -- if you can stop the page from loading completely, you can load the content, but not the login request]

"Whether in San Francisco or Manila, their task is fundamentally the same. These are the rubbish-pickers of the internet; to most of the world, they are all but invisible."

The article also refers to Adrian Chen of Wired, who wrote The Laborers Who Keep Dick Pics and Beheadings Out of Your Facebook Feed in 2014, at which time "the work is increasingly done in the Philippines."

[Cross-posting this from my comment in the Is it time to retire "outragefilter" as a deletion reason? thread]


If they start using “AI” for this it’ll turn super racist so fucking fast, guaranteed.

This is a known issue already. See Fairness and Bias Reduction in Machine Learning, a MeFi post from last year, with links to prior threads on the topic of bias being "baked" into software and AI, based on the developers' own biases.
posted by filthy light thief at 8:19 AM on June 21, 2019 [14 favorites]


Just when you think it can't get any worse.
FACEBOOK: "Hold my beer."
posted by Fizz at 8:33 AM on June 21, 2019 [8 favorites]


Trolls love to game the shit out of moderation. Replace bored humans running off a script to meet some dumb metrics with some kind of crappy algorithm or “machine learning” (basically randomly generated crappy algorithms, with the ones that seem to work picked out) and they’ll game it even worse. Actually investing in human moderation, taking it seriously, and giving moderators good tools to work with is the only thing that can work.

Or breaking up all the social media.
posted by Artw at 8:38 AM on June 21, 2019 [6 favorites]


I think what I'm most struck by is the use of contracting firms for this as a way to provide plausible deniability. Facebook could staff these positions themselves; these people are not doing part-time, short-term contract work, they are doing a permanent, long-term job that will continue to need to be done. But hiring a sleazy firm keeps Zuck's hands clean and keeps employees suffering PTSD out of the Slack channel.

If the devs had to share the lunchroom with content moderators on the daily, I bet there'd be a whole lot less "But we just can't doooo anythiiiiiing something something connecting people algorithmcakes".
posted by soren_lorensen at 8:40 AM on June 21, 2019 [46 favorites]


From the article: nine minutes per day of “wellness” time

This is the part that actually made me react emotionally. Like, the bosses are aware that this is tough material - and yet, they're so stingy with the time employees can spend that they can't even bother giving them a round number. Even 10 minutes? That's too much to ask for?
posted by LSK at 8:40 AM on June 21, 2019 [23 favorites]


Facebook could fix this yesterday by not using contractors. The fact that they don't reveals the game.
posted by kzin602 at 8:45 AM on June 21, 2019 [21 favorites]


kzin602, then they'd have to pay them a living wage and maybe even benefits. Who has that kind of money?

Oh, right, Facebook has that kind of money.
posted by zerolives at 8:47 AM on June 21, 2019 [32 favorites]


What if moderating social media just isn't possible?
posted by Nancy Lebovitz at 8:52 AM on June 21, 2019 [6 favorites]


Then it has to be destroyed.
posted by Artw at 8:58 AM on June 21, 2019 [63 favorites]


If it isn't possible, we'll find that out once some social media companies have made real, serious efforts to moderate, instead of dragging their feet and doing the bare minimum they think they can get away with.
posted by Kutsuwamushi at 8:59 AM on June 21, 2019 [6 favorites]


What if moderating social media just isn't possible?

I'm not a rocket surgeon or a machine learning expert but I just do not see how moderation of amateur-made multimedia is possible with AI. The article posits that these positions are contract because FB sees them as temporary until it can come up with an automated solution but... how? How does a computer watch a shaky phone video of a bunch of shitheads killing a lizard and determine that it's a bunch of shitheads killing a lizard and not, like, a clip from the Blair Witch Project?
posted by soren_lorensen at 9:00 AM on June 21, 2019 [9 favorites]


Nuke it from orbit. Only way to be sure.
posted by OnTheLastCastle at 9:04 AM on June 21, 2019 [6 favorites]


With each passing day Facebook strengthens the case for its own destruction.
posted by saladin at 9:05 AM on June 21, 2019 [12 favorites]


"Facebook could fix this yesterday by not using contractors"

I'm not sure what you mean by "this". Presumably, you mean the horrible working conditions and not the graphic content, in which case, yeah, they could, but that's also kind of the industry standard. I've worked in places like this, with the bodily fluids strewn about the restrooms and the unreasonable KPIs and the triple-digit turnover rate after 30 days and the sudden deep cleanings when higher-ups visit. It sounds like every other call center in America, except, instead of talking on the phone they're watching graphic videos. Facebook hiring direct employees won't fundamentally alter Cognizant's business model. There are thousands of other clients that they'll just contract with instead. The fact that Facebook is the client is leading to a lot of outrage because everyone hates Facebook, but Cognizant would be a shitty company to work for even if the client were the most angelic company on the planet. IMO the one who needs to fix things is Cognizant. (And, of course, the lawmakers in the jurisdictions in which Cognizant operates.)
posted by kevinbelt at 9:17 AM on June 21, 2019 [7 favorites]


i think it's not ethical to use facebook anymore. there is no one particular reason why it is not ethical to use facebook anymore, but instead a vast constellation of very good reasons not to use facebook anymore.

i think i'm going to not use facebook anymore.
posted by Reclusive Novelist Thomas Pynchon at 9:26 AM on June 21, 2019 [36 favorites]


i think it's not ethical to use facebook anymore.

I came to the same conclusion a year or two ago. They're not taking seriously the things I think are very serious and I'm not interested in allowing them to monetize my eyeballs and my content any more. I have far, far too much sympathy for my colleagues who are slogging through the morass of despicable content with no institutional support and vastly inadequate compensation.
posted by restless_nomad at 9:29 AM on June 21, 2019 [14 favorites]


What if moderating social media just isn't possible?

How many moderators per active user account does Facebook have? Quick numbers I Googled up for FB are 2,380,000,000 users / 15,000 moderators = 158,666 users per moderator.

How does that ratio compare to MetaFilter?
posted by pracowity at 9:29 AM on June 21, 2019 [1 favorite]
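
(For what it's worth, those numbers check out. A quick sanity check in Python, using the two rough figures quoted above:)

    # Rough figures quoted above: ~2.38 billion users, ~15,000 moderators.
    users = 2_380_000_000
    moderators = 15_000
    print(f"{users // moderators:,} users per moderator")
    # -> 158,666 users per moderator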


$9.99/month or equivalent to use Facebook. No ads. A smaller site with fewer users is the result. People who value its abilities can prioritize it like a utility. Kids can work a summer job to pay for it, or just go outside. OK, never mind ... this has all kinds of problems. But we really should use regulation so that for-profit web firms set up as business-to-consumer actually sell their product to consumers and not to advertisers. It would solve a lot of problems.
posted by freecellwizard at 9:31 AM on June 21, 2019


How does that ratio compare to MetaFilter?

It's hellish. I've never had to work a forum with more than about a 1.5k/1 active-user-to-mod ratio, and those were all vastly more structured, with clear goals, clear ethical boundaries, and enough power to actually solve problems and set limits. (You could argue that Metafilter is higher because we only have one person on shift at a time, but just having other people to share the decision-making load in general is significantly different from being The Place Where the Buck Stops.) Much beyond that and you absolutely have to move to a less-nuanced, more mechanistic way of management, and you can either go super rigid and ban people all the time, or super loose and let people get away with murder, or (most commonly) both, depending on exactly what management assumes will impact their revenue the least.
posted by restless_nomad at 9:35 AM on June 21, 2019 [19 favorites]


(Also I have done a little modding on Facebook for a temp job, and their tools are so vastly inadequate that the kind of user-based community moderation that Reddit manages is functionally impossible. I am not inclined to hold Reddit up as an example of positive community management strategy, but I will concede it's a whole lot better than Facebook.)
posted by restless_nomad at 9:37 AM on June 21, 2019 [20 favorites]


What corporations like to consider the AI revolution seems to inevitably be powered at the lowest level by humans. I'm not talking about the programmers and scientists, I'm talking about the massive number of grunt workers schlepping packages in warehouses, driving taxis, delivering food, putting together electronics in sweatshops, labeling data and moderating unimaginably disturbing content by hand.

We're constantly told that a robot revolution just round the corner is going to free the masses from mindless labor, but current trends should make one skeptical of that claim. Tech has created more mindless labor than it has removed.
posted by splitpeasoup at 9:49 AM on June 21, 2019 [6 favorites]


I swear I remember skipping to the end of this article, and the last line, a quote, was "They should shut Facebook down". (It's not that. Maybe I was projecting).
posted by ZeusHumms at 9:51 AM on June 21, 2019 [3 favorites]


For those on the fence, I’ve taken the half-measure of deleting my Facebook but keeping Instagram (which I use mostly for following art and design accounts as well as family and close friends). I deleted my Instagram app so that I’ll check it less often, and will keep open the option of deleting it when I want to fully disengage. I highly recommend it.

FB should be regulated. Their foray into crypto-currency, to me, speaks of anti-trust violations.

I think we are beyond that. The concept of a Facebook currency brings the company ever closer to playacting as a quasi-governmental body with its own constituency. I’m curious when the currency idea developed and whether it was before or after Zuck’s 50 state tour, when it became clear he didn’t have promise as a politician.
posted by sallybrown at 9:57 AM on June 21, 2019 [5 favorites]


OK - I'd like to quit FB. (Instagram is FB, so, no, not doing instagram)

What I want is an easy-to-use, semi-private microblogging website with federated logins (so I don't have to manage user accounts beyond setting permissions), photo storage (tagging, comments, etc.), a discussion forum, and - this is key - decent mobile support.

I've tried to build a bunch, and frankly, the tools suck and are fiddly. You can get there, kind of, with Wix - good mobile support, and federated logins. But the community stuff is really rudimentary.

WP can do it, but then it's a matter of using plug-ins that... well, may or may not have design and support issues - and anyway, it can be so damn fiddly getting the bits to work and look right.

I don't want to be a WebMaster - I want to post my photos and my thoughts so my friends can see them and comment on them, and I can control who sees what and when.

FB does all of this stuff, and it's easy as fuck. That's their hook. Solve that, and FB will wither.
posted by Pogo_Fuzzybutt at 10:02 AM on June 21, 2019 [4 favorites]


I swear I remember skipping to the end of this article, and the last line, a quote, was "They should shut Facebook down". (It's not that. Maybe I was projecting).

I thought that, too, from having read it a bit earlier this morning, and went back. It's there, but in the middle of the article:

“I really wanted to make a difference,” Speagle told me of his time working for Facebook. “I thought this would be the ultimate difference-making thing. Because it’s Facebook. But there’s no difference being made.”

I asked him what he thought needed to change.

“I think Facebook needs to shut down,” he said.
posted by The Great Big Mulp at 10:12 AM on June 21, 2019 [5 favorites]


The fact that Facebook is the client is leading to a lot of outrage because everyone hates Facebook, but Cognizant would be a shitty company to work for even if the client were the most angelic company on the planet. IMO the one who needs to fix things is Cognizant. (And, of course, the lawmakers in the jurisdictions in which Cognizant operates.)

Cognizant is shit because their clients allow them to be shit. If an "angelic" company were Cognizant's client, Cognizant would be good to work for, because said client would have the equivalent of a colonoscope up their ass.
posted by NoxAeternum at 10:13 AM on June 21, 2019 [8 favorites]


Tech has created more mindless labor than it has removed.
This is an under-appreciated point. Where the promise was automation, the reality became a new era of sweatshops. But these new sweatshops are mostly invisible. In the past, we didn't know much about how our shoes were made, but we knew someone somewhere was making them. Now, we're told it's all algorithms and that we shouldn't worry about it. So not only has tech created more mindless labor than it removed, the experience of that labor is functionally worse.
posted by elwoodwiles at 10:14 AM on June 21, 2019 [17 favorites]


saladin: With each passing day Facebook strengthens the case for its own destruction.

But they have groups now! You could talk with other people about your awesome garage or something? (Facebook has been pushing their groups hard, from my limited interaction with the site, as if to say "there's more here than your limited group of friends and family, who you now may be estranged from, due to their increasingly violent, racist and misogynist rants, which we fostered.")
posted by filthy light thief at 10:27 AM on June 21, 2019 [9 favorites]


It’s like those delivery robots that said they were “AI” but were actually being controlled by Colombians making $2 an hour.

AI or machine learning as it is currently used is literally the mechanical Turk - a chess-playing “robot” with a man inside - a way of disguising exploited labor. Articles like these, to reference Snowpiercer, the only movie ever made, are what happens when you lift the cover open and see the people who make the machine run.
posted by The Whelk at 10:28 AM on June 21, 2019 [4 favorites]


It's hellish. I've never had to work a forum with more than about a 1.5k/1 active-user-to-mod ratio

So Facebook needs to hire about... a million and a half more moderators to just start coming close?
posted by pracowity at 10:28 AM on June 21, 2019


So Facebook needs to hire about... a million and a half more moderators to just start coming close?

The way it's currently set up, yes. I have always been of the opinion that moderation doesn't scale any better than direct democracy, and for the same reasons.
posted by restless_nomad at 10:30 AM on June 21, 2019 [5 favorites]
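
(A sketch of where that "million and a half" figure comes from, using the numbers upthread; both the 1,500:1 ratio and the 15,000 existing moderators are rough:)

    users = 2_380_000_000
    users_per_mod = 1_500      # restless_nomad's ~1.5k-users-per-mod ceiling
    existing_mods = 15_000
    print(f"{users // users_per_mod - existing_mods:,} more moderators needed")
    # -> 1,571,666 more moderators needed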


Right now I trust Reddit more than fucking Facebook.
posted by Artw at 10:41 AM on June 21, 2019 [10 favorites]


I applied for a job as a content moderator at Facebook itself - apparently one of those that looks at the reports that people file and decides if it violates the rules or not.

They sent me a test, and I'm pretty sure I failed it by answering the questions honestly and in consultation with my morals and ethics.
posted by mephron at 10:57 AM on June 21, 2019 [5 favorites]


Honestly I've moved my online social activity entirely to small- or micro-scale Slacks filled with people who are vetted on one axis or another. I like the idea of massive cross-subculture/cross-community connections, but in practice I need access to the tools that actual communities use to manage themselves, such as effective methods of communicating disapproval and a way to communicate agreed-on standards of behavior.
posted by restless_nomad at 11:01 AM on June 21, 2019 [2 favorites]


I feel like it was 20 years ago that I first read an article like this - low-wage moderators traumatized by viewing horrible content under terrible working conditions.
posted by thelonius at 11:07 AM on June 21, 2019 [1 favorite]


Pogo_Fuzzybutt: "I want to post my photos and my thoughts so my friends can see them and comment on them, and I can control who sees what and when.

FB does all of this stuff...
"

Are you sure about that bolded part of your statement? Because I'm definitely not. I don't trust their user-facing content access limits at all. Not one bit.

In February or so I stopped checking Facebook. Cold turkey (not that I had been a heavy user in a long, long time, but still.) In the time since, I have (a) posted one photo of my kid on the last day of school, using my iPhone (direct upload, not logging in at all) and (b) logged in one time to check on a specific event for which details were only on Facebook. I only ever log in using Facebook Container.
posted by caution live frogs at 11:30 AM on June 21, 2019 [1 favorite]


I guess the part I don't get is why/how the physical working conditions are worse than most call center jobs, even the sub-sub-subcontracted ones. It's like somehow everyone associated in any way with Facebook (seemingly to include users, though to a lesser degree maybe) turns into a fucking ghoul who celebrates their abusive behavior rather than the former standard of at least making it look like they had a small amount of concern for their employees.

So yeah, I'm all in on burning the fuckers down. (Legally, of course.) I know not how, I know not why, but I can see that it, in a way that sites like Reddit somehow seem not to despite their own serious problems, is destroying the society I live in.
posted by wierdo at 12:00 PM on June 21, 2019 [3 favorites]


I suspect we have Facebook moderators working in our building. And they're looking for more people who are comfortable with being exposed to "highly sensitive and adult explicit content on a daily basis" in various languages. I'll have to start watching for dead eyes in the elevators.
posted by pracowity at 12:17 PM on June 21, 2019 [1 favorite]


I guess the part I don't get is why/how the physical working conditions are worse than most call center jobs, even the sub-sub-subcontracted ones.

Generally, a company's values are reflected in the details. This includes basic facility conditions. It's not usually a conscious act, to provide terrible conditions, it's just a downstream consequence of the generalized disdain some companies have for their workers. They lease the cheapest buildings and bid out the maintenance to the absolute cheapest bidder on the most minimal contract. Spin this forward a few quarters and the facilities are in literal shambles and nothing gets repaired.
posted by elwoodwiles at 12:22 PM on June 21, 2019 [8 favorites]


Couple points to chime in on:

* Machine learning does get used in this space, but it's relatively limited in what it can accomplish. The worst-of-the-worst videos are relatively small in number, so content-id type approaches work reasonably well for keeping things like beheading videos under control (a rough sketch of the hash-matching idea follows this comment). Porn detectors work reasonably well, though the choice of where to set the detection threshold is bound to annoy someone...

* And beyond that, you have The Rest of It, which you're bound to have to use humans for. The 'deeply damaging user content' problem is in no way limited to Facebook. The places that seem to work (mefi, some subreddits, etc) are smaller communities with some combination of personal investment and/or stronger moderation. But the Giant Hivemind approach to the social internet seems like a dead end at this point in history.

* Much of the web was built on ideals of free speech and removing barriers to speech. The internet as it stands today is, to me, a testament to human fragility. Capitalism mixed with gradient descent on dopamine is a dangerous cocktail.

[...]in a way that sites like Reddit somehow seem not to despite their own serious problems, is destroying the society I live in.

uh, really? Pizzagate, QAnon, gamergate, sad puppies, the donald... Twitter provides the highlights for the alt right and reddit provides the discussion board.
posted by kaibutsu at 12:28 PM on June 21, 2019
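
(A minimal sketch of the content-id idea in the first bullet above, assuming a bare-bones exact-hash design; the function and set names are made up for illustration. Real systems use perceptual fingerprints - PhotoDNA-style - rather than cryptographic hashes, precisely because a re-encode or a one-pixel edit defeats an exact match:)

    import hashlib

    # Hypothetical blocklist of fingerprints of already-banned videos.
    banned_fingerprints = set()

    def fingerprint(video_bytes):
        # Exact-match fingerprint: any re-encode, crop, or watermark
        # changes it, which is why production systems use perceptual
        # hashes instead of a plain cryptographic digest.
        return hashlib.sha256(video_bytes).hexdigest()

    def triage_upload(video_bytes):
        if fingerprint(video_bytes) in banned_fingerprints:
            return "auto-remove"             # known-bad, no human needed
        return "queue for human review"      # novel content

    def ban(video_bytes):
        banned_fingerprints.add(fingerprint(video_bytes))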


I don't think you can make people do this sort of work without fundamentally viewing them as interchangeable widgets without feelings, because if you let yourself think about them as humans, then you have to reckon with the effects on their psyches. And if they're interchangeable widgets without feelings, then who cares how nice the facilities are?

(Relatedly, one of the ways to survive this kind of job is to see yourself as an interchangeable widget without feelings. It's... not good for people in the long run.)
posted by restless_nomad at 12:31 PM on June 21, 2019 [8 favorites]


But they have groups now

Groups is my primary use case for FB these days. Several long-running listservs have moved to FB unlisted or private groups.
posted by COD at 12:33 PM on June 21, 2019 [1 favorite]


What I want is an easy-to-use, semi-private microblogging website with federated logins (so I don't have to manage user accounts beyond setting permissions), photo storage (tagging, comments, etc.), a discussion forum, and - this is key - decent mobile support.

I've tried to build a bunch, and frankly, the tools suck and are fiddly. You can get there, kind of, with Wix - good mobile support, and federated logins. But the community stuff is really rudimentary.

I think lots of us want this, but I can't see the business model. Dealing with federated login that's outside your company's control is constant coding work, because changes to other people's ID providers keep breaking your integration. You don't have personal data to sell, so you'll have to charge money, but I think that brings with it an expectation of quality customer service that is quite expensive.

I can't see how you would make a go of such a company without charging 10-15 USD a month minimum to host stuff, but I can't imagine that many people will pay.

Part of the issue with Facebook is that it has, like Wal-Mart and Uber, driven all its meaningful competition out of business and set the expectation that things on the Web don't cost money, when, really, there are a lot of pieces to being secure, ethical, and fair, all of which are major expenses.
posted by thegears at 12:45 PM on June 21, 2019


I don't mean to laud Reddit, but it seems less universally damaging, perhaps because its reach isn't so broad or because it's mostly pseudonymous so people better separate their online dickbaggery from their offline interactions.

Shit comes from the chans and Reddit, but the shit they generate seems to be destroying people and their relationships on/through Facebook, not elsewhere on the Internet.
posted by wierdo at 1:43 PM on June 21, 2019 [2 favorites]


Much of the web was built on ideals of free speech and removing barriers to speech. The internet as it stands today is, to me, a testament to human fragility. Capitalism mixed with gradient descent on dopamine is a dangerous cocktail.

At this point, I think that free speech absolutism has been thoroughly disproven. The problem is that, even with the evidence so clearly stacked against it, there are still people who are trying to sell it.
posted by NoxAeternum at 2:03 PM on June 21, 2019 [3 favorites]


I can't stomach reading past the iguana story, and I don't even consider myself particularly more pro-animal-rights than the next person. This just makes me so sad for the people who feel this is their only way to make a living. And it is yet another reason that I am baffled that FB is something people continue to use.

I deleted my profile two years ago. I still like following people on IG and enjoy the photography and mild humor. I really love the 5 or 6 subreddits I subscribe to, and the r/divorce subreddit saved me this past year when I went through it. But all I really got from FB was feeling angrier after using it. It was where my ex's online affair started (and many other people's, from what I can see). It was where I often found myself trying not to get into meaningless political arguments. It was where my worst parts - envy or pride, mostly - were laid bare.

What finally got me out of there was my upcoming birthday and the cavalcade of pointless birthday wishes from people I haven't spoken to in decades and certainly never will again. I have felt such peace from quitting FB, to the extent that even seeing the color scheme is sort of a trigger (and I hate that word). I know for some people it is a place of comfort and utility still, and this is only my experience.

I do not know what the endgame is for FB and could not even try to predict, but it is wholly different and more malevolent than any other social media platform in existence.
posted by docpops at 2:19 PM on June 21, 2019 [5 favorites]


Well, I mean, Reddit does have good mechanisms for moderating, but it's only as good as the moderators...for some time (and likely still) /r/NorthDakota was moderated by one/more/all white supremacists. If you slip up and let one through, they elevate their friends' rights, and then they're in charge, which is how it pretty much works in any tiered 'administrative' capacity (see also Sheriff's departments, school boards, etc.)
posted by AzraelBrown at 2:20 PM on June 21, 2019 [4 favorites]


This is not exactly a new trauma in the online age. I'll just repost my comment about doing this kind of work in 1999 and how it affected me.
posted by nikaspark at 2:23 PM on June 21, 2019 [12 favorites]


Nationalizing facebook, breaking it apart, and indicting every post-2016 member of their c-suite for the monster they built? Most folks aren't single-issue voters but wouldn't it be lovely to see that become a 2020 campaign focus…
posted by Haere at 2:39 PM on June 21, 2019 [3 favorites]


i think it's not ethical to use facebook anymore. ...
i think i'm going to not use facebook anymore.


I would love to use the FaceBook that everyone thinks is FaceBook.
This actual one, not so much.
I would burn it with a flamethrower, but people keep posting things that I don't know about otherwise, so I either have no social life, or I look stupid because I don't know what's going on.

So if I post this to my FaceBook page will it stand?
posted by BlueHorse at 3:12 PM on June 21, 2019 [1 favorite]


I didn’t know that iguanas could scream before I read this story. Thanks, things I never wanted to know.

I don’t understand why the moderation system isn’t more sophisticated. Aside from the system just being completely inadequate, I don’t understand why the underlying data of a banned video (or one that has already been processed and flagged, or allowed to stay because Law Enforcement) isn’t fingerprinted, so that if the same video comes up again in the queue, it’s recognized as already moderated and skipped. It should be. It doesn’t seem like this should be hard to do at all.

Am I vastly underestimating the difficulty of such a system? Even using current moderation setups (with human moderators, etc) it doesn’t seem like anybody should ever have to watch any of these videos more than once.
posted by verbminx at 4:06 PM on June 21, 2019 [1 favorite]
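
(verbminx's instinct is right for exact re-uploads: hash the file once, store the ruling, and never show it to a human again. The hard part is near-duplicates, since a re-encoded or trimmed copy hashes differently. Here's a sketch of the usual workaround - comparing perceptual hashes by bit distance - with illustrative names and an illustrative threshold, not Facebook's actual pipeline:)

    # Hypothetical cache of prior rulings, keyed by 64-bit perceptual hash.
    prior_rulings = {}   # {phash (int): "removed", "allowed", ...}

    def bit_distance(a, b):
        # Hamming distance: how many bits differ between the two hashes.
        return bin(a ^ b).count("1")

    def lookup(phash, max_distance=10):
        # A slightly edited copy won't hash identically, so match within
        # a bit-distance threshold instead of requiring exact equality.
        for known, ruling in prior_rulings.items():
            if bit_distance(phash, known) <= max_distance:
                return ruling    # reuse the earlier decision, skip the queue
        return None              # genuinely new; needs a human once

(Even this linear scan is a toy - at Facebook's scale the hashes would need an index built for distance queries - but the idea itself is not exotic, which suggests repeat viewings are a prioritization problem more than a technical one.)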


So Facebook needs to hire about... a million and a half more moderators to just start coming close?

The way it's currently set up, yes


So going with a million and a half full time employees at the median US full time salary that’s about ... 52 and a half million? Which is a few million over what Zuck has paid for in personal property buying.

So it’s possible to meet demand using the personal finances of the CEO.

But as the article points out, they had an uncontrollable glut of “fake organ harvesting from living, awake children.” A friend of mine was talking about her mother, who has basically lived in the same rural community all her life and is a sweet and pleasant person - and she gets shared some of the worst, vile, and utterly baffling posts from her neighbors and relations. She’s taught her mother how to use tineye and right wing watch and such, but her mom is just baffled: why would people show her these awful things, things that aren’t even true? It’s so completely outside her experience that she can’t imagine why anyone would do that.

I don’t know where you go from that and considering the demographics of Facebook, how is this not a form of elder abuse?
posted by The Whelk at 4:48 PM on June 21, 2019 [8 favorites]


I was also confused why videos have to stay up for law enforcement? Are they hoping criminals will say something incriminating?
posted by smelendez at 6:24 PM on June 21, 2019 [1 favorite]


Am I vastly underestimating the difficulty of such a system? Even using current moderation setups (with human moderators, etc) it doesn’t seem like anybody should ever have to watch any of these videos more than once.

The thing to remember with most tech companies is that the primary resource constraint is, for lack of a better term, giving a shit. By which I mean: there are plenty of resources for doing business-led things and business-y things, and there's usually some leeway granted to the employees they value (e.g. developers, architects) because it's a hot market for those roles, and they know that feeling like you're doing things which matter is one of the few currencies with much scarcity.

So when you see a tech company with a feature somewhere which looks a bit more humane than expected, and there isn't an immediate business value to it, that's someone who spent time/effort/capital to get that to happen. Because if there isn't a business case for it, it's not going to happen on its own.

But the problem is that this isn't a system which creates humane ends. Relying on notoriously hard-to-organize developers to push against the tide and make things better yields the few features that can be done individually, plus burnout on a broader scale. And when someone cares about one thing, odds are good they care about more things. But one developer can't simultaneously be the force for accessibility, and security, and not holding archaic views on gender identity, and...

So things which would take larger-scale work require larger-scale caring, in a system which discourages organizing around caring about things too much. So while there's not really a good technical reason why moderators should have to view the same video repeatedly, there's a prioritization/social-cost issue. (And the old saw about the tinker's children having no shoes applies even more so in the land of contractor work.)

As the business sees it, spending dev-time is a limited resource. There's only so many of those you can hire, and only so much they can work on in a quarter, and everything has to be measured against whatever could be making the most business value.
But contractor-time? That's linear-cost. You can just keep throwing bodies at the problem.
posted by CrystalDave at 6:36 PM on June 21, 2019 [6 favorites]


Hiring one company-payroll resident inspector for each contractor facility wouldn't actually cost that much money. The contractor can't pull off the dog and pony show every day. (And if they do, is that a win?) It works for oil rigs and it works for the NRC, although the oil rig "company man" might be there to pressure them to be more terrible.

You'd have to rotate them enough to not get comfortable and buddy-buddy with the locals and start taking kickbacks, though.
posted by ctmf at 8:21 PM on June 21, 2019


Christ, this is probably the most depraved thing Capitalism has yet wrought, even when you include child and prison labor.

You want to eat? You want a roof over your head? We’ll pay you just barely enough to get those things in the actual butthole of America ($15/hr is minimum wage where I live) if you’ll watch videos of puppies getting their skulls bashed in and children being raped because humans are awful and whatnot.

WE’LL PAY YOU MINIMUM FUCKING WAGE TO SELL YOUR SANITY AND CONSCIENCE SO WE CAN KEEP MAKING BILLIONS UNTIL YOU ARE A SHELL-SHOCKED HUSK OF A HUMAN

Then, fuck you.

It’s way past time to close my Facebook account. Thanks for the push, Whelk.
posted by Slarty Bartfast at 8:32 PM on June 21, 2019 [6 favorites]


More further reading: Sarah Roberts has been working on this subject for a while and has a book with Yale Press shipping soon.
posted by ahundredjarsofsky at 10:18 PM on June 21, 2019 [1 favorite]


So going with a million and a half full time employees at the median US full time salary that’s about ... 52 and a half million? Which is a few million over what Zuck has paid for in personal property buying.

It's around 50 billion, with B.
posted by mark k at 10:25 PM on June 21, 2019 [1 favorite]
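
(That unit slip is the whole correction. Assuming the roughly $35,000 salary that reproduces the "52 and a half" figure:)

    print(f"${1_500_000 * 35_000:,}")
    # -> $52,500,000,000 -- i.e. $52.5 billion, not million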


Hiring one company-payroll resident inspector for each contractor facility wouldn't actually cost that much money.

When I worked at a place that Google outsourced their support to, I was picked to go out to lunch with some of the Google reps when they visited. (You would think in the Seattle area they'd have someone, but actually, no.) Turns out that my lack of drinking alcohol made them apparently think they had to make up for me in the boozing department, so I heard some things. Like:

-- their contract didn't allow a Google person on site except with pre-arranged times, and all other contacts had to be by phone or video-chat.
-- the company originally wanted to pay us $12/hr, and Google pushed them to make it $14/hr
-- Google also pressed them into an incentive pay program (if our three scores - customer service questionnaire, quality, and 'adherence to schedule' - were above a certain point, we'd get more money depending on how many we hit)
-- the main reason they were taking people to lunch was to find out what the company was fucking up, because there was no fucking way the money they gave for the Employee Relaxation Area was being used properly. (Because 6 plastic tables from Staples, 36 folding chairs, 4 rented vending machines, and one microwave, along with a TV and some devices that Google sent for demos, didn't match up to what their plans were...)

Long story short, I got laid off the next month (a month after my supposedly mandatory yearly review, after 10 weeks of hitting all three incentive scores) because "my skill base wasn't wide enough" (aka I was taking calls like I should have been when they decided to hold some training classes on other things, and they didn't bother checking beforehand). About nine months later, they lost the contract, because they laid off so many of the people who were hitting the marks that the entire place pretty much wasn't hitting any of the Google-required targets, and there were also lots of questions about the $5 million for the Employee Relaxation Area that they were given.

And the company apparently no longer exists.

Anyway, the point I was getting to is that some of these places have contracts keeping anyone employed by the client company (the people who would keep an eye on things) off site. Deliberately, so they can get away with shit like this and only have to prep for the dog-and-pony show.
posted by mephron at 3:35 AM on June 22, 2019 [8 favorites]


Also came in to recommend Behind the Screen - it ships Monday. Roberts is a fantastic scholar and she's been doing work in this space for a really long time. She also advised the making of the film The Cleaners, a documentary that debuted at Sundance about content moderation; it's available streaming online.
posted by sockermom at 3:54 AM on June 22, 2019


CrystalDave, that was a depressingly accurate description of a lot of my work experience (at other places, not here.)
posted by restless_nomad at 6:10 AM on June 22, 2019 [1 favorite]


Anyway, the point I was getting to is that some of these places have contracts keeping anyone employed by the client company (the people who would keep an eye on things) off site. Deliberately, so they can get away with shit like this and only have to prep for the dog-and-pony show.

Then, in short, Google fucked up. They should have had it explicitly spelled out in the contract that there would be X number of Google employees on site, and that they could drop in whenever they liked. And if the firm balked, well...there's always someone else.
posted by NoxAeternum at 11:03 AM on June 24, 2019 [1 favorite]


Where are these horrific videos on Facebook? I suppose I am fortunate that I've never come across any content like this. Maybe the people in the content management teams get it before I see it? It sounds worse than anything else I've ever seen on the internet, to be honest. I'd be very surprised to see it on FB.
posted by theorique at 6:21 PM on June 24, 2019


You don't see this kind of content on Facebook because content moderators remove it before you see it. That is the very premise of the book being discussed. These kinds of questions are willfully obtuse, and fall under the rubric of reply guy 3 (gaslighter: "is it really that bad?") and reply guy 6 (sealioner: just asking questions!), and they definitely do not contribute anything substantive to the discussion.
posted by sockermom at 9:19 PM on June 24, 2019 [2 favorites]


Sorry, I thought the FPP was about a recently-released book mentioned upthread; it's about a topically related article. The point remains that this otherwise substantive discussion is not enriched by that kind of low-level trolling (for lack of a better term).

Anyhow, to bring the discussion back to other relevant work in this domain, here's a really wonderful article of Roberts's discussing the logic of opacity and secrecy in modding:
Digital detritus: 'Error' and the logic of opacity in social media content moderation. It's open access. She makes some really interesting arguments in this piece (some of which are directly relevant to moderation on Metafilter, in fact).
posted by sockermom at 9:33 PM on June 24, 2019


In addition to content moderators removing objectionable videos before you can see them (which doesn't always actually happen, per the article), one reason you wouldn't see them is if you're not friends with sociopaths on FB. Most of my friends are in their late 30s and use FB to share pictures of their kids; I almost never see anything objectionable. But that would probably be different if my friends were in their early 20s, or if they were interested in things like white nationalism, incels, radical Islam (remember the ISIS beheading videos?), "Jackass"-type stunts, etc.

With all due respect, that comment was pretty willfully naive. It's the online equivalent of "everything's nice in my quiet suburb, so it must be nice everywhere".
posted by kevinbelt at 3:49 AM on June 25, 2019 [3 favorites]


Google fucked up. They should have had it explicitly spelled out in the contract that there would be X number of Google employees on site, and that they could drop in whenever they liked.

I would be exceedingly skeptical of any large company asserting unlimited, no-announcement audit rights of a supplier. That is so open to abuse.

I'm not defending big tech. Quite the opposite: They should be accountable for conditions like this, but holding them accountable should not give them an excuse to wield even more power over other companies.
posted by mark k at 7:35 AM on June 25, 2019


I would be exceedingly skeptical of any large company asserting unlimited, no-announcement audit rights of a supplier. That is so open to abuse.

Given that the current system gives us collapsing sweatshops in Bangladesh killing hundreds and poorly paid workers having their souls broken by the worst the Internet has to offer, maybe we should try it the other way. Also, I've grown tired of using nebulous assertions of "abuse" as an argument against pushing for regulation and accountability.
posted by NoxAeternum at 7:48 AM on June 25, 2019 [2 favorites]


I don't mean un-announced visitation rights. I mean living there. Office in the building. That might be a bit harder to get in the contract, but it actually benefits both parties by making communications really easy.

Unless you want to be able to deny knowing how horrible the contractor is - in which case it's not very convenient.
posted by ctmf at 10:29 AM on June 25, 2019 [1 favorite]


I think people are missing the point of my comment. I'm not disbelieving that this content is there, more expressing that it must have an incredibly short half-life (due to people reporting it, and these contractors, presumably). I used to have a habit of trying to find the most repulsive content on the internet (in my 20s, naturally) and the verbal descriptions of some of this stuff exceeds even the worst of the things that I remember seeing. I have some oddball Facebook friends and would expect to see some highly offensive, edgy content now and then but ... nothing. Guess these contractors are doing their jobs well.
posted by theorique at 10:41 AM on July 4, 2019 [1 favorite]




This thread has been archived and is closed to new comments