"[I]f they don’t violate our policies, they’ll remain on our site."
June 4, 2019 7:37 PM   Subscribe

Carlos Maza, a Hispanic writer and video host for Vox, has been the target of a racist and homophobic harassment campaign led by right wing commentator Steven Crowder, with Crowder not only posting videos targeting Maza and calling him all sorts of slurs, but also having his followers harass Maza with calls and texts. Finally, Maza released a Twitter thread detailing the abuse he was receiving at the hands of Crowder, which got the attention of YouTube's staff.

But after a few days, YouTube responded with its decision: in YouTube's eyes, Crowder's videos were not harassment and would stay on the site, in seeming contradiction of YouTube's posted harassment policies.

Maza, unsurprisingly, responded with anger over the decision, and other leftist YouTube creators have also called out YouTube's decision. Maza's employer, Vox Media, has released a statement in support of him:
At Vox Media, we have embraced partnering with other organizations to bring our work to as broad an audience as possible. We believe in the advantages of a free and open web that allows people to find their voices online. We share this belief with YouTube, and have spent years creating incredible work on the platform and growing loyal, engaged audiences across the YouTube community.

But the platform and the system now appears to be broken in some ways that we can’t tolerate. By refusing to take a stand on hate speech, they allow the worst of their communities to hide behind cries of “free speech” and “fake news” all while increasingly targeting people with the most offensive and odious harassment. They encourage their fans to follow suit and we now see our reporters and creators consistently targeted by the worst abuse online, with no remedy or ability to protect themselves.

YouTube knows this is a problem – it’s developed anti-harassment policies to hold its creators accountable and remove them from the platform when they are in violation. Yet YouTube is not enforcing the policies and is not removing known and identified users who employ hate speech tactics. By tacitly looking the other way, it encourages this behavior and contributes to a society more divided and more radicalized.

YouTube must do better and must enforce their own policies and remove creators who promote hate.
(The title of the thread comes from YouTube's response to Maza, and their rationale for allowing Crowder to remain on YouTube.)
posted by NoxAeternum (158 comments total) 55 users marked this as a favorite
 
I've been following this on Twitter. I wasn't familiar with Maza until seeing this there. But this does not seem to be an edge case. It's hate speech/harassment cut and dried.
posted by DirtyOldTown at 8:10 PM on June 4, 2019 [15 favorites]


This is so frustrating. I've been following this -- I like Maza's work and I read a lot of Vox -- and I'm really upset by the (lack of) resolution.

My children aren't allowed to watch YouTube anymore, and this just confirms for me that that was the right decision. They can make all the "YouTube Kids" apps they want, but they're still not vetting the videos that go on it, and I strenuously object to my children being used to make money for a company that tolerates bigotry and harassment.
posted by Eyebrows McGee at 8:19 PM on June 4, 2019 [76 favorites]


In other news: On YouTube’s Digital Playground, an Open Gate for Pedophiles (Max Fisher and Amanda Taub, NYTimes)
Christiane C. didn’t think anything of it when her 10-year-old daughter and a friend uploaded a video of themselves playing in a backyard pool.

“The video is innocent, it’s not a big deal,” said Christiane, who lives in a Rio de Janeiro suburb.

A few days later, her daughter shared exciting news: The video had thousands of views. Before long, it had ticked up to 400,000 — a staggering number for a video of a child in a two-piece bathing suit with her friend.

“I saw the video again and I got scared by the number of views,” Christiane said.

She had reason to be.

YouTube’s automated recommendation system — which drives most of the platform’s billions of views by suggesting what users should watch next — had begun showing the video to users who watched other videos of prepubescent, partially clothed children, a team of researchers has found.

YouTube had curated the videos from across its archives, at times plucking out the otherwise innocuous home movies of unwitting families, the researchers say. In many cases, its algorithm referred users to the videos after they watched sexually themed content.

The result was a catalog of videos that experts say sexualizes children.
posted by Johnny Wallflower at 8:28 PM on June 4, 2019 [49 favorites]


Yesterday I clicked on the 50th anniversary of Pride google doodle, which led me to the google search on "Celebrating Pride", and several of the top results were hate-bait from Fox News and various right wing media outlets. Barf. So, of course, I went and searched "Celebrating Pride" on youtube and found that a Ben Shapiro hate-rant ranks higher than an Ellen DeGeneres video. Double barf.

In addition to maximizing clicks and eyeballs with conspiracy theories, hate speech, and outrage peddling, there's this NY Times story about the Youtube algorithm working as designed: On YouTube’s Digital Playground, an Open Gate for Pedophiles.

Summed up by Zeynep Tufekci on twitter|threadreader: Congratulations to all executives and software developers at @YouTube. Your recommender algorithm has figured out how to algorithmically curate sexualized videos of children and to progressively recommend them to people who watched other erotic content.
posted by peeedro at 8:28 PM on June 4, 2019 [46 favorites]


If this doesn't violate your policies, then your policies are wrong.
posted by axiom at 8:41 PM on June 4, 2019 [94 favorites]


The YouTube brand is certainly taking a beating. Nothing quite like having the enforcement of your policies on harassment questioned and the conversation immediately turning to a recommendation algorithm that facilitates pedophilia.

See also: The nightmare videos of childrens' YouTube — and what's wrong with the internet today | James Bridle
"If you have small children, keep them the hell away from YouTube"
posted by krisjohn at 8:43 PM on June 4, 2019 [13 favorites]


The way that @shaun_jen managed to trick YT into paying attention is amazing. Also wildly fucked up that they need to be tricked into responding to this stuff.
posted by chappell, ambrose at 8:47 PM on June 4, 2019 [24 favorites]


gotta phrase it in a way that suggests ad revenue is at risk
posted by ryanrs at 8:52 PM on June 4, 2019 [17 favorites]


Love that youtube is doing this while having a rainbow logo. Real solidarity there.
posted by dinty_moore at 8:55 PM on June 4, 2019 [25 favorites]


gotta phrase it in a way that suggests ad revenue is at risk

Who are the biggest advertisers on YouTube? Because that seems to be the only pain point they might listen to.
posted by nubs at 8:57 PM on June 4, 2019 [4 favorites]


How YouTube Became a Breeding Ground for a Diabolical Lizard Cult

As much as I like and enjoy Leftube, it is a fraction of a fraction compared to the much more popular shit like this or the insane alt-right videos they recommend to you to keep you engaged.
posted by The Whelk at 8:59 PM on June 4, 2019 [12 favorites]


By far the most frustrating thing about this is that Youtube's "the videos as posted don’t violate our policies" is so obviously a lie. The videos are so clearly against their policies that @TeamYoutube couldn't even avoid using the same language that appears on their policy pages in the tweets. This decision is Youtube clearly and explicitly telling right-wing hatemongers that the service's rules and policies will not be enforced against them. This will absolutely lead to an intensification of abuse, and there is no reasonable way to argue that that wasn't the intent here.
posted by IAmUnaware at 9:03 PM on June 4, 2019 [67 favorites]


I don't think it's parents using it as a babysitter that's the problem. The problem is it's YouTube. It's everywhere. I have six phones/tablets/TVs/desktops/etc. in my room right now that can access it. You're not going to stop kids from getting on YouTube; if not at your house, then at their friends' houses and at school and so on. Kids are gonna use it. This is a problem that YouTube needs to deal with and fix.
posted by downtohisturtles at 9:17 PM on June 4, 2019 [44 favorites]


My dealings with YouTube are nearly entirely things I've searched for or that I've found posted as part of an article someplace. I have autoplay turned off. I might peruse the "related videos" or whatever the sidebar is called, but it's rare that I click on anything in it to watch.

I guess I use YouTube in a very different way from everyone else because I have no problems with it. It's just a storage place for videos I find through other online perusing or things I've deliberately searched for.
posted by hippybear at 9:24 PM on June 4, 2019 [7 favorites]


But come on now, if a parent is using YouTube as a cheap ass baby sitter, what the hell were they expecting?

People that can’t afford a real babysitter deserve that their kids see inappropriate content because of entirely voluntary and cynical choices made by one of the world’s richest corporations, got it.
posted by chappell, ambrose at 9:29 PM on June 4, 2019 [130 favorites]


That said, about my own personal use of YouTube, I'm not a public person doing things for my job that involve posting to YouTube and having to interact with it as a group of random international people at every minute of every hour and who would use the platform back against me in horrible ways.

I'm outraged but I'm not surprised.
posted by hippybear at 9:29 PM on June 4, 2019 [2 favorites]


But come on now, if a parent is using YouTube as a cheap ass baby sitter, what the hell were they expecting?

I mean, even if you're sitting right there, it has never been sane/reasonable to expect that parents will pre-consume every bit of media their small children are exposed to, first. TV shows aimed at young kids might have been of questionable value, but the places that made them were motivated to not do things parents would object to, because there would be consequences for that. The other mentioned issues in the FPP and in the comments here prove why Youtube is different on this score: Even if you're right there when your kid is watching it, if you go to look for kid-friendly Youtube stuff, Youtube is going to be actively trying to push you into unacceptable videos, and there's nothing you can do about it.

If Youtube actually dealt with issues when they came up when parents observed them and were actively fighting to fix these issues with the platform, I might say that the problem is that parents need to pay more attention. But, like, I'm an adult who is more than capable of monitoring my own Youtube usage and my recommendations are still hell and require constant pruning. The Algorithm is sure that my consumption of any gaming content means that I definitely must want alt-right videos next. If you get to the point where you have something the size of Youtube and what you're really saying is "this is unsafe for children, even with supervision", then that is as much Youtube's problem as anybody's, for continuing to promote children's content when use by children is unsafe.
posted by Sequence at 9:30 PM on June 4, 2019 [39 favorites]


There is more video uploaded to YouTube every single day than a parent could watch and vet in a year. The burden should not be on parents to do the gatekeeping, it is beyond their capacity in any rational world.

But once again, SV companies have decided that there are the policies they have, and then there are the policies that exist for important demographics. Why they have constructed their business models around enabling hatred and engagement for the sake of engagement? That I cannot say.
posted by fifteen schnitzengruben is my limit at 9:35 PM on June 4, 2019 [6 favorites]


Mod note: The kids vs. youtube content thing, while totally a thing, isn't really the fundamental thing this post is about; let's rerail from that maybe.
posted by cortex (staff) at 9:39 PM on June 4, 2019 [17 favorites]


Why they have constructed their business models around enabling hatred and engagement for the sake of engagement? That I cannot say.

Oh, I can. Google's early motto was "Don't Be Evil."

The problem is, the household spirits, they can't hear negating words. You say "Don't Be Evil", all they hear is "Be Evil". Wishing for something not to happen is the same as wishing for it to happen. "Don't Be Injured" is never the prayer. "Please Be Unharmed" is the prayer. And so on.

Don't Be Evil.

Wrong way to phrase that, and this is the result.
posted by hippybear at 9:40 PM on June 4, 2019 [30 favorites]


btw if you are a parent and want episodes of old favorites like mr. rogers and sesame street, or newer kids shows like peppa pig, just memail me and I will hook you up
posted by ryanrs at 10:02 PM on June 4, 2019 [2 favorites]


Mod note: Couple things removed. I imagine folks are just kind of thinking out loud about how this could get some more positive traction, but let's try hard not to wander into "well why didn't they deal with corporate-enabled systemic abuse the way I think they ought to" territory on this; that's got kind of an unsavory feel to it that I'd just rather we skip entirely.
posted by cortex (staff) at 10:03 PM on June 4, 2019 [8 favorites]


I guess I use YouTube in a very different way from everyone else because I have no problems with it. It's just a storage place for videos I find through other online perusing or things I've deliberately searched for.

I too have found much happy weirdness on YouTube, and love its little corners and the way it lets me pretend I've cut my TV cord when I haven't really. There's so much more on the site than this abusive trash.

I was going to make a longer comment but there isn't much to say. This is just very disheartening if also predictable, and I hope they get lots of deserved grief for it.
posted by Going To Maine at 10:12 PM on June 4, 2019 [3 favorites]


My understanding is that thanks to constant changes to YouTube's algorithm, the niche content creators on YouTube see less and less engagement and earn less and less ad revenue.

Meanwhile right wing assholes (and other kinds of trolls) aren't just tolerated on YouTube, they are promoted because their engagement comes packaged and ready-made. High engagement = more ad money for the content creators and more money for YouTube.
posted by muddgirl at 10:17 PM on June 4, 2019 [13 favorites]


More money for Google / Alphabet. Don't ever stop rounding up when it comes to YouTube money.
posted by hippybear at 10:21 PM on June 4, 2019 [6 favorites]


That is to say, this is not at all the first case I can think of where YouTube refuses to enforce its own content policies against creators with large fan bases. Channels like 5-Minute Crafts repackage the same video clips over and over, which is against YouTube's terms, but they have 55 million subscribers, so YouTube doesn't care. Of course this example is more innocuous, but allowing channels to basically game the algorithm punishes creators who make actual good original content.
posted by muddgirl at 10:21 PM on June 4, 2019 [6 favorites]


I think this shows that Trump's blatant attempt to intimidate YouTube and the other social media is succeeding.
posted by jamjam at 11:04 PM on June 4, 2019 [4 favorites]


@ShadowTodd

"YouTube wouldn't have to worry about "censoring" people if it wasn't trying to be the one and only option for video. Its status as a monopoly burdens and impedes any effort to moderate its content, and regulators should probably rid YouTube of that burden sooner rather than later"
posted by The Whelk at 11:14 PM on June 4, 2019 [53 favorites]


Probably we should be clear about this, right? This is YouTube — and google — explicitly supporting the Nazis. It’s as good as a fucking press release.
posted by schadenfrau at 11:16 PM on June 4, 2019 [46 favorites]


By far the most frustrating thing about this is that Youtube's "the videos as posted don’t violate our policies" is so obviously a lie. The videos are so clearly against their policies that @TeamYoutube couldn't even avoid using the same language that appears on their policy pages in the tweets.

There's a little more of that in the WaPo coverage of the story:
YouTube spokeswoman Andrea Faville said that Crowder has not ever instructed his followers to harass Maza.
But the Youtube policy forbids "content that incites others to harass or threaten individuals", which is a much broader set of behaviors than literally instructing his followers to harass Maza.
posted by peeedro at 11:49 PM on June 4, 2019 [13 favorites]


YouTube knows this is a problem

And that's the crux right there for me. They, like Facebook, know that they have a huge problem with fascist content, they can't not know this. And even when it's brought up, they dodge, deny, and equivocate. It's appalling.

There are no "edge cases". If something is an edge case it's removed. That, I can guaran-goddamn-tee, is what they do with staff communication internally. It's what every sane risk management approach would have you do. Imagine if their employees were posting this shit on the company intranet, about company staff? It would be gone so fast you'd see a hazy outline of leftover atoms.

But the public doesn't deserve the protections that YouTube unthinkingly offers its staff. It kills me.

The only thing I can imagine will help is that they get the shit sued out of them. They've already facilitated murders in Sri Lanka and other places, it hasn't been enough.
posted by smoke at 11:52 PM on June 4, 2019 [20 favorites]


Does anyone offer a good (moderated, kid-safe) alternative to YouTube?
posted by pracowity at 12:18 AM on June 5, 2019


No. This is the problem.
posted by Merus at 12:19 AM on June 5, 2019 [9 favorites]


gotta phrase it in a way that suggests ad revenue is at risk

Who are the biggest advertisers on YouTube? Because that seems to be the only pain point they might listen to.


Or boycott the thing. I don't mean just as viewers; creators need to remove their content and be vocal about it.

It makes sense too, not just as a statement. Any video you post is likely to appear on a page with recommendations for hate speech or pedophilia or whatever YouTube's algorithm of the day finds profitable. Which means posting on YouTube can help draw viewers to such content. Which is an unacceptable sort of complicity to have foisted on us.
posted by trig at 12:57 AM on June 5, 2019 [2 favorites]


Yeah, I agree with a boycott - maybe Vimeo could carve out a niche for itself as a non-fascist video hosting company
posted by The River Ivel at 2:23 AM on June 5, 2019 [2 favorites]


maybe Vimeo could carve out a niche for itself as a non-fascist video hosting company

They'd have to get rid of all the explicitly fascist content they're currently hosting, then.

Vimeo are not any better than YouTube on this front. There's only less of it because they're smaller.
posted by Dysk at 3:26 AM on June 5, 2019 [22 favorites]


Does anyone offer a good (moderated, kid-safe) alternative to YouTube?

If you're in the US, the PBS Kids video app.
posted by soren_lorensen at 3:57 AM on June 5, 2019 [12 favorites]


Does anyone offer a good (moderated, kid-safe) alternative to YouTube?

The PBS Kids app. I'm removing the YouTube app from my devices.
posted by Karaage at 3:58 AM on June 5, 2019 [5 favorites]


Implicit in this whole controversy is that the right is much better organized than those of us who don't hate immigrants or gay people.

Google/Youtube likely won't take down the Crowder video because it is afraid of a backlash from reactionaries and their mass media echo chambers.

Google/Youtube also apparently has determined that right wing videos make them more money. Based on my following of the right for 25+ years, I imagine this is because the right cares about politics far more than we do.

This is anecdotal, I know, but in my experience many, many conservatives are happy to listen to Rush Limbaugh and Sean Hannity all day at work, and then go home and watch Fox, and play Jordan Peterson and Ben Shapiro videos all night.

I've never seen similar engagement from such broad swathes of the left, and this is (I reckon) one of the reasons all of us constantly get recommendations for right wing videos.

So the answer for this is not only to break up Google, but also to get organized and wield power over these radicalizing forces.
posted by NBelarski at 4:09 AM on June 5, 2019 [17 favorites]


Google/Youtube also apparently has determined that right wing videos make them more money.

Part of the reason for this might also be commercial entities being less keen on advertising alongside anticapitalist content than, well, pretty much anything else. Right wing media is an inherently better product than left wing media in the age of advertising funding.
posted by Dysk at 4:16 AM on June 5, 2019 [10 favorites]


They're also incredibly well funded and organized.
I've noticed over the past year an increase in anti-woman, anti-equality, pro-white PragerU ads which specifically target people like me, which apparently means "dudes who watch let's play videos". Given the age range, and how some of the worst gamergate-type goons come out of that demographic, it's very clear what their intention is as part of the larger cultural movement. The videos seem well produced and have the "I'm logical, I'm just asking questions and saying what needs to be said" veneer that falls apart under any sort of critical examination, but they're looking to convert, not debate.

That also leads to one of my biggest irritations which is that I can't tell YouTube to stop showing me video ads from sources I find objectionable.
posted by Karaage at 5:02 AM on June 5, 2019 [15 favorites]


I just wish there was a way to turn the recommendation algorithm off. Show me all the ads you want, monetize my eyeballs all the livelong day, but give me the option to turn off the recommends the same as how you can turn off autoplay. (I know I know, the algorithm is designed to keep people watching in order to serve more ads.)

I also recently discovered that the YouTube Kids app is (intentionally?) hobbled. We used the (already very buried in the settings) Whitelist function where we whitelisted certain videos and channels and that's all my kid could get to. Except that when you Whitelist a channel, you don't get all that channel's videos, just a selection. So my kid (who uses YouTube basically as a cake decorating tutorial delivery device and never ever deviates from that mission) kept asking where all those other videos were that he'd been happily watching prior to us installing YT Kids and uninstalling the regular app. The channels were whitelisted but the videos were absent.
posted by soren_lorensen at 5:28 AM on June 5, 2019 [10 favorites]


YouTube standing outside of a smoldering building: "We have all kinds of fire suppression equipment!"


YouTube standing outside of a burning building: "We are aware of how the burning building makes you feel but remember, we have fire suppression equipment!"


YouTube standing outside the pile of smoldering rubble: "This is fine. *Also: We have fire suppression technology.*"
posted by zerobyproxy at 7:02 AM on June 5, 2019 [10 favorites]


Implicit in this whole controversy is that the right is much better organized than those of us who don't hate immigrants or gay people.

They are better funded, by a bunch of extremely creepy billionaires. They have the billionaires on board. They have the money.

We have, for the moment, more people. The only way to counteract the money is with political -- as in, of the people -- power.

But I don't think that balance will last forever if we continue to cede the modern means of propaganda to the right/nazis, which they are using to actively recruit as many angry young men as possible. So we need legislative hearings about social media and youtube and the rest. If we can't do it at the federal level, we need to do it at the state level. New York and California should be sufficient to seriously fuck with them, but we should take them on wherever we can. (And considering the corruption in NY, might be more realistic to start it in other states with less fuckery.)

We have to do it in the legislatures. That's the only conceivable place where we might have an advantage.

And we have to do it soon.
posted by schadenfrau at 7:06 AM on June 5, 2019 [20 favorites]


I just wish there was a way to turn the recommendation algorithm off.

I've approximated this by watching videos almost exclusively in incognito windows. My signed-in page is more or less static at this point -- almost all tech stuff, scifi stuff, car stuff.

I can tell that google is cheating and paying attention to what I watch in incognito windows. I sometimes watch forgotten weapons and c&rsenal, because what could be more interesting than mechanical widgetry combined with the history of bureaucratic decision-making? And McCollum's visible pain anytime he has to simplify something "too much" is charming. But anyway the usual screaming-whackjob gun shit appears on my feed occasionally so CAUGHT YOU GOOGLE! but at a slow enough rate that it's easy enough to keep under control.
posted by GCU Sweet and Full of Grace at 7:11 AM on June 5, 2019 [4 favorites]


It is unfortunate that a company is so big that it can get away with putting our rainbow on its logo, making money off of us, while also profiting handsomely from people who inspire institutional violence towards us. I hope the antitrust regulators finally catch up with Alphabet/Google.
posted by They sucked his brains out! at 7:40 AM on June 5, 2019 [10 favorites]


I appreciated this insight from Julia Carrie Wong yesterday, breaking down the way YouTube is differently bad than Facebook, Insta, etc... since, despite their claims to the contrary, they are actually an employer and pay hatemongers to produce content.
posted by latkes at 7:58 AM on June 5, 2019 [15 favorites]


despite their claims to the contrary, they are actually an employer and pay hatemongers to produce content.

Yup, according to this rough payment estimator YouTube is probably paying Crowder a six figure salary.
posted by parallellines at 8:14 AM on June 5, 2019 [7 favorites]


In the Visa thread there were some comments skeptical about tech companies aligning themselves with an increasingly fascist government, so it's "nice" to have proof so quickly afterwards.
posted by Memo at 8:14 AM on June 5, 2019 [4 favorites]


But come on now, if a parent is using YouTube as a cheap ass baby sitter, what the hell were they expecting?

My town's school system handed out ChromeBooks to all of the kids in 6th grade and up, and they have carts of them rolling around the elementary schools. The network admins filter what the laptops can reach (e.g., my home Calibre collection is off-limits), but literally all of YouTube is wide open because the electronic textbooks (Pearson) have content there, and they also use the Google Docs platform for all schoolwork.

As a result, the kids pop open their ChromeBook when class starts, quietly sneak in an earbud, and navigate to some dodgy web site to watch movies all damn day. It's stupid and wasteful and dumb, plus also at home they can say they're "doing schoolwork" and watch hate videos or Twitch streams or airhead YouTubers prattle all night.

I never got much use out of YouTube, but they're really making themselves both unnecessary and undesirable at this point.
posted by wenestvedt at 8:20 AM on June 5, 2019 [10 favorites]


This is anecdotal, I know, but in my experience many, many conservatives are happy to listen to Rush Limbaugh and Sean Hannity all day at work, and then go home and watch Fox, and play Jordan Peterson and Ben Shapiro videos all night.

That doesn't indicate a greater interest in politics, it indicates a greater interest in bigoted hate. The fact that it passes for political engagement is an enormous and systemic problem.
posted by poffin boffin at 8:34 AM on June 5, 2019 [35 favorites]



As a result, the kids pop open their ChromeBook when class starts, quietly sneak in an earbud, and navigate to some dodgy web site to watch movies all damn day. It's stupid and wasteful and dumb, plus also at home they can say they're "doing schoolwork" and watch hate videos or Twitch streams or airhead YouTubers prattle all night.

This is a dark vision of The Children.
posted by Going To Maine at 8:35 AM on June 5, 2019


It is unfortunate that a company is so big that it can get away with putting our rainbow on its logo, making money off of us, while also profiting handsomely from people who inspire institutional violence towards us. I hope the antitrust regulators finally catch up with Alphabet/Google.

It isn't just a result of bigness. Tumblr is doing all sorts of pride related stuff now, just months after decimating their LGBTQ community.
posted by Ray Walston, Luck Dragon at 8:40 AM on June 5, 2019 [13 favorites]


This plugin for firefox, Unrecommender, removes all youtube recommended videos except for those in the same channel. I use it and it seems to work.
posted by maxwelton at 8:42 AM on June 5, 2019 [21 favorites]


The way that Google and YouTube have turned out, I now think "Don't Be Evil" was completely ironic from day one, a cynical motto written by a couple of fascists as a bro-joke. Is there a single action Google/Alphabet has taken as a company which can actually be said to reflect that philosophy?
posted by maxwelton at 8:45 AM on June 5, 2019 [11 favorites]


This is also why we must always, always be cynical about "policies" which are worse than nothing if not enforced. Getting a big company to spew out some pretty words is not a victory unless there are actual consequences--people losing jobs, "creators" being kicked off the platform, apologies or reparations issued.
posted by maxwelton at 9:00 AM on June 5, 2019 [5 favorites]


Gizmodo has some further follow up:
So in other words, YouTube’s stance is apparently that it is okay for a host with millions of subscribers (3,846,360 as of early Wednesday a.m.) to repeatedly engage in racist, homophobic bullying so long as it’s couched as part of some kind of ambiguously defined ‘debate.’ This is not only a fundamental misunderstanding of the intent of hate speech, which is not to “debate” or “respond” but to dehumanize, but is almost indistinguishable from bad-faith rhetorical arguments offered up by people spreading hate speech. In fact, Maza said that in 2018 he received hundreds of anonymous texts saying “debate steven crowder.”
This is why the "debate" argument and euphemistic language is so damn harmful - because they are the tools that bigots use to obfuscate their hate.
posted by NoxAeternum at 9:07 AM on June 5, 2019 [33 favorites]


This is why the "debate" argument and euphemistic language is so damn harmful - because they are the tools that bigots use to obfuscate their hate.

Not only that, but also literally begging the question, by presuming that their hate belongs on an equal footing with those they'd dehumanize, and that others should have to ask nicely and convince them with argument not to be homophobic, racist, or what-have-you.

There's no "debate;" racist, homophobic, misogynist hate speech has no place in our society.
posted by Gelatin at 9:23 AM on June 5, 2019 [11 favorites]


Casey Newton at The Verge: "YouTube just banned supremacist content, and thousands of channels are about to be removed"
Newton also notes that this is "unrelated to the ongoing controversy over harassment".
posted by Going To Maine at 9:24 AM on June 5, 2019 [6 favorites]


YouTube succeeds at couching the argument in terms of "debate" because free speech has been corrupted from being an inalienable human right to a corporate right.

YouTube can play this free speech card without suffering any consequences. It is large enough to dictate to our community (and to others) that its "free speech" business model will be placed at a higher priority than our physical safety.

When you have a small business in your town — a gas station, say — fly a Nazi flag and burn a cross on its lawn, you can decide to drive past that gas station and go to another that doesn't fly Nazi flags and burn crosses on its lawn.

In the case of YouTube, you can't avoid it. You can't drive past it. Everywhere you go, it is there.

YouTube uses the vast, global resources of its parent company — Alphabet/Google — to distribute its wares, from the ecosystem of the web browser (Chrome), to consumer products (ChromeBook, Chromecast, Android), to the networking infrastructure that Google owns (fibre, CDNs), in all contexts: home, office, schools, coffee shops, etc.

If YT had to act on its own, it would be more answerable to the public for its actions. Instead, it can make up random corporate policies and enforce them, or not, as it sees fit, and how it sees fit.

Its reach is vast and mostly unassailable, and it is demonstrably unaccountable by virtue of being a part of a larger organism. It is one reason why antitrust regulators must break up the parent company.
posted by They sucked his brains out! at 9:33 AM on June 5, 2019 [11 favorites]


That sounds more like a description of monopoly power, which youtube would still have separate from Alphabet/Google. Facebook doesn't have an ecosystem of browsers, devices, telecoms, etc, for example, and has a similar position with regards to being big enough to be utterly unaccountable.
posted by Dysk at 9:39 AM on June 5, 2019 [5 favorites]


In the case of YouTube, you can't avoid it. You can't drive past it. Everywhere you go, it is there.

This metaphor is a little broken - I never drive past nazi content on YouTube, except when I seek it out. Rather, it makes it very easy to wander into the nazi / whatever vileness shopping district because of related-content metrics. If you don't know where to go, you'll get a guided tour of those areas at some point, right after Logan Paul Lane and Music Video Square, because they are all popular.
posted by Going To Maine at 9:52 AM on June 5, 2019 [1 favorite]


Anyhoo, good time to perhaps flag some Tucker Carlson videos for advocating white racial superiority.
posted by Going To Maine at 9:53 AM on June 5, 2019 [2 favorites]


YouTube Is Finally Banning Nazis, Holocaust Denial, and Sandy Hook Truthers (Vice)
YouTube also said it's attempting to reduce the recommendation of videos that come “right up to the line” but stop just short of being banned; in January, the company piloted such a program in the US. The recommendation system used by YouTube has long been criticized as being a rabbit hole that leads to more extreme videos and can have a radicalizing effect on both viewers and creators.

“This change relies on a combination of machine learning and real people,” reads the blog post. “We work with human evaluators and experts from all over the United States to help train the machine learning systems that generate recommendations.”

YouTube said “the number of views this type of content gets from recommendations has dropped by over 50 percent in the U.S” and that the platform is hoping to bring the system to other countries before the end of the year. The company also said that channels that toe the line of hate speech but don’t explicitly cross the line will face monetization penalties like not being able to run ads or use the Superchat feature.
The blog post says:
You might remember that a few years ago, viewers were getting frustrated with clickbaity videos with misleading titles and descriptions (“You won’t believe what happens next!”). We responded by updating our system to focus on viewer satisfaction instead of views, including measuring likes, dislikes, surveys, and time well spent, all while recommending clickbait videos less often. More recently, people told us they were getting too many similar recommendations, like seeing endless cookie videos after watching just one recipe for snickerdoodles. [...]

We’ll continue that work this year, including taking a closer look at how we can reduce the spread of content that comes close to—but doesn’t quite cross the line of—violating our Community Guidelines. To that end, we’ll begin reducing recommendations of borderline content and content that could misinform users in harmful ways—such as videos promoting a phony miracle cure for a serious illness, claiming the earth is flat, or making blatantly false claims about historic events like 9/11. [...]

This change relies on a combination of machine learning and real people. We work with human evaluators and experts from all over the United States to help train the machine learning systems that generate recommendations. These evaluators are trained using public guidelines and provide critical input on the quality of a video. This will be a gradual change and initially will only affect recommendations of a very small set of videos in the United States.
More recently, people told us they were getting too many similar recommendations, like seeing endless cookie videos after watching just one recipe for snickerdoodles.

WTF
posted by Little Dawn at 10:28 AM on June 5, 2019 [5 favorites]


At the root of the problem is not really content policy but business goals and code-- as long as the algorithm is tuned to increase "engagement" at all costs, the site will remain broken in the ways described in this thread.
posted by gwint at 10:32 AM on June 5, 2019 [2 favorites]
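To make the point about engagement-tuned algorithms concrete, here is a deliberately toy sketch. Everything in it is hypothetical (the scores, the field names, the "borderline" flag - none of this is anything YouTube has published); it only illustrates why an objective that maximizes engagement alone tends to surface borderline content, and how a policy-aware objective could demote it at ranking time rather than relying on after-the-fact enforcement.

```python
# Toy ranker, for illustration only. All numbers and field names are
# hypothetical, not YouTube's actual system.

def engagement_score(video):
    # Engagement-only objective: predicted watch time times click rate.
    return video["predicted_watch_minutes"] * video["predicted_click_rate"]

def policy_aware_score(video):
    # Same objective, but content flagged as borderline is heavily demoted.
    penalty = 0.1 if video["borderline"] else 1.0
    return engagement_score(video) * penalty

def rank(videos, scorer):
    # Highest-scoring video first.
    return sorted(videos, key=scorer, reverse=True)

videos = [
    {"title": "snickerdoodle recipe", "predicted_watch_minutes": 4.0,
     "predicted_click_rate": 0.05, "borderline": False},
    {"title": "outrage bait", "predicted_watch_minutes": 11.0,
     "predicted_click_rate": 0.09, "borderline": True},
]

# Under the engagement-only objective, the borderline video wins.
print(rank(videos, engagement_score)[0]["title"])   # outrage bait

# Under the policy-aware objective, it is demoted below the benign video.
print(rank(videos, policy_aware_score)[0]["title"])  # snickerdoodle recipe
```

The point of the sketch is that the "broken site" behavior falls out of the objective function, not out of any one moderation call: as long as the scorer rewards engagement alone, borderline content rises no matter how the content policy reads.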


The problem is, the household spirits, they can't hear negating words. You say "Don't Be Evil", all they hear is "Be Evil". Wishing for something not to happen is the same as wishing for it to happen. "Don't Be Injured" is never the prayer. "Please Be Unharmed" is the prayer. And so on.

The problem is that the "house spirits" are republican political operatives who both sought and received invitations into the house from google and all the other silicon valley companies after spending a few years haranguing them for liberal bias. They are now on the boards of directors and in upper level executive positions for the exact purpose of ensuring this content doesn't get censored or buried.

This really isn't an algorithm issue at all. It isn't even mysterious. This is a "management making a choice" issue.

As always remember these companies make different decisions about how to handle this content in other countries because they have to due to regulation. They are choosing not to in the USA.
posted by srboisvert at 10:38 AM on June 5, 2019 [18 favorites]


19 hours after saying that there's nothing they could do about Crowder, now YouTube says they're demonetizing his videos.

Like so many of these tech responses to abusive behavior from right-wing assholes, it really seems like they didn't understand (and still don't) that anyone was actually upset by what Crowder did and thought they could just throw up their hands and everyone would move on. Then, as soon as the threat of losing money and prestige comes up, whatever dumbass principle they were standing on before is suddenly unimportant and the asshole of the week is unceremoniously dumped. "We'll consider enforcing boundaries if you can find a way to make this into a huge problem for us" is almost a worse policy than having no policy at all, because now every creep knows that as long as they pick a target that doesn't work for somewhere like Vox and/or don't quite get to Crowder's level of vileness, they're totally safe.
posted by Copronymus at 12:02 PM on June 5, 2019 [29 favorites]


YouTube got back in touch with Carlos Maza to let him know that Crowder'd been demonetized.

Maza was.... let's say, less than impressed.

(edit: Jinx!)
posted by Rev. Syung Myung Me at 12:02 PM on June 5, 2019 [12 favorites]


This is anecdotal, I know, but in my experience many, many conservatives are happy to listen to Rush Limbaugh and Sean Hannity all day at work, and then go home and watch Fox, and play Jordan Peterson and Ben Shapiro videos all night.

The stronger the cognitive dissonance, the larger the dose needed to continue to keep ignoring how terrible and harmful the views and people you support have become.
posted by straight at 12:11 PM on June 5, 2019 [2 favorites]


More recently, people told us they were getting too many similar recommendations, like seeing endless cookie videos after watching just one recipe for snickerdoodles.

What's the joke? "I bought a blender on Amazon, and now Amazon thinks I'm going on a lifelong journey of blender collecting."

Anyway, the reasons are intermingled: make money, 'engage' users (and we're engaged here too!) and probably similar to twitter: Twitter Can’t Ban Racism Because They’d Have to Ban Republicans, Too
posted by the man of twists and turns at 12:18 PM on June 5, 2019 [10 favorites]


More recently, people told us they were getting too many similar recommendations, like seeing endless cookie videos after watching just one recipe for snickerdoodles.

I. . . somehow doubt the issue was with cookie recipes.
posted by dinty_moore at 12:31 PM on June 5, 2019 [10 favorites]


fuck them

do not bring snickerdoodles into this

they are the height of cookie art, simple yet refined

this shall not stand
posted by RolandOfEld at 12:36 PM on June 5, 2019 [11 favorites]


It's a code word.
posted by hat_eater at 12:42 PM on June 5, 2019


Plot twist!
posted by grandiloquiet at 12:45 PM on June 5, 2019 [9 favorites]


jesus fucking christ
posted by Copronymus at 12:48 PM on June 5, 2019 [4 favorites]


Burn it all down
posted by nubs at 12:52 PM on June 5, 2019 [1 favorite]


I just want to be able to block channels that I never want to see again.
posted by Mr.Encyclopedia at 12:55 PM on June 5, 2019 [21 favorites]


this sucks. on the bright side, i prefer this world, where pressure can be brought down hard on one big company, youtube, and they eventually are shamed into deplatforming these nazis, as opposed to a world with many decentralized, equally well-known media platforms where nazis could simply take some over for good.

as it stands, when nazis get ejected from facebook or youtube, they end up yelling at each other on gab or some other hellhole where nobody can hear them screaming into the void. imagine if gab and youtube had equal recognition, like nbc and abc. that would be far worse and harder to stop.
posted by wibari at 1:04 PM on June 5, 2019


as it stands, when nazis get ejected from facebook or youtube, they end up yelling at each other on gab or some other hellhole where nobody can hear them screaming into the void. imagine if gab and youtube had equal recognition, like nbc and abc. that would be far worse and harder to stop.

I can't tell if you're purposefully referring to fox news or not. Am I ruining the joke? Sorry for maybe ruining the joke.
posted by dinty_moore at 1:06 PM on June 5, 2019 [3 favorites]


I can't tell if you're purposefully referring to fox news or not.

haha, no. i meant it would be bad if the relationship between youtube and gab or 8chan was on the same sort of equal footing as the relationship between abc and nbc, i.e., if they were seen as equally valid and popular outlets. i was not saying that there are no mainstream fascist outlets on tv, obviously.
posted by wibari at 1:23 PM on June 5, 2019


I just want to be able to block channels that I never want to see again.

The society-destroying AI that YouTube has become the vessel of cares only for engagement and will never allow the blocking of content that may someday create engagement. Sure, it's become a psychological torture device that outputs violent racism and homophobia while now apparently stoking pedophilia, but people keep clicking, so what is YouTube to do, not worship the concept of engagement for its own sake? Never.
posted by Copronymus at 1:25 PM on June 5, 2019 [4 favorites]


I'm going to stop updating after this, because it's clear (and it has been from the start) that they're just flailing around trying to find the magic decision that will keep enough people from being apocalyptically angry at them without actually cleaning house, but now YouTube is saying again that the channel IS demonetized. Uh. I think.
posted by grandiloquiet at 1:25 PM on June 5, 2019 [5 favorites]


This trend seems to be moving the world toward decentralized and peer-to-peer technologies that can't be censored by a central authority or single point of failure.

Do we really want YouTube to be the arbiter of the First Amendment?
posted by theorique at 1:29 PM on June 5, 2019


Do we really want YouTube to be the arbiter of the First Amendment?

Youtube isn't the government.
posted by Groundhog Week at 1:37 PM on June 5, 2019 [22 favorites]


The issue here is that "de-monetization" is a figleaf. Crowder's income is maybe 5% YouTube ad revenue, and 95% merchandise. He's 100% (or at least 95%) happy to have a popular YouTube channel that drives sales of his merch.

De-platforming is the only way.
posted by explosion at 1:38 PM on June 5, 2019 [28 favorites]


I'm going to stop updating after this, because it's clear (and it has been from the start) that they're just flailing around trying to find the magic decision that will keep enough people from being apocalyptically angry at them without actually cleaning house, but now YouTube is saying again that the channel IS demonetized. Uh. I think.

Their entire response to this has been a wild ride of confusion. It's like they're trying to create an example for future PR people to study as the worst possible response to being accused of harboring harassment.
posted by Copronymus at 1:41 PM on June 5, 2019 [5 favorites]


More importantly, there's more to free speech than the First Amendment, and at least from my point of view, reaching for the First demonstrates a certain unseriousness when discussing free speech.

The problem we've been seeing more and more is that people have only a low level understanding of free speech, and so don't understand how things like the heckler's veto actually undermine free speech. It's hard to say that speech is free when people are afraid to speak up for fear of becoming a target of abuse.
posted by NoxAeternum at 1:44 PM on June 5, 2019 [14 favorites]


That doesn't indicate a greater interest in politics, it indicates a greater interest in bigoted hate. The fact that it passes for political engagement is an enormous and systemic problem.

I dunno – to me, the fundamental question of politics is "how should power be allocated"?

These folks believe that straight white Christian men should have power, and that other people shouldn't.

It's disgusting and un-American, but it's definitely political.
posted by escape from the potato planet at 1:57 PM on June 5, 2019 [5 favorites]


The people can find other venues for their platform, but the First Amendment comes before all, and it should.

Free speech is important as well, but it is the company's choice.
posted by YankeeKing6700 at 2:15 PM on June 5, 2019


The people can find other venues for their platform, but the First Amendment comes before all, and it should.

So, this means you're against defamation laws then?
posted by NoxAeternum at 2:16 PM on June 5, 2019 [2 favorites]


So, this means you're against defamation laws then?

No, it depends on the situation.
posted by YankeeKing6700 at 2:49 PM on June 5, 2019


Do we really want YouTube to be the arbiter of the First Amendment?

I don't. I do want YouTube to take responsibility for the content that they choose to allow on their platform.
posted by Uncle Ira at 3:01 PM on June 5, 2019 [5 favorites]


Do we really want YouTube to be the arbiter of the First Amendment?

This sort of argument comes up with Facebook as well (e.g.: "Do we want Facebook to be the arbiter of true vs. fake news?") but I think it generally overplays the case. All of these companies -- Facebook/Instagram, YouTube, Twitter, etc... -- have already decided that certain things don't belong on their platform: [some forms of] nudity, [some forms of] violent imagery, [some forms of] harassment, etc...

For some reason, when things like this come up, the question is often framed as banning some stuff vs. banning no stuff. But really, the question is whether a platform should be banning this stuff along with all the other stuff they've already banned.
posted by mhum at 3:20 PM on June 5, 2019 [23 favorites]


No, it depends on the situation.

Then you don't actually believe that "the First Amendment comes before all".

This is why I feel that reaching for the First Amendment is a mark of unseriousness in discussing free speech - too many people look at it and revere it, rather than grapple with it intellectually as they should.
posted by NoxAeternum at 3:20 PM on June 5, 2019 [30 favorites]


A lot of people who talk a lot about the first amendment seem to have a very poor understanding of what it actually is.

The first amendment broadly forbids the American government from (among other things) restricting Americans’ freedom of speech.

YT is not the government, so the only way that it could “become the arbiter of the first amendment” is by being expropriated and nationalised by the American government.

Additionally, YT has other markets around the world, including in many of the ~99.5% of countries that aren’t America and don’t have the first amendment. If they wish to avoid liability for hate speech, which many other Western democracies feel is a greater problem in the present day than the threat of restricted speech, then they also need to cater to these markets, their values and their laws.

They also need to consider their own TOS (which rightly ban hate speech and harassment), along with a more fundamental and broader ethical duty not to encourage harm to their users, or others who may be affected by the use of their service.
posted by chappell, ambrose at 3:22 PM on June 5, 2019 [26 favorites]


I just want to be able to block channels that I never want to see again.

I want to be able to whitelist channels that I trust, or find out about from people I trust (especially when it comes to allowing my kid to view videos). I don't trust YouTube to recommend anything. YouTube is influencing how much trust I have in everything else from google. I am switching back to Firefox and think it might be time to get rid of gmail too.
posted by eckeric at 3:25 PM on June 5, 2019


The interview in link The Whelk posted has Maza's thoughts on the free speech question; the whole thing is worth reading, but in part:
It’s not a genuine concern about free speech. Legitimate threat to free speech is an environment in which bullies and harassers can use massive platforms to target and push people out of the market. The net result of what’s happening now isn’t that queer people and bigots co-exist in harmony and just talk past each other, it’s that queer people and marginalized people get sick and tired of being attacked by armies of idiots, and they leave the platform. Look at what happens on 4Chan, it is not some healthy free speech environment, it is a platform that has been overrun by the worst of humanity because people don’t want to show up to a platform where they feel like their sexuality and identity can become the butt of multi-million view joke just because they showed up. If you gave a shit about free speech you would give a shit about enforcing policies that restrict the worst excesses of bad actors. Policies that allow creators, especially from marginalized communities, to be able to operate with the trust to know they’re not going to be dragged through the mud just for who they are.
posted by peeedro at 3:35 PM on June 5, 2019 [28 favorites]


For those who have concerns about what YouTube is recommending to them specifically- there's a setting in your Google account that lets you turn off the youtube watch history. Once you do that it will only recommend videos from channels that you've subscribed to.
posted by peppermind at 3:45 PM on June 5, 2019 [5 favorites]


Read the twitter thread and, nope, it turns out poor Crowder and the 12 people he employs are going to be ruined because *checks notes* Vox is trying to be funny. Or something; I can't keep the idiots' comments straight anymore.
posted by 922257033c4a0f3cecdbd819a46d626999d1af4a at 3:57 PM on June 5, 2019 [1 favorite]


The problem there is that I refer to my Youtube watch history rather often to find songs I enjoyed while leaving autoplay running, or that had an informative tutorial or something else that interested me and I wanted to find again. My watch history is a valuable resource.

Motherfucking Facebook allows me to outright ban certain advertisers, and directly edit the topics/metadata they use to evaluate which ads to show me. If you’re a white male programmer who enjoys gaming, Youtube will NEVER EVER EVER stop suggesting Shapiro or Jordan Peterson videos. You report a dozen or two dozen as hate speech and they’ll vanish for about six months, and then some internal timer ticks over and they’re right back there occupying half your recommended videos. And yes, I’ve already checked my watch history and confirmed that I’ve never watched a video with either of them in the title, nor anything politically conservative. It’s beyond infuriating: I make a serious effort not to be one of society’s monsters, and their platform insists that if I am not one then I must become so.
posted by Ryvar at 4:00 PM on June 5, 2019 [27 favorites]


As said above, this isn’t a free speech issue. I can make a video sharing site for cookies and permaban anyone who adds a comment saying they hate snickerdoodles. My prerogative, not censorship.

Also, this sort of thing makes it clear that Elizabeth Warren is on the right track sniffing around the idea of tech breakups. YouTube has way too much market share and is wielding it in ways that harm consumers.
posted by freecellwizard at 4:25 PM on June 5, 2019 [8 favorites]


it really seems like they didn't understand (and still don't) that anyone was actually upset by what Crowder did and thought they could just throw up their hands and everyone would move on.

because it's clear (and it has been from the start) that they're just flailing around trying to find the magic decision that will keep enough people from being apocalyptically angry at them without actually cleaning house

IMO a large part of this (and many other problems here in the US) is that conservatives - "fringe" and mainstream - have spent the past 40+ years (at least) pushing the idea that only the most blatant, obvious, egregious, direct, and explicit forms of repression and hate speech "count", and that anyone complaining about more subtle & systemic forms of same are either just whiners or trying to game the system for their own advantage. It's the "Mr. X isn't racist, he didn't use the N-word!" argument applied to every-damn-thing. And it's been thoroughly legitimized by the mainstream media.

So a buncha YouTube folks (almost undoubtedly cis white straight males) probably are a little confused - they've spent their entire lives having the idea pounded into their heads that as long as Crowder didn't directly specifically say, "Hey, all my fans should go hassle this Maza guy on social media!", then, welp, it's not actual hate speech and he can't be encouraging them to harass Maza, they're just doing it on their own and Crowder can't control it.

It needs to be made clear to them (and FB & Twitter & & & & ) - ideally with the force of the government behind it, but if not then by public outcry - that cleaning house is the only way for people not to be apocalyptically angry at them.
posted by soundguy99 at 4:30 PM on June 5, 2019 [22 favorites]


Additionally, YT has other markets around the world, including in many of the ~99.5% of countries that aren’t America and don’t have the first amendment.

To add onto this: YouTube operates in markets that see America's stance on free speech as dogmatic and harmful to the marginalised, and permit their governments to restrict certain forms of speech. If you come at me saying that the First Amendment is the be-all and end-all of free speech, I'm not going to be particularly sympathetic.

This used to be heresy ten years ago and it's nice that y'all have come around to my way of thinking
posted by Merus at 5:11 PM on June 5, 2019 [16 favorites]


“YouTube's problem will never be solved because the only solution would be to have humans view the content and make a judgment call, and that requires paying them enough to prevent burnout, providing mental health care to prevent PTSD, etc as seen in Facebook's moderation.” @kamilumin
posted by The Whelk at 5:31 PM on June 5, 2019 [14 favorites]


But even in really high profile cases where they obviously have employees responding, they still fuck up that judgement call.
posted by ryanrs at 5:46 PM on June 5, 2019 [2 favorites]


My views on what should be considered actionable hate speech have certainly evolved in the last ten years. A large part of that is the failure of platforms like YouTube and Twitter to deal with harassment.

But even in really high profile cases where they obviously have employees responding, they still fuck up that judgement call.

Yeah. I'm not going to give them any slack because it's a hard problem, because they're not even fucking trying. They know where their money comes from. Now and then they'll make a hollow gesture to try to preserve some plausible deniability for themselves, but they're not making progress because, fundamentally, they don't want to make the kind of progress that's necessary.

It's like the people who whine that we can't prosecute people who send threats on the internet because sometimes it's hard to identify them. Sure, maybe that will be true for specific cases, but you don't get to use that as an excuse until you start taking action against people who use their real names.
posted by Kutsuwamushi at 6:16 PM on June 5, 2019 [10 favorites]


It's almost like taking the decentralized internet and centralizing it into massive difficult to police platforms like Facebook and Youtube was a mistake.
posted by Mr.Encyclopedia at 6:16 PM on June 5, 2019 [7 favorites]


“So basically Facebook pivoted to video, sold a ton of folks on inflated view numbers, YouTube freaked out and crossed FB's worst instincts with their own weaponized AI and now the whole world is broken.” @dansinker
posted by The Whelk at 6:19 PM on June 5, 2019 [5 favorites]


The problem with being funded by advertising is that advertisers are more afraid of being associated with "female presenting nipples" than they are of being associated with Nazis.
posted by Pyry at 6:27 PM on June 5, 2019 [15 favorites]


I'm not an attorney, and I know zero about corporate law outside of a brief pass about monopolies in a social studies class, so forgive if this question is dumb, but... How would they even go about breaking up something like YouTube or Facebook?

YouTube would have to be severed from Alphabet first, but assuming that magically happened, then what? There are no pieces into which it could be broken. Same with Facebook. It is the entirety of the thing.
posted by SecretAgentSockpuppet at 6:39 PM on June 5, 2019 [3 favorites]


Isn't the backlash against youtube a good example of market reaction? This is free market at work. I'd expect conservatives to encourage consumers to speak their minds about how they want the marketplace to be run.
posted by asra at 7:12 PM on June 5, 2019


It’s not a genuine concern about free speech.

YouTube demonetizing/pulling videos is arguably a real threat to free expression in practice in today's world - the key is that it doesn't become any more or less so just because they demonetized Crowder, because their right to do so is exactly the same as it always was. This isn't a judicial precedent.

I'd be shocked if YT didn't take down a few black nationalists or left-wing radicals in the interest of pseudo-balance. But that is, again, an issue of their policy.
posted by atoxyl at 7:50 PM on June 5, 2019 [1 favorite]


It's absolutely bonkers that no one even suggests that the FCC regulate YouTube content. We're so used to deregulation we don't even consider it anymore. But our government still bans people saying 'fuck' on the radio. So they can goddamn make some rules to put a leash on YouTube.

The 'free speech' question is just garbage. Basically once you are monetized to a certain degree, the government should be able to control what you amplify into the world.

It's one thing for people to create their own little websites that say their stupid (or brilliant) stuff. But when you have massive multinational corporate platforms, our government has a stake in regulating what these corporations are propagating and profiting from.

Ugh. It's hard for me to get mad at Google about this because I'm still just so fucking livid at Reagan.
posted by latkes at 8:01 PM on June 5, 2019 [27 favorites]


The issue isn't that marketers care more about being associated with female presenting nipples than Nazis. The issue is that they still don't believe, to a large extent, that there are Nazis out there on YouTube. It's a "well, there can't possibly be, YouTube would have taken care of that!" sort of problem.
posted by rednikki at 9:46 PM on June 5, 2019 [2 favorites]


The power of the algorithms scrutinised a little by Oscar Schwartz

https://thebaffler.com/latest/unpopular-content-schwartz

Me - I don't watch TV (unless laid up in a hospital) and at least that has a decent standard of production. In a month, I might watch one YouTube video, if it comes recommended from a highly trusted source.

Which looks like the right call.
posted by Barbara Spitzer at 11:58 PM on June 5, 2019


I also am not a lawyer of any stripe, but Elizabeth Warren's plan for breaking up fb and google is to separate the company-that-runs-the-marketplace from the company-that-competes-in-that-marketplace. If the post-youtube entity that got fined for carrying hate speech was not the same entity that collected advertising revenue for displaying hate speech, it'd have much less incentive to allow it.

I have no idea how, in practice, this would work.
posted by Fraxas at 12:07 AM on June 6, 2019 [2 favorites]


This trend seems to be moving the world toward decentralized and peer-to-peer technologies that can't be censored by a central authority or single point of failure.

Does it? I see no evidence that this is moving the world in that direction. You might like for that to be the case, and people regularly make the argument that there's no point in any regulation of hate speech, people will just set up their own decentralised/deregulated systems, but in practice that does not seem to be anything but an empty threat to discourage any action on harassment and hate speech.
posted by Dysk at 2:24 AM on June 6, 2019 [9 favorites]


Indeed, no such trend exists. In terms of platforms the height of the decentralized internet was over a decade ago. A handful of platforms have largely taken over the space of what used to be tens of thousands of little microcosms.

Even most of the load-balancing optimization that benefits from decentralization goes through a handful of services, notably Cloudflare (who host Gab, perhaps coincidentally).
posted by aspersioncast at 5:16 AM on June 6, 2019 [1 favorite]


Even most of the load-balancing optimization that benefits from decentralization goes through a handful of services, notably Cloudflare (who host Gab, perhaps coincidentally).

Oh, I doubt it's coincidental. Cloudflare's owner and CEO has said that he has to work with terrorists and Nazis because of freedom.
posted by NoxAeternum at 6:33 AM on June 6, 2019 [4 favorites]


Looks like this is percolating up the management chain now. Instead of relying on the anonymously staffed Twitter account @TeamYouTube to try to deal with this, they put up an official blog post last night signed by Chris Dale, Global Head of Communications and Public Affairs for YouTube (according to LinkedIn):
There are two key policies at play here: harassment and hate speech. For harassment, we look at whether the purpose of the video is to incite harassment, threaten or humiliate an individual; or whether personal information is revealed. We consider the entire video: For example, is it a two-minute video dedicated to going after an individual? A 30-minute video of political speech where different individuals are called out a handful of times? Is it focused on a public or private figure? For hate speech, we look at whether the primary purpose of the video is to incite hatred toward or promote supremacism over a protected group; or whether it seeks to incite violence. To be clear, using racial, homophobic, or sexist epithets on their own would not necessarily violate either of these policies. For example, as noted above, lewd or offensive language is often used in songs and comedic routines. It's when the primary purpose of the video is hate or harassment. And when videos violate these policies, we remove them.

Not everyone will agree with the calls we make — some will say we haven’t done enough; others will say we’ve gone too far. And, sometimes, a decision to leave an offensive video on the site will look like us defending people who have used their platforms and audiences to bully, demean, marginalize or ignore others. If we were to take all potentially offensive content down, we’d be losing valuable speech — speech that allows people everywhere to raise their voices, tell their stories, question those in power, and participate in the critical cultural and political conversations of our day.

[...]

In the coming months, we will be taking a hard look at our harassment policies with an aim to update them — just as we have to so many policies over the years — in consultation with experts, creators, journalists and those who have, themselves, been victims of harassment. We are determined to evolve our policies, and continue to hold our creators and ourselves to a higher standard.
posted by mhum at 10:07 AM on June 6, 2019 [1 favorite]


offensive video on the site will look like us defending people who have used their platforms and audiences to bully, demean, marginalize or ignore others

should read: "used our platforms and audiences to bully, demean, marginalize"
posted by ryanrs at 11:29 AM on June 6, 2019 [1 favorite]


YouTube demonitizing/pulling videos is arguably a real threat to free expression in practice in today's world - the key is that it doesn't become any more or less so just because they demonetized Crowder, because their right to do so is exactly the same as it always was. This isn't a judicial precedent.

That's kind of what I'm getting at.

As an American, referring to "The First Amendment" is a fairly casual and sloppy way for me to indicate "the Western tradition of tolerating free speech, no matter how offensive or noxious it may be".

If YouTube is a near-monopoly or a one-stop-shop for video on the internet, then they are verging on the level of a common carrier or a utility. Even though they aren't the government, if they possess government-scale powers of decision about who may or may not exist on internet platforms, they may need regulation to ensure that they are not discriminating based on (e.g.) political bias, or whether something is "hate speech".

Certainly, based on the obvious coordination and collusion between major platforms to ban dissidents, I don't think we want to trust YouTube (or Apple, or any major corporation) to decide what "hate speech" is or decide that X may remain but Y must be banned.
posted by theorique at 12:20 PM on June 6, 2019


Does it? I see no evidence that this is moving the world in that direction. You might like for that to be the case, and people regularly make the argument that there's no point in any regulation of hate speech, people will just set up their own decentralised/deregulated systems, but in practice that does not seem to be anything but an empty threat to discourage any action on harassment and hate speech.

Maybe I'm a weird, early-adopter, tech nerd, but people in free speech circles talk regularly about Mastodon and BitChute, among other platforms. Obviously these are orders of magnitude smaller than Twitter and YouTube, but with increasing crackdowns and arbitrary bans by the big guys, they may grow. Or, conversely, people may simply get used to YouTube being arbitrary and improvising policy as it goes along.
posted by theorique at 12:23 PM on June 6, 2019


As an American, referring to "The First Amendment" is a fairly casual and sloppy way for me to indicate "the Western tradition of tolerating free speech, no matter how offensive or noxious it may be".

I'm tired of the euphemistic language that always gets used to frame the argument from the outset, which is why these discussions seem rooted in bad faith. Hate speech is not "offensive" or "noxious" - it is harmful and causes people to fear for their safety because of how stochastic terrorism works. If the "Western tradition of tolerating free speech" cannot be honest about this point, then it is built on bad faith and can, frankly, go fuck off at this point.
posted by NoxAeternum at 12:32 PM on June 6, 2019 [19 favorites]


As an American, referring to "The First Amendment" is a fairly casual and sloppy way for me to indicate "the Western tradition of tolerating free speech, no matter how offensive or noxious it may be".
[...]
they may need regulation to ensure that they are not discriminating based on (e.g.) political bias, or whether something is "hate speech".


Tolerating hate speech may be an American tradition, but it is not a Western one more broadly. This kind of free speech absolutism is a more exclusively American phenomenon.
posted by Dysk at 12:42 PM on June 6, 2019 [14 favorites]


The odious, oleaginous Ted Cruz figures this phony crisis is another easy target for the "oppressing conservative voices online" audience: "This is ridiculous. YouTube is not the Star Chamber — stop playing God & silencing those voices you disagree with. This will not end well. #LouderWithCrowder" & "This is nuts. YouTube needs to explain why @scrowder is banned, but @iamsambee (“Ivanka is a feckless c***.”) & @JimCarrey (“look at my pretty picture of Gov. Kay Ivey being murdered in the womb”) aren’t. No coherent standard explains it. Here’s an idea: DON’T BLACKLIST ANYBODY."

Former Kung Fu Monkey blogger John Rogers responds:
Well, Senator, @scrowder has been de-monetized (not banned) for waging a multi-year war of personal harassment and homophobia against a single person using his followers, while those public personas have been parodying other, more powerful public figures. It's a super easy idea.

The game certain figures on the Right are playing where they pretend not to understand how power works nor, apparently, the free market, is growing increasingly tiresome.

Leading to what is now becoming an evergreen tweet: Yeah, yeah, everybody loves the free market until it bites them in the ass.

A private company uses its market power to jack up medical expenses to the point where people ration their meds and die.
GOP: This is the free market in action.
A private company uses its market power to determine what appears on its platform.
GOP: Time to nationalize YouTube
posted by Doktor Zed at 12:58 PM on June 6, 2019 [10 favorites]


theorique: If YouTube is a near-monopoly or a one-stop-shop for video on the internet, then they are verging on the level of a common carrier or a utility.

I'm actually somewhat sympathetic to this view. In a similar vein, banning games off of Steam is a much higher stakes move given their market dominance than, say, a world where people downloaded their PC games from any of a dozen or so sites. And yet I will reiterate that YouTube already censors a bunch of stuff including most nudity (never mind practically all pornography) and this somehow isn't considered a First Amendment threat.

"the Western tradition of tolerating free speech, no matter how offensive or noxious it may be".

Yeah... about that... I think this is one of the other blind spots Americans have. The US is basically alone among Western countries in *not* having hate speech laws. Granted, even in the countries that do have them, their enforcement can be rather sloppy and/or uneven but they do exist. I find it a bit strange every time this comes up how someone will claim that hate speech laws are completely unreasonable and unworkable despite the fact that the UK, France, Germany, Canada (aka America, Jr.), etc... all have them. It's not entirely unlike how some American commentators will scoff at the notion of universal healthcare despite, well, y'know.

I mean, here's the thing. Despite this idea that speech is absolutely free in the US, it's very obviously not the case. There are a ton of speech acts that are regulated or prohibited: libel/slander, uttering death threats, incitement to riot, obscenity, various forms of harassment, copyright infringement, plus all kinds of stuff related to commercial speech like false advertising. There's always been recognition that some forms of speech can be harmful. The US differs from other Western countries in its determination of whether or not stuff like holocaust denial or Nazi marches are harmful or not.

In fact, even within American jurisprudence, First Amendment protection hasn't been nearly as consistent as some would imagine. People like to point to Brandenburg v. Ohio (the case which basically determined that Nazi marches were protected speech) as the essence of American free speech protection. Fewer people point out that Brandenburg was overriding Schenck v. United States, which found that socialists pamphleting against the draft in World War I was not protected speech.
posted by mhum at 1:05 PM on June 6, 2019 [6 favorites]


Here’s an idea: DON’T BLACKLIST ANYBODY.

This is actually the best answer of all.

What I'd argue for on Twitter, Facebook, YouTube, etc is to allow anybody to unsubscribe from anybody else or block content from anybody else. And a rich set of scripting tools to allow people to (e.g.) block anybody who follows X, or anybody who uses keyword Y.

Then police the site for any illegal content: active, imminent threats to an individual person or persons, child sexual abuse, recordings of violence that are evidence of crimes in progress.

Anything else is permitted. If you don't like it, block it. Use someone else's block script if you're not a techie. Of course, the big platforms don't like it because their whole monetization strategy is to have everybody exposed to everything and then run ads on it.
posted by theorique at 1:05 PM on June 6, 2019
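The scripted blocking theorique describes (hide content from anyone who follows account X, or anyone who uses keyword Y) can be sketched as a simple predicate applied to a feed. This is a minimal illustration, not any real platform's API; all names here (Post, BlockList, visible_feed) are hypothetical.

```python
# Hypothetical sketch of user-side block scripting.
# None of these names correspond to a real platform's API.
from dataclasses import dataclass, field

@dataclass
class Post:
    author: str
    text: str

@dataclass
class BlockList:
    blocked_users: set = field(default_factory=set)
    blocked_keywords: set = field(default_factory=set)

    def block_followers_of(self, target, follower_graph):
        # The "block anybody who follows X" rule: add every follower
        # of `target` to the blocked set.
        self.blocked_users |= follower_graph.get(target, set())

    def allows(self, post):
        # A post is hidden if its author is blocked or its text
        # contains any blocked keyword (case-insensitive).
        if post.author in self.blocked_users:
            return False
        text = post.text.lower()
        return not any(kw in text for kw in self.blocked_keywords)

def visible_feed(posts, blocklist):
    # The user's feed after their block script has run.
    return [p for p in posts if blocklist.allows(p)]
```

The sketch also shows the limit NoxAeternum raises later in the thread: the filter changes only what the blocking user sees, while the underlying content stays up and visible to everyone else.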


If the "Western tradition of tolerating free speech" cannot be honest about this point, then it is built on bad faith and can, frankly, go fuck off at this point.

There is no such tradition in that the Free Speech Movement Cafe at UC Berkeley is a monument to the fight for free speech. This is not something I've looked at but what puzzles me is how today's progressives criticize free speech as if the events at Berkeley never happened. I think while reactionaries are doing their rhetorical distortions to confuse and distract, it could help for leftists and progressives to be clear about the substantial free speech issues, because that needs to be kept alive too.
posted by polymodus at 1:10 PM on June 6, 2019


Anything else is permitted. If you don't like it, block it.

This is basically saying "if the toxic waste outside bothers you, close your blinds." Hate doesn't go away because you stop looking at it, and neither do its harms.

You may not believe in stochastic terrorism, but rest assured, stochastic terrorism believes in you.
posted by NoxAeternum at 1:11 PM on June 6, 2019 [22 favorites]


Lol at the "Western tradition of tolerating free speech." Sure, maybe for the sociopolitically powerful, but it sure as fuck never applied to the marginalized and/or indigenous populations of "the West" and their colonies.
posted by zombieflanders at 1:16 PM on June 6, 2019 [12 favorites]


This is basically saying "if the toxic waste outside bothers you, close your blinds." Hate doesn't go away because you stop looking at it, and neither does its harms.

How much hate is it my responsibility (or anybody else's) to watch on YouTube or any other venue?

My conception of hate speech may not be yours and vice-versa, but I think we should both have the ability to choose what we want to watch - and not watch - on a platform like this.
posted by theorique at 1:22 PM on June 6, 2019


Do you also have control over the harassment and attacks and murders the content inspires?
posted by zombieflanders at 1:32 PM on June 6, 2019 [7 favorites]


None of this is about freedom of speech. We've got pretty much unlimited free speech. You can buy your own server, host your own website, and host whatever you want regardless of whether YouTube or any other platform hosts you or not.

It used to be that publishing something was expensive: it took effort to get things printed and to get them out to people. You had to generate a bunch of physical objects and move them around. People had to seek out stuff to read. Now it's really, really cheap. So cheap that many platforms just give it away for free, and we're drowning in facts and opinions and communication.

Now what's limited is not speech but *attention*. And that *is* a zero-sum game. If one opinion appears first in search results, it has an advantage over what appears second. If a handful of people show up in the "up next" list, they have an enormous advantage over anyone who doesn't.

Frankly, YouTube could both fulfil free speech absolutism and be anti-hate-speech by just delisting hateful videos from their recommendations and search engine. Because publishing the video? That speech is on the poster. But recommending a video as something a person likes or is looking for? That's YouTube's speech. They are responsible for that 100%; you don't get to blame an algorithm if you wrote the algorithm and tuned it to give results you liked.

Of course, that wouldn't stop the right wing from attacking them for it, because this isn't about free speech, it's about control. It's about control over how many eyeballs go where. It's about control of who sets the tone for the national conversation.
posted by Zalzidrax at 1:37 PM on June 6, 2019 [29 favorites]
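The delisting approach Zalzidrax describes, keeping a video hosted and directly linkable but excluding it from recommendations and search, amounts to a filter stage ahead of the ranking step. A minimal sketch under that assumption; all names (rank_candidates, recommend, delisted_channels) are purely illustrative, not YouTube's actual system.

```python
# Hypothetical sketch: a recommendation pipeline that hosts everything
# but never *recommends* videos from delisted channels.
# All names are illustrative only.

def rank_candidates(candidates, scores):
    # Order candidate video ids by relevance score, best first.
    return sorted(candidates, key=lambda vid: scores.get(vid, 0.0), reverse=True)

def recommend(candidates, scores, channel_of, delisted_channels, k=3):
    # The delisting filter: videos from delisted channels stay hosted
    # and reachable by direct link, but are dropped before ranking,
    # so they never surface in "up next" or search results.
    eligible = [v for v in candidates if channel_of[v] not in delisted_channels]
    return rank_candidates(eligible, scores)[:k]
```

The design point the comment makes is visible in the code: the filter operates on the platform's own output (what it chooses to rank and surface), not on what anyone is allowed to publish.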


My conception of hate speech may not be yours and vice-versa, but I think we should both have the ability to choose what we want to watch - and not watch - on a platform like this.

So, it's okay for Steven Crowder to inflame bigotry and harass people into silence as long as you don't have to watch? The whole problem with your view of free speech is that you don't really think that speech can be harmful, either in and of itself or through its power to inform and inspire action. And yet we have mountains of empirical evidence showing that speech can harm.
posted by NoxAeternum at 1:40 PM on June 6, 2019 [15 favorites]


Applying the first amendment to corporations is among the most idiotic concessions the American political system has made to capital, which is itself quite an achievement.

The degree to which the Citizens United decision has filtered its way into free speech rhetoric online frankly scares the shit out of me.
posted by aspersioncast at 2:03 PM on June 6, 2019 [13 favorites]


Yeah "you have to give me a platform for my hate speech because of the 1st amendment!!!1" is a pretty juvenile point of view. I'd say the education system has failed these people but... no wait, that's probably true.
posted by some loser at 2:05 PM on June 6, 2019 [6 favorites]


Love 2 have my comment deleted for pointing out plainly that Nazi ideology, and other eliminationist rhetoric, is inherently violent.

[Heya, the only comment you had deleted was a direct response to someone else's deleted comment. I'm riiiiiiight there with you on fuck the Nazis, but please hit us up at the contact form about moderation stuff instead of inlining it in thread in the future.]

posted by cortex at 4:10 PM on June 6, 2019 [2 favorites]


Yeah "you have to give me a platform for my hate speech because of the 1st amendment!!!1" is a pretty juvenile point of view.

As an American who has spent time grappling with how American culture approaches free speech, I'd say that it's more than that. For a society that opines a lot about free speech, American culture has a fundamental unseriousness when it comes to actually thinking about the subject.

A large part of the problem is that many Americans are taught two concepts about speech - "sticks and stones may break my bones, but names will never hurt me" versus Woody Guthrie's famous "This Machine Kills Fascists" sticker on his guitar - without ever realizing that they are mutually exclusive by their very nature. Either speech has the power to change things - and thus has the power to harm; or it's inherently powerless. This results in arguments such as the one posited above, where the answer is to "just ignore" hate speech - an answer that itself ignores that just because you stop listening to hate speech doesn't mean that other people do as well, and as such its harms continue on whether or not you avert your eyes. Fighting speech with speech sounds great on paper, but fails in the real world because the actual effect is to force the dispossessed to have to continually defend their right to exist.

Ultimately, you have to pick a side to believe in, and too many people, when push comes to shove, choose "sticks and stones", even though all the evidence is against it.
posted by NoxAeternum at 5:20 PM on June 6, 2019 [13 favorites]


Mod note: We are not having a debate about free speech starting from a tendentious quote that has nothing to do with the legal definition or functional effects of hate speech, sorry.
posted by restless_nomad (staff) at 7:36 AM on June 7, 2019 [3 favorites]


Theorique, I’m going to assume you are arguing in good faith, but I think you should give that same benefit of the doubt to the people who are pointing out the flaws in your argument. If you, or people who look, or worship like you have never been the target of hate speech, I think it is easy to let privilege blind one to how damaging it is.

In the worst cases, it has driven people to slaughter vast numbers of fellow humans. Hate speech from Gamergate drove hundreds of women off the internet, and some women from their homes. Hate speech normalizes aggression towards people of color, towards people who aren’t the right sort of christians, towards people who don’t have the right sort of sex, or identify as the right sort of gender. It normalizes violence as a means to an end.

Hate speech is not a difference of opinion, it is a direct call to harm or cause to be harmed a person or people.
posted by SecretAgentSockpuppet at 8:00 AM on June 7, 2019 [17 favorites]


Viewpoints from inside:
The Verge talked to four Google employees, most of whom requested anonymity in order to speak freely and without fear of retaliation. “Internal outreach to executives has not been effective in years. They ignore us completely unless there is extreme unrest,” says one employee. “We can’t trust them anymore to listen in good faith.”...

Internally, employees have petitioned YouTube to strip its social channels of Pride branding. They see it as a hypocritical co-opting of their community and symbol while the company is actively damaging the community. One employee referred to it as “mere lip service,” adding that the company has lost its right to use the rainbow flag and other LGBTQ branding by allowing homophobic harassment to exist on its platform.
posted by They sucked his brains out! at 12:29 PM on June 7, 2019 [11 favorites]


I don't like the "Don't ban anybody" idea — after all, look at Info Wars and Alex Jones, with his peddling of the "Sandy Hook was a hoax" conspiracy theory that resulted in (and, AFAIK, continues to result in) harassment and death threats against the parents who lost their children.

Just because someone wants to spew hate doesn't mean they deserve a platform like YouTube to do so.
posted by Rev. Syung Myung Me at 2:01 PM on June 7, 2019 [8 favorites]


Natalie Wynn, AKA ContraPoints, talks on Current Affairs about YouTube's right-wing trolls
posted by The Whelk at 9:32 PM on June 7, 2019 [4 favorites]




YouTube's Crackdown on Violent Extremism Mistakenly Whacks Channels Fighting Violent Extremism (Slashdot)
AmiMoJo shares an article by Cory Doctorow:

Wednesday, Youtube announced that it would shut down, demonetize and otherwise punish channels that promoted violent extremism, "supremacy" and other forms of hateful expression; predictably enough, this crackdown has caught some of the world's leading human rights campaigners, who publish Youtube channels full of examples of human rights abuses in order to document them and prompt the public and governments to take action....

Some timely reading: Caught in the Net: The Impact of "Extremist" Speech Regulations on Human Rights Content, a report by the Electronic Frontier Foundation's Jillian C York: "The examples highlighted in this document show that casting a wide net into the Internet with faulty automated moderation technology not only captures content deemed extremist, but also inadvertently captures useful content like human rights documentation, thus shrinking the democratic sphere. No proponent of automated content moderation has provided a satisfactory solution to this problem."
posted by Little Dawn at 10:03 AM on June 8, 2019 [3 favorites]


This deserves its own FPP but “tracking a 21-year-old becoming radicalized into the far right with help from YouTube”

A derail, but I do hope that this article -and the general discussion of Crowder- can really kill the "conservatives can't make jokes or do a Daily Show" meme dead. Comedy is a weapon that anyone can use, and it's very definitely being deployed. Jokes that punch down are mean, but they still amuse.
posted by Going To Maine at 11:27 AM on June 8, 2019


Nah, I’m not giving them the mantle of comedy just because their cruel, sneering idiocy makes cruel, sneering idiots laugh. They can’t do comedy ’cause they get so wound up and angry about their targets. Laughing while you’re punching someone doesn’t fall under my metric of comedy no matter how abstract or academic you want to get.
posted by The Whelk at 1:43 PM on June 8, 2019 [14 favorites]


'Being mean is lucrative': queer users condemn YouTube over homophobic content (Guardian)
YouTube’s unclear terms of service make addressing harassment confusing and difficult, [Ash Hardell, a queer and non-binary YouTuber who said they had received little support from the company despite years of harassment,] said, noting that the company appeared to have made them intentionally vague. As of now, YouTube bans “abusive videos and comments” on the site but doesn’t clarify what constitutes abuse.

“If harassment crosses the line into a malicious attack it can be reported and may be removed,” the guidelines say. “In other cases, users may be mildly annoying or petty and should be ignored.” Hardell said because of this wording, it was difficult to tell if Crowder’s videos violated YouTube’s harassment policy. Indeed, YouTube itself seems to be unclear on whether the speech is allowed on the platform. [...]

Demonetizing users can sometimes backfire: as YouTube has attempted to tamp down on “inappropriate” content, whether adult videos, hate speech, or harassment, some LGBTQ creators have been misclassified. Hardell said their videos were deemed “adult content” by the same algorithms meant to protect them, bringing viewership and ad revenue down. [...]

Lindz Amer, a creator of social justice videos on YouTube who is queer and non-binary, said this kind of harassment had been an issue since YouTube’s inception, and that despite a series of high-profile hate campaigns in recent years, “nothing has been done”. “The thing that is most striking about it is that this is a story I have heard from so many people – it’s not a unique situation in any way,” they said. “This is pretty much the norm for social justice creators.”

Despite these frustrations, switching platforms is not an option for many creators. Amer said they had received more than 2m views on their channel, with an average of 100,000 views per video on YouTube. When they tried to migrate content to Vimeo, they got an average of five views per video. “YouTube has a complete monopoly on video hosting, and they know it,” Amer said. “There is no other place like YouTube to get an audience where people can watch for free. They have that advantage and they can steer the conversation and do nothing.”
posted by Little Dawn at 7:57 PM on June 8, 2019 [4 favorites]


There's been discussion of whether Google should be booted from the San Francisco pride parade because of hate speech on YouTube. But let's not just blame YouTube! Journey with me into a thread of political donations Google has made in support of hate.

...What strikes me about Google's political support for these causes is how *unnecessary* it is. Google runs one of the most sophisticated lobbying operations in DC. It could join IBM and Apple in disbanding its PAC altogether, and suffer no consequences. But they just don't care.
posted by The Whelk at 8:27 PM on June 8, 2019 [11 favorites]


Laughing while you’re punching someone doesn’t fall under my metric of comedy no matter how abstract or academic you want to get.

Your own laughter never makes something comedy; it’s the fact that everyone else is laughing too that means that it’s funny. The fact that cruelty can be funny kind of sucks, but that doesn’t make it any less real.
posted by Going To Maine at 12:25 AM on June 9, 2019 [1 favorite]


The fact that cruelty can be funny kind of sucks, but that doesn’t make it any less real.

Just because it's real doesn't mean that we have to call it comedy, though. Part of why bigotry continues to be an endemic problem in comedy is that we don't call it out.
posted by NoxAeternum at 7:57 AM on June 9, 2019 [4 favorites]




So she feels bad but isn't going to change anything? Why even bother saying anything then? And why do they even have terms of service if she's admitting that they're incapable of enforcing them?
posted by octothorpe at 10:08 PM on June 10, 2019


I have only two responses to Ms. Wojcicki:

First, as the old saying goes, a foolish consistency is the hobgoblin of little minds.

Second, resign.
posted by NoxAeternum at 1:13 PM on June 12, 2019




This thread has been archived and is closed to new comments