An ‘Ecosystem of Hate’
August 13, 2019 4:06 PM   Subscribe

How YouTube Radicalized Brazil. New Study Demonstrates YouTube’s Role In Far-Right Radicalization, Spread Of Conspiracies In Brazil.
Ars Technica suggests YouTube should stop recommending garbage videos to users, and Hacker News says The problem is not the algorithm.

WhatsApp also played a huge role in skewing Brazil's last elections.
posted by adamvasco (65 comments total) 38 users marked this as a favorite
 
Of course Hacker News would argue that the problem isn't technology, because that would actually require self-reflection to argue otherwise. It would be nice if the tech community would actually take some responsibility for once.
posted by NoxAeternum at 4:16 PM on August 13, 2019 [58 favorites]


Holy crap that top HN comment is the most HN comment ever. It’s not our fault! We don’t need to do anything to fix this! We’re just part of society, doing whatever makes us the most money, because that is the sole arbiter of what is right! Morals do not exist outside of our duty to provide a platform for anyone and everyone to have their say, no matter how vile their creed! As long as we are not breaking any laws we have nothing to feel bad about!
posted by egypturnash at 4:26 PM on August 13, 2019 [39 favorites]


Of course Hacker News would argue that the problem isn't technology, because that would actually require self-reflection to argue otherwise

There are two theses to evaluate:

1. The problem is technology.
2. The problem is technologists.

Both require self-reflection on the part of the HN and Ars Technica crowd. And hard as it may be for some Mefites to believe, both sites do have plenty of reflection on both theses.
posted by ocschwar at 4:27 PM on August 13, 2019 [4 favorites]


The core argument of the Hacker News article: "The problem is the masses that believe everything they watch without questioning it. The problem is not internet communities. The problem is the utter lack of localized communal structures in the modern world which leaves people defenseless in the face of efforts of galvanization toward some other, hateful view."

When I first read the NYT piece on Brazil I had a similar thought: no matter what is "played next" on YouTube, the viewer has to have some susceptibility and reason to find it persuasive. Well before the digital era, new forms of media (e.g. 'yellow journalism') were effective at riling people up.

Reflecting on the insights of this NYT opinion piece, The Religious Hunger of the Radical Right, too, it seems worth looking well beyond "make YouTube filter certain videos" when we try to understand and fight this rising tide of authoritarianism.
posted by PhineasGage at 4:34 PM on August 13, 2019 [4 favorites]


Notably, one person on the linked HN thread put up an argument that "the problem is not the algorithm/it just selects for engagement" (or, to be generous: this is a social problem, not a technical one) and several people responded with "yeah, that's not good enough" in a way that would be pretty much at home on metafilter.*

HN has issues with its gestalt sometimes, but it's not correct to assume that participants are an unreflective monolith, and the pushback deserves as much credit as moments of unreflective technoutopianism.

* (judging by the handles involved, two of those respondents may actually be at least onetime MeFites, but presumably their participation there makes them just as much a part of HN as of MeFi)
posted by wildblueyonder at 4:34 PM on August 13, 2019 [18 favorites]


The problem is not the algorithm. The problem is management at Google who believe that human curation is bad, and only scalable automated systems will be used at Google, despite that belief starting to kill youtube because algorithms are easily gamed by bad actors. Just whitelist the recommendations for fuck's sake already.
posted by benzenedream at 4:44 PM on August 13, 2019 [11 favorites]


I can't remember where I read it but the explanation I saw was that the algorithm looks for patterns of This Video, Then That Video among users, and that this is incredibly easy for a bad actor to game by getting large numbers of users to watch a popular video followed by the bad content they're boosting, and that creates a connection for the algorithm to suggest to others. I'd hope that explanation is wrong and it's not that stupidly simple, because jesus that's dangerous.
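
If that explanation is even roughly right, the gameability is easy to sketch. Here's a toy co-watch recommender in Python (all names and numbers invented; a sketch of the general idea, not YouTube's actual system):

    from collections import Counter, defaultdict

    # toy "watched this, then that" model: count video-to-video transitions across sessions
    transitions = defaultdict(Counter)

    def observe(session):
        for a, b in zip(session, session[1:]):
            transitions[a][b] += 1

    def up_next(video, n=3):
        # recommend whatever most often followed this video
        return [v for v, _ in transitions[video].most_common(n)]

    # organic viewing
    for _ in range(200):
        observe(["popular_music_video", "another_music_video", "concert_clip"])

    # coordinated campaign: sock-puppet accounts watch the popular video, then the boosted clip
    for _ in range(500):
        observe(["popular_music_video", "boosted_conspiracy_clip"])

    print(up_next("popular_music_video"))
    # ['boosted_conspiracy_clip', 'another_music_video'] -- the campaign now owns the top slot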

Also, even without changing the algorithm or banning a single channel, YouTube could attach unskippable media-literacy education preroll content to questionable material to inoculate users against it in like a week if they so chose. It's very low-hanging fruit.
posted by jason_steakums at 4:45 PM on August 13, 2019 [8 favorites]


Like the post about Sweden here today too, we're fortunate to have such detailed analyses of how tech and culture are combining to inflame populist messages of hatred. It's happening all over the world, whether Facebook posts encouraging genocide in Myanmar or WhatsApp rumors spreading like wildfire over India or right wing propaganda flooding Brazil. In every case the medium and tech company are American but the effects are global, and unique to each culture and country. The American companies are very poorly equipped to understand or manage it.

The NYT article once again asserts
As the system suggests more provocative videos to keep users watching, it can direct them toward extreme content they might otherwise never find.
I think that's true, that's what the YouTube recommendation system does. In effect, if not intentional design. But YouTube's executives have denied that's what their product is doing time and time again. I don't think they're simply lying in the face of so much evidence, but what are they thinking?

I'm in Metatalk territory here, but I think it's a real mistake to frame this discussion in terms of a stupid Hacker News comment. Metafilter is not Hacker News. I think a lot of us are more empathetic and politically aware than the HN gestalt. We're here on Metafilter because we prefer the style of conversation here. Let's have a Metafilter discussion.
posted by Nelson at 4:48 PM on August 13, 2019 [4 favorites]


Reply All talked a little bit about the algorithm moving people from more popular videos to more focused videos in their episode about Carlos Maza.

It was designed to make people watch more and more niche content, not necessarily more politically extreme content. I honestly think it was an oversight that nobody at youtube could figure out how to correct, so it was easier to deny.
posted by dinty_moore at 4:52 PM on August 13, 2019


I saw when one of the NYT reporters tweeted out the article that YouTube replied:
YouTubeInsider: We know our systems are far from perfect, so we’re constantly making improvements. But we had a team across product and engineering try to reproduce the recommendation results found by the University of Minas Gerais and Harvard’s researchers, and it was unable to do so.

YouTubeInsider: In fact, we’ve seen that authoritative content from Brazilian news outlets (i.e. TV Folha and Jovem Pan) are thriving in Brazil + leading news channels have grown almost 2x in watchtime over the past yr. These news sources are among the most recommended content on the site in Br.
So, yeah. I find it a little alarming that even at this late date, YouTube is still maintaining a "nah, no trouble here. nothing to do with us, nosiree." position. Even Zuck was forced to do a 180 on his whole "fake news had no impact on elections" stance.
posted by mhum at 4:58 PM on August 13, 2019 [11 favorites]


I remain convinced that this is not an accident or just an algorithm glitch. I have had a YouTube account for, what, 13 years, and watch it every day. I curate and select my stuff and follow people providing the content I like.

Never in all that time have I been recommended anything like what I actually watch, but for some general interest music or film videos.

In the meanwhile, with great frequency, YouTube recommends the shitty stuff, the gateway to white nationalism stuff.

That's not an accident. Why would it consistently, accidentally recommend that and not literally anything else? No, there's a human hand in there, deliberately pushing. I don't know if it will ever come out, but I know it's there. I know when I'm being shoved.
posted by maxsparber at 5:34 PM on August 13, 2019 [57 favorites]


The problem is the UI/UX has no way to explicitly say “never show me this again”. In order to train the algorithm, you have to manually curate your viewing by going through your history, which is a pain, and then manually marking a video as not what you want. I don’t think you can even do it on mobile; you have to be on the desktop site to do it.
I know this because I made the mistake of watching one boxing video from a link somewhere, and suddenly every video I was being recommended was a boxing, MMA, or other sports video. Just one video and suddenly the system thinks I’m 100% into fighting sports for everything.
It took me a week of constantly clicking on videos I didn’t want to see to get it to stick.
That’s the sign of a poorly designed system.
posted by daq at 6:01 PM on August 13, 2019 [34 favorites]


But they won't, because Google is a corporation responsible only to its shareholders, and it will do whatever it can get away with to make money.

If you want to stop YouTube from wrecking society, you have to make it unprofitable to do so. There is no technological solution. The solution is political: you must make corporations responsible to society. It is that simple, conceptually.

Making it happen is going to take someone like Elizabeth Warren in the WH, and two houses of congress that will follow her lead.

Algorithms are a distraction. Hacker News is a distraction. Get Warren in the White House or we're kind of fucked.
posted by seanmpuckett at 6:04 PM on August 13, 2019 [13 favorites]


I agree with the Hacker News comment that technology is not "the" problem (as if there is a single cause for the spread of violent and hateful ideologies). YouTube recommendation algorithms act as an enabler for it, and YouTube's failure to curate content more carefully is a missed opportunity, it's socially irresponsible and it's negligent, but let's not pretend that people who watch these videos are automatons being programmed with hateful opinions by Google's money-maximizing computers. Everyone has a conscious choice about whether to watch a video that is recommended to them, and has a conscious choice about whether to adopt the opinions in them as their own. I've seen and read a lot of articles about "the Pewdiepie pipeline" and how hapless youngsters have "fallen down an alt-right rabbit hole" and the like as if it's just something that happens to you. How about actually holding people personally responsible for the opinions they adopt?
posted by L.P. Hatecraft at 6:15 PM on August 13, 2019 [3 favorites]


> "Notably, one person on the linked HN thread put up an argument that "the problem is not the algorithm/it just selects for engagement""

Well it doesn't help that, despite a long and steady diet of vintage radio / model engineering / electronics / cat videos, YouTube still tries to 'engage' me with top recommendations & up next's of Joe Rogan, Jordan Peterson, Ben Shapiro, random Russian-language anti-"PC" propaganda and, more frequently recently, Epstein conspiracy videos.
posted by Pinback at 6:17 PM on August 13, 2019 [25 favorites]


One other thing that I've been thinking about is the role of CDA Section 230 in all of this. The text of it says that
"No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider"
This has been interpreted (I think) to mean that FB and YT should be treated more like the phone company or a fibre-optic networking provider when it comes to the content on their platforms. But, both FB and YT have been actually behaving a lot more like publishers than neutral platforms: Facebook at least since they got rid of the chronological timeline, YouTube at least since they started giving video recommendations. The fact that their editing and curation is mainly algorithmic is irrelevant. I can't help but wonder if the mere acknowledgement that they're actually in the publishing business (and then all the attendant responsibilities they would incur) is so risky to their fundamental operations that they can't bring themselves to even consider this?
posted by mhum at 6:17 PM on August 13, 2019 [14 favorites]


I completely disagree that there is not intentional programming being done here, and as the comments above indicate everyone gets exposed to the fishing attempts. Some vulnerable people get sucked in. Twitter thread about similar.
posted by pilot pirx at 6:33 PM on August 13, 2019 [6 favorites]


Fran 'franlab' Blanche did some research on her YouTube channel a while back, and IIRC she published a video noting that YouTube pushed her handful of videos ranting directly into the camera about her issues with YouTube measurably harder than her usual lab/show-and-tell videos. Make of that what you will.
posted by zaixfeep at 7:23 PM on August 13, 2019 [2 favorites]


Also interesting to remember corp sibling Google fired James Damore for publishing his views internally - views I suspect YT algos would push if he vlogged them as a non-employee.
posted by zaixfeep at 7:27 PM on August 13, 2019 [3 favorites]


let's not pretend that people who watch these videos are automatons being programmed with hateful opinions by Google's money-maximizing computers.

Except that people lacking direct evidence make decisions based on a variety of reasons including authority and consensus. Lots of videos are designed to look authoritative, and by pushing a steady stream of the same thing you create the appearance of consensus.
posted by CheeseDigestsAll at 7:40 PM on August 13, 2019 [14 favorites]


People are not robots. People are people. We like sugar, we like drama, we like tribalism. We hate strangers, we hate change, we hate effort. Any corporation which passes along product without regard to the way people actually are ends up feeding our basest urges. One cannot say, "people shouldn't be that way." We are that way. To refuse to recognize that basic fact is an abdication of moral responsibility.

Fortunately, there's another side to us. We want to be better. We want to overcome our hate, our lassitude, our appetites. To recognize that, to provide us content based upon that, would be an acceptance of moral responsibility.

There is no platform out there which follows the latter course rather than the former. Other than Metafilter of course.
posted by mono blanco at 7:44 PM on August 13, 2019 [8 favorites]


A spokesman for YouTube confirmed the Times’ findings, calling them unintended, and said the company would change how its search tool surfaced videos related to Zika.
Patient is bleeding from multiple arteries; Youtube agrees that the scratch on the elbow isn't good, and says it will go find a bandage.
posted by clawsoon at 7:49 PM on August 13, 2019 [10 favorites]


I remember when sites were first doing recommendation challenges. They'd provide an anonymized data set, and your challenge as a programmer was to come up with an algorithm that'd do the best job of matching what someone had already liked with what they might like to watch next.

It seemed like a fun little exercise at the time, though I never had the chops to participate. But then some researchers figured out how to de-anonymize most of the data (bad!) and now the world is being taken over by fascists (very bad!)
posted by clawsoon at 7:53 PM on August 13, 2019 [2 favorites]


YouTube still tries to 'engage' me with top recommendations & up next's of Joe Rogan, Jordan Peterson, Ben Shapiro, random Russian-language anti-"PC" propaganda and, more frequently recently, Epstein conspiracy videos

True story: Despite never logging in to YT, switching VPN endpoints, and disabling cookies and as much JavaScript as I can, YT will inevitably recommend a specific 'Rodney Dangerfield on the Carson show' video near the top. Every. Darn. Time.

I get no respect.
posted by zaixfeep at 7:53 PM on August 13, 2019 [13 favorites]




FWIW, my yt feed contains almost nothing objectionable. On occasion it does, but it's not common. The only real curiosity upon scanning right now is it offering Dr Who as a topic. I can't remember ever watching anything related to Dr Who. But it does a pretty good job of finding stuff I might find interesting. Which I think is the sad truth about YT: it's pretty good at matching people who seek garbage with garbage.

Curation is great. Just be careful what you wish for. That'll require judgement calls by actual people. There's a whole wingnut caravan dedicated to crying about how the whole internet is biased against them, and they're in full browbeat mode to get kid glove treatment from all the usual suspects. I won't be surprised to find, after all the moaning and groaning about the algorithm from the left, we get instead an army of effort from the social media companies to manually tip the scales in favor of conservative assholery as a result.
posted by 2N2222 at 8:08 PM on August 13, 2019 [3 favorites]


2N2222: Which I think is the sad truth about YT: it's pretty good at matching people who seek garbage with garbage.

Unfortunately, it can be difficult to be inoculated against well-crafted garbage, especially when you're hitting the age when you're starting to "think for yourself" and question whether the stuff that your parents and teachers told you is really true.
The videos appeared to rise on the platform in much the same way as extremist political content: by making alarming claims and promising forbidden truths...
I remember the thrill I got at that age from reading one academic take down the theories of another academic; revisionism, whether it's true or not, is exciting. I get to be right, where the previous generation was wrong!

I'm probably lucky I didn't have Youtube at the time.
posted by clawsoon at 8:16 PM on August 13, 2019 [10 favorites]


Well it doesn't help that, despite a long and steady diet of vintage radio / model engineering / electronics / cat videos, YouTube still tries to 'engage' me with top recommendations & up next's of Joe Rogan, Jordan Peterson, Ben Shapiro, random Russian-language anti-"PC" propaganda and, more frequently recently, Epstein conspiracy videos.

I got the same thing up until two months ago or so... and then it just stopped. I have no idea why. My assumption is they updated the algorithm.

But while it was happening it was really odd. Odd enough it caused some self-doubt like 'what am I doing to result in these matches?' Is it the NFL highlights? Car review videos? Is it just demographics (white, male, unmarried tech worker...)?

Even now that the recommendations have returned to normal fare (it even recommends John Oliver) I still have a side-eye on YouTube. Also, I'm very happy to see others pushing back against the HN comment - that was just the kind of bonkers thought process that got us into this mess.
posted by elwoodwiles at 8:52 PM on August 13, 2019 [2 favorites]


One trick that's helped me maintain a tolerable youtube feed is to watch everything in incognito. Bring it up in normal but then right click and actually watch incognito.

Pretty sure google cheats about that, but it helps anyhoo
posted by GCU Sweet and Full of Grace at 9:25 PM on August 13, 2019


Curation is great. Just be careful what you wish for. That'll require judgement calls by actual people.

You mean like every other commercial streaming service?
posted by benzenedream at 10:32 PM on August 13, 2019 [1 favorite]


“The American companies are very poorly equipped to understand or manage it.”
I think they understand it very well - someone who’s mesmerized by extreme content or addicted to watching conspiracy videos 6 hours a day is someone who can be served paid advertisements for 6 hours a day. Me, someone who watches a cat video every few weeks, is not their ideal customer.
posted by cricketcello at 10:39 PM on August 13, 2019


a while back some guy named cortex posted an analogy that's really stuck with me. iirc he was talking in particular about how twitter and facebook work, but it can be extended to refer to the youtube recommendation algorithm as well. what he said was something like (this is a loose paraphrase here) "say you want to get acorns out of trees. you decide to do this by putting some black powder at one end of a hollow metal tube containing a metal slug, rigging up a device to ignite that powder when you pull a trigger, and then pointing the tube at an acorn-bearing tree branch and pulling the trigger."

although this is in fact a device that can get acorns out of trees, you haven't invented an acorn fetcher. you've invented a gun. when someone uses the gun to shoot another person instead of to shoot acorns out of trees, you can't just stand back and say "well look this technology has nothing to do with that person getting shot, the technology is neutral and intended for use in acorn fetching, it's not my fault that some fool came around and used it to shoot a person."

youtube's algorithm as designed and implemented isn't a device for surfacing videos that people might find interesting based on their previously viewed videos. it's a device for making fascists, that just happens to have been originally intended to surface videos that people might find interesting based on their previously viewed videos.
posted by Reclusive Novelist Thomas Pynchon at 11:10 PM on August 13, 2019 [44 favorites]


The longer the tail of the distribution -- lots of people who watch a few hours a week, a few who watch dozens of hours -- the more incentive the system has in converting people to addicts. If they make their money via total number of hours of videos/ads consumed, then presumably the alg would rather convert five extra people to intense youtube addicts than to try to nudge 100 people into consuming just a bit more, or making those 100 people happier with what they find (which has no direct value whatsoever). So it's probably much more efficient in aggregate to fish for rare addicts with the tried-and-true conspiracy drugs than try to boost mass engagement, even at the cost of diminishing the browsing pleasure of those masses.

This also seems to be more general than just right-wing cocaine. When we briefly allowed our toddler to watch youtube videos, youtube would of course recommend piles of brightly colored, nearly machine-produced garbage. It sort of worked for a while, but the toddler's interests weren't totally hooked and more importantly, we saw what was happening and cut off youtube permanently. I don't think that stuff was a universal drug for all toddlers though -- there's a co-evolution between the producers churning out vast quantities of machine-generated videos and youtube's algorithms, both working together to fish through toddler mind-space to capture not the median toddler, but a sufficient subset of addictive/unprotected toddlers to drive up total views. The algorithms plus the unseen producers together felt like some vast, unconscious Peter-Watts-like intelligence constantly probing for mental vulnerabilities to exploit, with no concern whatsoever for boosting happiness, learning, or even view count by the average viewer. And since we all have some vulnerabilities, it felt like the longer you left it running, the more likely it was to find your weakness. Burn it with fire.

Anyway, that's all just speculation, but if it's true, one fix would be to force youtube to base ad revenue on total unique individuals instead of total views, which would disincentivize it from trying to maximize total views via a small set of obsessive viewers. What's frustrating is that this probably makes sense from an advertising point of view -- obsessive viewers are probably already saturated -- but since advertising is complete cargo-cult psychology, it's hard to make rational arguments to ad buyers or sellers.
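
To put rough numbers on that incentive (entirely made up, just to show its shape), here's the comparison under a total-hours metric versus a unique-viewers metric:

    # made-up audience: 1,000 viewers at ~3 hours/week each
    audience = {f"user{i}": 3.0 for i in range(1000)}

    def total_hours(viewers):
        return sum(viewers.values())

    # Strategy A: nudge 100 casual viewers into watching 20% more
    a = dict(audience)
    for i in range(100):
        a[f"user{i}"] *= 1.2

    # Strategy B: convert just 5 viewers into 40-hour/week obsessives
    b = dict(audience)
    for i in range(5):
        b[f"user{i}"] = 40.0

    print(total_hours(a) - total_hours(audience))  # ~ +60 hours
    print(total_hours(b) - total_hours(audience))  # ~ +185 hours
    # By total hours watched, B wins handily; by unique viewers reached, the two are
    # identical -- which is why billing per unique individual would blunt the incentive
    # to fish for addicts.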
posted by chortly at 11:33 PM on August 13, 2019 [8 favorites]


Except that people lacking direct evidence make decisions based on a variety of reasons including authority and consensus. Lots of videos are designed to look authoritative, and by pushing a steady stream of the same thing you create the appearance of consensus.
That assumes that the problem with fascists is that they've just been fed bad information. But is that really it? Perhaps I'm being naive, but it seems like it would take more than just believing certain falsehoods to turn me into someone who believes that minorities should be denied basic human rights. Fact/value distinction etc. A lot of criticisms of Youtube (e.g. at the Hacker News link) say that the problem is that people haven't been educated about how to interact with this new technology. But that rings hollow too. If you catch a computer virus, or someone scams you via a phishing email, that seems like something due to computer-related incompetence. If someone convinces you to be a racist via a YouTube video, that's not a technical malfunction or technical incompetency, it's a problem with the viewer.
youtube's algorithm as designed and implemented isn't a device for surfacing videos that people might find interesting based on their previously viewed videos. it's a device for making fascists, that just happens to have been originally intended to surface videos that people might find interesting based on their previously viewed videos.
Why is this property of Youtube biased towards right-wingers? It seems like all the negative things that people have mentioned about it could be just as easily exploited to propagandize people into supporting left-wing viewpoints. I don't know what the exact algorithm Youtube uses is, but I've seen what is used for recommendation engines on other websites and it's usually something related to the Jaccard Index: the algorithm finds another user with a large overlap in viewing history to your own and then recommends a video that they have watched but you haven't. What is right-wing about that? Why wouldn't that be equally likely to promote left-wing videos to right-wingers?
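
For what it's worth, a Jaccard-style recommender really is that simple and that content-blind. A minimal sketch, with invented viewing histories (obviously not Youtube's actual code):

    def jaccard(a, b):
        a, b = set(a), set(b)
        return len(a & b) / len(a | b) if (a | b) else 0.0

    def recommend(my_history, other_histories):
        # find the other user whose history overlaps mine the most,
        # then suggest something they watched that I haven't
        best = max(other_histories, key=lambda h: jaccard(my_history, h))
        unseen = [v for v in best if v not in my_history]
        return unseen[0] if unseen else None

    me = ["knife_sharpening", "cast_iron_care", "camp_cooking"]
    others = [
        ["knife_sharpening", "cast_iron_care", "bushcraft_basics"],
        ["pop_hits", "celebrity_gossip", "makeup_tutorial"],
    ]
    print(recommend(me, others))  # bushcraft_basics

The similarity math doesn't know or care what any of the videos are about; whatever skew comes out reflects the co-viewing patterns already in the pool it matches against.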
posted by L.P. Hatecraft at 2:03 AM on August 14, 2019 [2 favorites]


So many problems in society break down to “human beings are flawed and predictably behave in ways that they themselves would agree are flawed” and “we are unwilling to deal with the idea of humans as flawed creatures” because that’s complicated and messy and it’s easier to just say “the issue/technology/law is amoral and humans have free will and just need to decide to be less flawed” and the problem fixes itself.

Yeah, most of us are aware that YouTube videos/Facebook memes/whatever are subjective and manipulative, but they are designed to engage us on an emotional level more than an intellectual level, and emotional truth is deceptively difficult to be consciously aware of at all times. These are known, non-controversial facts. And we could design human-centric systems that take into account this known flaw in human behavior, but instead we design systems that say “this delivered item generated emotional engagement, let’s now try to deliver something that generates even more emotional engagement.”

And now my American college educated mother believes in her heart of hearts that Bill Clinton had Jeffrey Epstein killed.
posted by Slarty Bartfast at 3:14 AM on August 14, 2019 [9 favorites]


> "The longer the tail of the distribution -- lots of people who watch a few hours a week, a few who watch dozens of hours -- the more incentive the system has in converting people to addicts.

That actually … makes sense.

In my case I subscribe to about a dozen channels, one or two of which I'll watch every video. Most of the rest will drift out of my subs & be replaced by something similar, and my total watch time probably averages something like 10~30 minutes per day.

I'd bet that the kinds of things it recommends to me have high viewer-to-subscriber conversion rates, high "watch all videos"/rewatch/sharing rates, longer run times, and rank much higher by some measures of "stickiness", and the purpose is to increase my "engagement".
posted by Pinback at 3:16 AM on August 14, 2019


The way the recommendation algorithm on Youtube works can certainly be infuriating. I made the mistake once of looking up a couple of videos of Jordan Peterson because I’d never heard of him and was curious and voilà, suddenly got all sorts of recommendations from Prager U and all the connected "alt-right" sources and it took a while to get rid of that (simply by watching other stuff, but I still get recommendations for the Peterson channel). I don’t see anything deliberate in it other than pushing the videos that get more attention and wanting you to get addicted to keep clicking. It happened with far more harmless stuff like watching one skincare routine video and suddenly being flooded with recs for videos about skincare, or watching one interview with a celebrity and getting all sorts of recs for celebrity gossip channels.

So it very well may be that the mechanism itself is politically neutral, it’s just infuriatingly designed to hook you into watching whatever is connected and got more views. The problem is the content that’s allowed to spread.

And it seems that might be an even bigger issue in Brazil, with a far lower threshold for being hooked into a far-right chain of sources. From reading the article, it sounds a LOT worse than the experience with English-language Youtube. I’ve noticed this on other localized versions of the platform too. It could be that there’s less oversight of content created in other languages, and that more extreme content is allowed on non-English-language Youtubes?

I also found this Twitter thread from one of the authors of the NYT report on Brazil interesting:
Crucially, the algo is linking the channels together, creating an ecosystem where none had existed. Its pathways just happen to be ideal radicalization vectors. It’s not about any political agenda at YouTube. Incremental extremism — the rabbit hole — is just what works. But it goes beyond indulging preexisting impulses. It trains users to have them.
posted by bitteschoen at 4:54 AM on August 14, 2019 [5 favorites]


the more incentive the system has in converting people to addicts
The whole point is engagement. It's the explanation that doesn't even require malevolence from YT, just mawkish naivete about what people find most engaging, why, and what culpability they have in pushing it.

it's pretty good at matching people who seek garbage with garbage.
That may be your experience, but lots of people in this thread and elsewhere have pointed out some of the ways YT pushes the garbage, despite or in very orthogonal relationship to a user's search or watchlist history. Are they all "seeking garbage," or is it possible that your experience with the "service" and theirs are different for other reasons?

God forbid you watch a country song or a knife-sharpening video while logged into google - it's Jordan Peterson and Florida Georgia Line forever now.

(And yes, only watching YT from a browser in private mode is one good way to keep your feed unpolluted, but that is very much not how most YT content is consumed).
posted by aspersioncast at 5:12 AM on August 14, 2019 [4 favorites]


So many problems in society break down to “human beings are flawed and predictably behave in ways that they themselves would agree are flawed” and “we are unwilling to deal with the idea of humans as flawed creatures” because that’s complicated and messy and it’s easier to just say “the issue/technology/law is amoral and humans have free will and just need to decide to be less flawed” and the problem fixes itself.
But isn't it reasonable to want to avoid that messiness? In a philosophical sense I can admit that free will is mostly an illusion and that people are a product of their environment and so on, but I don't exist in an abstract philosophical world. People who want to push this view IRL in a political context are obvious enemies of anyone who cares about their intellectual freedoms. Why would any sane person want to admit this when it is just going to be used to advocate for further state control of political discourse/censorship and so on? How exactly would you support the Hong Kong protesters, for example?
posted by L.P. Hatecraft at 5:25 AM on August 14, 2019


One other thing that I've been thinking about is the role of CDA Section 230 in all of this. The text of it says that

"No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider"

This has been interpreted (I think) to mean that FB and YT should be treated more like the phone company or a fibre-optic networking provider when it comes to the content on their platforms. But, both FB and YT have been actually behaving a lot more like publishers than neutral platforms...


Section 230 doesn't contain a hard dichotomy between being a conduit and a publisher. It envisions that platforms will actually moderate and curate content. (I believe that the idea that being a publisher of information makes you liable for what that information does is not universally true--it's true in the common law of defamation, but there's all sorts of speech acts that can be illegal.) The section immediately following the above-quoted one says:
No provider or user of an interactive computer service shall be held liable on account of—

(A) any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected; or

(B) any action taken to enable or make available to information content providers or others the technical means to restrict access to material described in paragraph (1).
The reason it's called CDA--the Communications Decency Act--is that it was part of a package that was supposed to get smut off the Internet by encouraging content filtering. In fact, the section's title is "Protection for private blocking and screening of offensive material." But it was *also* a reaction to cases like Stratton Oakmont v. Prodigy, where a court found Prodigy liable for defamation when one of its users wrote a post accusing Stratton Oakmont (the "Wolf of Wall Street" firm) of fraud. The court decided that, since Prodigy did *some* content moderation, it should have moderated enough to prevent the defamation. Congress disagreed, and passed Section 230 specifically because it wanted platforms to be better about getting rid of (perfectly legal) adult content and other "objectionable" content without courts then saying, "well, you removed all these female-resenting nipples, so why didn't you remove this thing defamatorily accusing Bigshot Exec of sexual assault?"

That doesn't immunize a platform from saying things itself, and there's an interesting question as to when curation becomes a sort of speech in itself. But 230 doesn't create an "immunity" so much as just say "you're not the one saying the stuff in the poster's post."

The reason that's important is the question of whether the post itself is illegal. In many cases, vile stuff is completely legal because of the First Amendment, meaning that getting rid of or substantially altering 230 doesn't do anything to change the platform's incentives.

Case law on 230 is already drawing the boundaries of when arrangement of users' speech is an additional speech act or other activity in itself; those things fall outside of its scope.
posted by pykrete jungle at 5:25 AM on August 14, 2019 [3 favorites]


Re algorithms tuned to addiction: this may be because they're calibrated for the obsessives. If the people going down a rabbit hole are the people exposed to/engaging with the most ads, they're the whales the algorithm wants. So the vast majority of people who don't want alt-right survivalists just because they wanted a damn tutorial on sharpening their kitchen knives get recommended that stuff anyway.

But see this article that talks about why there's these weird subgenres cropping up everywhere in porn: they're the ones that appeal to the small number of people who pay for it, while the larger mass of people just ignore the weirdness:
This “consumer” vs. “customer” division is key to understanding the use of data to perpetuate categories that seem peculiar to many people both inside and outside the industry. “We started partitioning this idea of consumers and customers a few years ago,” Adam Grayson, CFO of the legacy studio Evil Angel, told AVN. “It used to be a perfect one-to-one in our business, right? If somebody consumed your stuff, they paid for it. But now it’s probably 10,000 to one, or something.”

There’s an analogy to be made with US politics: political analysts refer to “what the people want,” when in fact a fraction of “the people” are registered voters, and of those, only a percentage show up and vote. Candidates often try to cater to that subset of “likely voters”— regardless of what the majority of the people want. In porn, it’s similar. You have the people (the consumers), the registered voters (the customers), and the actual people who vote (the customers who result in a conversion—a specific payment for a website subscription, a movie, or a scene). Porn companies, when trying to figure out what people want, focus on the customers who convert. It’s their tastes that set the tone for professionally produced content and the industry as a whole.
posted by pykrete jungle at 5:47 AM on August 14, 2019 [5 favorites]


I do feel like I've noticed a change in Youtube recommendations recently, though I can't remember when it started. For a while there, like aspersioncast said, I'd watch a single knife-sharpening video (or, god forbid, an MMA recap) and for the next couple of weeks my recommendations would be full of OWNING the LIBS!! and DESTROYING the FEMINISTS!!

Perhaps it's my recent splurge of Black gospel that has changed my recommendations, or perhaps Youtube has made a change; either way, I'm not seeing that stuff right now. Let's watch a Ben Shapiro at Liberty University video and see what happens next...
posted by clawsoon at 6:11 AM on August 14, 2019 [1 favorite]


My YT feed has very little objectionable[*] in it, and I occasionally watch knife sharpening videos (or at least that one japanese guy that makes knives out of jello and stuff).

Interestingly enough, the main youtube page for me right now has several features of note:
1) it had displayed a little light blue box with a video i'd recently watched, asking for a rating on the recommendation.
2) it has a bunch of recommended channels for me, and clicking the little grey X in the right corner causes it to say "Got it. We'll tune your recommendations." This does cause an XHR action back to google when you click it, sending a data blob back to '/service_ajax?name=feedbackEndpoint'. So, maybe they do something with that information (unlike Facebook, where clicking the "show me less of this" does actually nothing at all beyond temporarily hiding the element in your browser. it's a no-op and sends no data back.)

[*] scrolling all the way to the bottom of the recommendations does recommend Joe Rogan, though. I did my part and dismissed that recommendation, so hopefully the liberal side of youtube will now be slightly more biased away from Joe Rogan, and more toward Matchbox car restoration videos. (oh man, are those oddly soothing.)
posted by Xyanthilous P. Harrierstick at 6:34 AM on August 14, 2019


Result of Ben Shapiro experiment: Some Ben Shapiro, some Liberty University, and some Joe Rogan ("What Really Happened to Jeffrey Epstein?"), scattered amongst a bunch of other random stuff.

One dose of that was enough for today - I've heard all the arguments before, and they haven't gotten any more interesting - so I'll let somebody else continue the experiment.
posted by clawsoon at 6:47 AM on August 14, 2019


Metafilter: female-resenting nipples
posted by aspersioncast at 7:12 AM on August 14, 2019 [3 favorites]


Just to reinforce, I think the way you engage YT probably does have a lot to do with your recommendations, and that very much (perhaps even most importantly) includes your platform/browser.

Incidentally, my second "recommended" link right now is also Joe Rogan "what really happened to Jeffrey Epstein."
posted by aspersioncast at 7:19 AM on August 14, 2019


Fortunately, there's another side to us. We want to be better. We want to overcome our hate, our lassitude, our appetites. To recognize that, to provide us content based upon that, would be an acceptance of moral responsibility.

Youtube as a content platform cannot take moral responsibility but can definitely recommend a lecture or two hundred from influential Prof. Peterson expounding about moral responsibility and how the left is failing at it and destroying free speech.
posted by bitteschoen at 7:52 AM on August 14, 2019 [5 favorites]


I wonder how much the keywords that appear during your browsing, including those on this page, affect the YouTube algorithm. It's all Google, tracking everything, so why wouldn't they use that data for YouTube recs? And it's not like we're advocating for right wing nonsense on Metafilter, just discussing the state of the world right now, which inevitably means talking about right wing nonsense, but the keywords without context wouldn't look great.
posted by jason_steakums at 8:00 AM on August 14, 2019 [1 favorite]


I like reading the news google delivers to my phone. Lately, there have been several links to completely unreliable, lying, fascist sites. I can easily tell Google I don't want more of that, but they are famously difficult to communicate with, and there's no way to tell them they shouldn't be offering those links at all, to anyone.

The Google news they push to my phone is very echo chamber-y. Click one link to learn about the Yellowstone caldera, and you get inundated with nonsense for at least a year. They are generating and reinforcing dangerous ignorance. Also, I liked it when ads were labeled, but they don't do that in the news feed, sketchy AF. They have vast reserves of money; there's no excuse for their level of irresponsibility.
posted by theora55 at 8:01 AM on August 14, 2019 [5 favorites]


> I wonder how much the keywords that appear during your browsing, including those on this page, affect the YouTube algorithm. It's all Google, tracking everything, so why wouldn't they use that data for YouTube recs? And it's not like we're advocating for right wing nonsense on Metafilter, just discussing the state of the world right now, which inevitably means talking about right wing nonsense, but the keywords without context wouldn't look great.

well if that's the case i would like to say a few words:

karl marx capital das kapital communist manifesto rosa luxemburg antonio gramsci dsa democratic socialists of america socialism democratic socialism anarchosyndicalism anarchosocialism anarchism antifa murray bookchin industrial workers of the world iww gene sharp from dictatorship to democracy contrapoints natalie wynn....
posted by Reclusive Novelist Thomas Pynchon at 8:40 AM on August 14, 2019 [6 favorites]


The authors of the NYT article are doing an Ask Me Anything on Reddit right now.
posted by Nelson at 9:43 AM on August 14, 2019 [2 favorites]


Related: 'We can't reach the women who need us': the LGBT YouTubers suing the tech giant for discrimination.
The idea that YouTube might be discriminating against LGBT content creators is not new. In March 2017, #YouTubePartyIsOver was trending on Twitter after several prominent YouTubers complained that videos of same-sex couples exchanging vows and makeup tutorials for trans women were being age-restricted. In September 2017, the bisexual author and comedian Gabby Dunn tweeted that all the LGBTQAI content on her channel had been demonetised while the heterosexual content had not.
posted by adamvasco at 10:09 AM on August 14, 2019 [11 favorites]


Thanks, Nelson, that Max Fisher and Amanda Taub AMA is REALLY worth reading.
posted by PhineasGage at 10:20 AM on August 14, 2019


The problem is the UI/UX has no way to explicitly say “never show me this again”.

You can literally do this by clicking the 3 dots to the right of any video title on your recommendations. It lets you block the entire channel. You tell it "not interested" and then you can go further by clicking "tell us why" and you can tell them you don't want anything from that channel.
posted by Dark Messiah at 10:32 AM on August 14, 2019 [5 favorites]


Will Sommer: James O’Keefe’s Google ‘Whistleblower’ Loves QAnon, Accused ‘Zionists’ of Running the Government (emphasis mine)
Right-wing provocateur James O’Keefe published his latest video on tech giants on Wednesday, touting an interview with former YouTube software engineer and self-proclaimed “whistleblower” Zach Vorhies. In the video, Vorhies claims that Google’s search algorithms are riddled with political bias, and touted a cache of internal Google files he alleges prove his case.
[...]
What O’Keefe’s video leaves out, though, is that his much-hyped insider is not as credible as he claims. On social media, Vorhies is an avid promoter of anti-Semitic slanders that banks, the media, and the United States government are controlled by “Zionists.” He’s also pushed conspiracy theories like QAnon, Pizzagate, and the discredited claim that vaccines cause autism.
[...]
YouTube, where Vorhies worked until recently, has been criticized for a video recommendation algorithm that can push users towards more extreme content — and thus towards radicalization and conspiracy theories.

Vorhies himself saw YouTube as a reliable source of evidence for his conspiracy theories, according to one tweet he sent promoting anti-vaccine paranoia. In a January tweet, he urged other Twitter users to look up anti-vaccine information on YouTube.

“Don’t take my word for it—study it for yourself,” Vorhies wrote. “See the testimony of countless parents testifying on social media (e.g. YouTube).”
posted by zombieflanders at 11:54 AM on August 14, 2019 [6 favorites]


That assumes that the problem with fascists is that they've just been fed bad information. But is that really it? Perhaps I'm being naive, but it seems like it would take more than just believing certain falsehoods to turn me into someone who believes that minorities should be denied basic human rights.... If someone convinces you to be a racist via a YouTube video, that's not a technical malfunction or technical incompetency, it's a problem with the viewer.

I'd bet that in the majority of cases where this happens, it's a combination of things: there's potential bugs in the viewer's value system and evaluative systems, things that can be exploited by propagandists.

Apparently this is borne out in some of the social/psych research from Lakoff to Haidt. For example, disgust sensitivity is apparently a pretty good predictor of racism, and it's speculated that this might even be related to past-adaptive behavior; if you're a pre-Columbian Native American it sure looks reasonable in retrospect for you to regard European colonists as contaminated foreigners. Or high degrees of conscientiousness might well predict how likely you are to subscribe to a political conservatism that's essentially the fundamental attribution error writ large, whereas high degrees of empathy might mean you are less likely to do that.

The problem with focusing on "a problem with the viewer" is that whether you say people are "bad" or "buggy", there's only two solution classes. One is get rid of them (execution, imprisonment, or some other form of isolation/hobbling), the other is trying to change their nature. Neither of these approaches is trivial, and where the latter is possible it's essentially a matter of trying to change their inputs and environment... which means we're back to the matter of what YouTube feeds them, among other things.

Maybe there are ways to patch buggy/exploited people at a person-to-person level, but at the risk of falling into techbro talk, that doesn't scale well. The problem of fixing problems with the viewers kind of is fixing the problem with what they view, with their inputs. Patches that scale have to involve fixing the attack vector.



The explanation I saw was that the algorithm looks for patterns of This Video, Then That Video among users, and that this is incredibly easy for a bad actor to game by getting large numbers of users to watch a popular video followed by the bad content they're boosting, and that creates a connection for the algorithm to suggest to others. I'd hope that explanation is wrong and it's not that stupidly simple, because jesus that's dangerous.

Maybe a bunch of us need to watch some Jordan Peterson videos and then follow them with Jordan Peterson doesn't understand postmodernism or even The Alt Right Playbook?
posted by wildblueyonder at 12:26 PM on August 14, 2019 [3 favorites]


I don't have the time or the Photoshop skills to do this, so everyone just use your imagination: The YouTube logo with Woody Guthrie's "This machine kills fascists" slogan on it, except it reads "This machine makes fascists."
posted by mhum at 1:06 PM on August 14, 2019 [3 favorites]


> Maybe a bunch of us need to watch some Jordan Peterson videos and then follow them with Jordan Peterson doesn't understand postmodernism or even The Alt Right Playbook?

tell me why this can't be automated.

also tell me why i shouldn't think that cambridge analytica or the internet research agency or freelance nazis or whoever has already automated it in the other direction.
posted by Reclusive Novelist Thomas Pynchon at 1:13 PM on August 14, 2019 [2 favorites]


also tell me why i shouldn't think that cambridge analytica or the internet research agency or freelance nazis or whoever has already automated it in the other direction.

No proof, but it's likely been weaponized. Bad actors have figured out how to game the system so white nationalist propaganda is almost always a few clicks away.
posted by ryoshu at 2:43 PM on August 14, 2019 [6 favorites]


seriously everyone, admiral adama was right. never network the computers. not even once.
posted by Reclusive Novelist Thomas Pynchon at 2:49 PM on August 14, 2019 [5 favorites]


Which I think is the sad truth about YT: it's pretty good at matching people who seek garbage with garbage.

Nah I will also back up the idea that YouTube was at one point very aggressive about suggesting certain kinds of right-wing videos. Though I think it was less that one would get right-wing videos from any starting point than that the suggestions for any political video would skew right... and then (and this is sort of key) if you clicked on one or two you absolutely could not ever get rid of them.
posted by atoxyl at 4:44 PM on August 14, 2019 [2 favorites]


I continue to believe that "the algorithm" is fundamentally the battlefield, though. People did this, they used this platform as part of a deliberate political strategy. I'm not saying this to make any particular point about Google's moral responsibility, but to make one of political strategy - I think expecting the people who run the platforms to solve this is going to be at best a perpetual late-barn-door-closing.
posted by atoxyl at 5:06 PM on August 14, 2019 [2 favorites]


citing the algorithm as if it is some impartial piece of math is just so damn lazy. I'm pretty sure I could write a recommendation engine that doesn't look at tags or strings, only does numerical comparisons on various viewing metrics, is completely ignorant of context, and still ends up recommending 99% jordan peterson / white nationalist / gamergate videos.

It's the same problem as claiming your ex nihilo machine learning loan default predictor isn't racist, just because you trained it on racist historical loan default data.
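
Something like this toy ranker, say (all numbers invented) -- it never reads a title or a tag, only historical engagement, and it still reproduces whatever boost was already baked into the logs:

    import random
    random.seed(0)

    # hypothetical historical logs where one cluster of channels was already being pushed
    # hard (autoplay, brigading, earlier recommendations), so its completion rates are
    # inflated before this "neutral" ranker ever sees them
    logs = (
        [("cluster_a", random.uniform(0.70, 0.95)) for _ in range(5000)]   # the boosted stuff
      + [("cluster_b", random.uniform(0.20, 0.60)) for _ in range(5000)]   # everything else
    )

    def score(cluster):
        fracs = [f for c, f in logs if c == cluster]
        return sum(fracs) / len(fracs)   # purely numeric: mean watch-completion rate

    print(sorted({"cluster_a", "cluster_b"}, key=score, reverse=True))
    # ['cluster_a', 'cluster_b'] -- the already-boosted cluster wins, no tags consulted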
posted by benzenedream at 6:08 PM on August 14, 2019 [3 favorites]




That assumes that the problem with fascists is that they've just been fed bad information. But is that really it? Perhaps I'm being naive, but it seems like it would take more than just believing certain falsehoods to turn me into someone who believes that minorities should be denied basic human rights....

When I was 12 I read Gone With the Wind. I said to my mom, "Slavery was a good thing and it was terrible that they got rid of it! All those poor slaves abandoned to fend for themselves!" My mom looked at me in shock and set me straight. We weren't a conservative household by any means and my mom's best friend and business partner was black. But GWTW painted a picture of what slavery was that was more like on-site workers than the reality.

I shudder to think what would have happened if I had access to YouTube at that age.
posted by rednikki at 7:07 AM on August 16, 2019 [3 favorites]




This thread has been archived and is closed to new comments