the invisible code
February 3, 2018 2:42 AM   Subscribe

'Fiction is outperforming reality': how YouTube's algorithm distorts truth. An ex-YouTube insider reveals how its recommendation algorithm promotes divisive clips and conspiracy videos. Did they harm Hillary Clinton’s bid for the presidency?
posted by fearfulsymmetry (41 comments total) 22 users marked this as a favorite
 
Recent development: Yesterday, YouTube announced on its blog that it will now start identifying videos/channels funded in whole or in part by governments.

Predictably, a lot of the comments on that article are conspiratorial, but what's always interesting is that people's conspiracy theories generally argue that the left/liberal end of the political spectrum is privileged over everything else -- so even though, as this article discusses, the data shows the exact opposite, people still think YouTube is a liberal SJW bastion in George Soros's pocket (or whatever).
posted by subversiveasset at 3:19 AM on February 3, 2018 [20 favorites]


In 2018 it is frightening but not shocking that we have a largely unmoderated AI at YouTube creating our hyper-reality for us.

And the (implied) excuse seems to be "Hey, we are just trying to make money; it is not on purpose that we are pushing this skewed and unhealthy shit on people." Oh, it is all in service to capitalism? Pardon me, wouldn't want to get in the way of the American Dream! Carry on!

When do we reach the tipping point? Will we have the chance to burn it all down, or is it already too late?
posted by Meatbomb at 4:01 AM on February 3, 2018 [28 favorites]


I just find the YouTube front page a really unsatisfactory user experience. The Recommended list is never what I really want; I prefer actual surprises and diversity and substantial content. But after watching one Friends clip, for example, the list gets completely invaded by more Friends clips, until I get fed up and manually intervene. It's analogous to junk food.
posted by polymodus at 4:10 AM on February 3, 2018 [22 favorites]


I've actually learned to open certain or unknown YT links in private windows just to keep my "recommended" videos more in line with what I want. All it takes is accidentally clicking on some stupid viral or Vine comp video, and then that crap dominates my recs for weeks or months.

I also use alternate accounts focused on, say, music research to keep the recommendations focused on music. The subscriptions and recommended videos on my music-focused account are a glorious, beautiful thing that's completely lacking shitty viral videos and memes, and 99.9% of the recommendations are music.
posted by loquacious at 4:24 AM on February 3, 2018 [20 favorites]


It's a self-reinforcing bubble: you click on one and it rabbit-holes you into more of the same. It's a good litmus test, though; looking at someone's recommended list is like reading the spines on their bookshelf. I have a gamer friend, and his was recommending Alex Jones and a bunch of gun-porn videos.

Mine recommends TED talks, numberphile, robotics, Simone Giertz, music. Then sometimes it completely misreads the room, like this time I screencapped when it went from dressmaking to Chomsky on Al-Jazeera.

That kind of random Noam leads me to think they already know which way I'd vote and can only reinforce that. Did that hurt Hillary? I'm going to say yes, because the slow drip of informed and reasonable content is overwhelmed by the firehose of crazy bullshit.
posted by adept256 at 4:26 AM on February 3, 2018 [9 favorites]


That kind of random Noam leads me to think they already know which way I'd vote and can only reinforce that.

It's trying to recommend content that is either similar to things you've watched, or that people with watch histories like yours have watched. From the perspective of the algorithm, which way you'd vote does not enter into it.* However, it's not hard at all to imagine (and is almost certainly the case) that the videos you've watched in the past (and things you've searched for and then clicked in the Google search results) amount to a proxy for your political opinions (and your gender and a number of other things).
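For the curious, here's a minimal sketch of that second mechanism, the "people who watched X also watched Y" scoring, with toy data (the histories and video names are invented; this is the general co-occurrence idea, not YouTube's actual code):

```python
# Count co-watches across toy viewing histories, then recommend the
# videos that most often share a viewer with the one just watched.
from collections import Counter
from itertools import combinations

# Hypothetical watch histories: user -> set of videos watched.
histories = {
    "u1": {"friends_clip", "friends_bloopers", "sitcom_top10"},
    "u2": {"friends_clip", "sitcom_top10"},
    "u3": {"friends_clip", "numberphile_primes"},
    "u4": {"numberphile_primes", "ted_talk"},
}

# How often does each ordered pair of videos share a viewer?
co_watch = Counter()
for watched in histories.values():
    for a, b in combinations(sorted(watched), 2):
        co_watch[(a, b)] += 1
        co_watch[(b, a)] += 1

def recommend(video, k=3):
    """Rank other videos by how often they co-occur with `video`."""
    scores = Counter({b: n for (a, b), n in co_watch.items() if a == video})
    return scores.most_common(k)

print(recommend("friends_clip"))
# -> [('sitcom_top10', 2), ('friends_bloopers', 1), ('numberphile_primes', 1)]
```

Notice there's no notion of politics (or of content at all) anywhere in there; whatever correlations exist in the histories simply fall out of the counting.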

I really wish it were possible to have a discussion about unintended consequences of machine learning without the sort of weird scare-mongering of the article about how we're powerless at the hands of these mysterious algorithms.

*Yes, technically, it's probably possible to predict which way you'd vote and use that as a feature, at least in the US. But that's probably not what they're doing. Gender, on the other hand, may well be a feature--it's sort of the notorious example of something that's either readily available or easy to predict, so people start using it, but it means you end up reinforcing some questionable social phenomena.
posted by hoyland at 5:33 AM on February 3, 2018 [5 favorites]


Has YouTube stopped demonetizing video game channels?
posted by Beholder at 5:39 AM on February 3, 2018


Related: The weird world of seriously nightmare-inducing kids' videos, produced en masse for no discernible motive, which have been flooding YouTube and taking advantage of the awful related-videos algorithm. (CW: That article is difficult to read at points -- none of the usual triggers, but it evokes an existential sense of dread throughout because of how needlessly screwed up it all is.)
posted by schmod at 5:43 AM on February 3, 2018 [10 favorites]


“On YouTube, fiction is outperforming reality,” Chaslot says.

In other words, talk radio. This isn't surprising in the least. If YouTube is obsessed with viewing time, and not just clicks, of course it's going to cater to conspiracies, vulgarity, exploitation, and all the usual crud that pollutes daytime talk shows, call-in radio, and reality TV. The only new aspect to any of this is the political impact. Otherwise, it's just a modern twist on "if it bleeds, it leads."
posted by Beholder at 5:57 AM on February 3, 2018 [11 favorites]


If you want to despair, do a Google search for something like "Dachau liberation" and see how many results there are before a Holocaust denial or other Nazi site.
posted by thelonius at 6:10 AM on February 3, 2018 [4 favorites]


Has YouTube stopped demonetizing video game channels?

Sort of. My videos will get tagged with the "not suitable for all advertisers" tag pretty frequently, but it seems to have less effect on the actual revenue than it did before, and the tag often gets removed within a day or two even if I don't request review. Of course, that first day or two is by a huge margin the most profitable period for most videos. (To be clear, my videos are mostly strategy games, or me and my partner playing board games together, so maybe a little boring but not the kind of stuff advertisers should be afraid of having their name next to.)

It's trying to recommend content that is either similar to things you've watched, or that people with watch histories like yours have watched.

The latter point especially is a lot of where these self-reinforcing feedback loops come from. These recommendation engines use the watching habits of other people who watch the stuff you watch as a way of avoiding the programmatically tricky (impossible?) problem of figuring out what the content is actually about. Obviously YouTube can't watch your video to know what it is or what it's about, and it can't necessarily trust the text or tags entered alongside the video, because the uploader is potentially a bad actor. So it leans very heavily on "you watched X, and many other people who watched X also watched Y, so you will want to watch Y," which creates these weird reinforcing loops, because in many cases the only reason people watched Y is that Y was the only thing YouTube was offering them. The problem with the "satisfaction" metric Google talks about in the article is that the people who are most motivated to seek out certain kinds of content end up having an outsized effect on what everybody sees as recommendations. That means the recommendation system is basically "poisoned" by the people who love things the most and the people who hate things the most, and you can go ahead and guess which of those groups is searching for reinforcement with the most frenzied vigor.
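A toy simulation makes the lock-in easy to see (names, starting counts, and click probabilities are all invented for illustration; this is nobody's real system):

```python
# Feedback-loop toy: always recommend the current watch-count leader
# and watch a one-view head start snowball into near-total dominance.
import random

random.seed(1)
videos = ["reasonable_doc", "crazy_bullshit"]
watch_counts = {"reasonable_doc": 10, "crazy_bullshit": 11}  # near-tie at the start

for _ in range(1000):
    recommended = max(watch_counts, key=watch_counts.get)  # push the leader
    # Say 90% of viewers click what's recommended; 10% browse on their own.
    clicked = recommended if random.random() < 0.9 else random.choice(videos)
    watch_counts[clicked] += 1

print(watch_counts)
# With these made-up numbers, the early leader soaks up roughly 95% of
# the new watches, for no reason except that it kept being recommended.
```

Run it a few times without the seed and whichever video happens to be ahead early almost always runs away with it -- "the only thing YouTube was offering them," in miniature.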
(As a side note, YouTube is now trying a little bit to figure out the content of videos programmatically by parsing the text of the automated speech-to-text subtitles, but unfortunately the engine that creates those subtitles still does a pretty poor job, so they don't have good data to work with. I'm skeptical of whether this system would actually be very helpful in fighting misinformation, though, since misinformation can be very complicated to identify, or whether YouTube would even attempt to use it in a helpful way in the first place.)
I don't know if the problem of machine-learning content recommendation is actually impossible, but I do know that our current solutions are making a lot of real-world problems much worse. I would say that there need to be more people involved in the process, but they'd have to be carefully vetted so they aren't people who would intentionally create the kind of effect that is right now being created accidentally (a kind of vetting we have evidence Google is not good at), and that's exacerbated by the fact that there would have to be a LOT of them, because YouTube's volume problem is real.
posted by IAmUnaware at 6:23 AM on February 3, 2018 [7 favorites]


I'm a YouTube addict. For about three weeks last month, I swear it had finally figured me out. It was only slightly frightening; mostly I was just about prepared to submit to the will of the newly sentient Tube if it came to that. It consistently recommended interesting things that were in the same vein as stuff that I like, but not exactly the same as what I just watched. For instance, it knows I sometimes enjoy ridiculous British comedy panel shows, and it actually managed to introduce me to several I hadn't seen before. Not that all of them were good, but I was at least glad to have the opportunity to try watching a great many of the things it recommended. Often there would be something delightfully unexpected standing out in the suggested links, such as this video from a train going across Norway set to Moby.

Then one day I clicked on one too many very popular cat videos or something, and it went instantly back to its old ways. It suggests I probably want to watch more of exactly what I just saw, or else: hey, you like popular things, right? Well then you'll love this: a video of an atheist explaining why he's right and everyone who disagrees is stupid.
posted by sfenders at 6:28 AM on February 3, 2018 [8 favorites]


I see the problem as YouTube giving people what they want, to the extent that is a problem. Clickbait, bullshit conspiracies, gossip, etc. keep people coming back for more. And because of what they clicked, they get more.

It's an indication that consumers are still woefully ignorant about this newfangled internet stuff. Basically, a whole lotta computer users are still in the stage where the supermarket tabloid in streaming-video form counts as legitimate news. Maybe they'll never get out of that stage.

I don't know how you de-stupidify people who want to consume shit. Identifying video funded by governments? I don't see how this can possibly deter any government determined to fuck with Youtube bullshit consumption.

My YouTube feed is all non-political stuff that I'm into. The only time anything political pops up is when there's a big news event, and I get several recommendations of the same event under "News" from generally legitimate news sources. Even though I always click "Not interested" on news recommendations, they curiously seem to persist, because YouTube must think I Need To Know. The more puzzling thing I find is how often the AI misses with its recommendations, overlooking videos that are clearly up my alley, by the same users, on the same topics.
posted by 2N2222 at 6:45 AM on February 3, 2018 [5 favorites]


I don't know how you de-stupidify people who want to consume shit.

I suspect that there's something more here -- the internet has radically changed how we consume information, and we haven't developed systems to deal with that. It's like people have gossiped since forever, but in the 17th C in Europe, cheaper printing codified gossip into news, which changed how people consumed that information. And there was a good 200 years or more of changes in journalism which standardized delivery methods and evaluation criteria, and then the internet poked a hole in it, and we may have another couple of centuries to figure out how to navigate this.

Or, for another metaphor, it took cultures a long time to learn how to deal with drugs, alcohol, tobacco, etc., and memes and internet info are another drug, and they are going to kill a lot of people until society (and maybe the gene pool) adapts to the new situation. It's not stupidity, necessarily.
posted by GenjiandProust at 7:07 AM on February 3, 2018 [11 favorites]


My YouTube feed is all synth demos (particularly Eurorack). Every once in a while it tries something like basketball or fishing or hip-hop or a specific 80s band, or something else I have very little interest in, just to see what happens. Sometimes it's recognizably related to a video I watched once; usually not.
posted by Foosnark at 7:09 AM on February 3, 2018 [1 favorite]


Mine recommends TED talks, numberphile, robotics, Simone Giertz, music.

Mostly episodes of Time Team and country music here. A sprinkling of documentary and woodworking videos, too. My kids get video games and grime or video games and classical music (depending on the child); my dad gets cars crashing into things in Russia. This appears to be an algorithm that's actually working; people are being offered a subset that they're likely to select from. So what's the alternative? What would serve people's needs better? Because YouTube's video archive is growing at 300 hours a minute, and I can't see a better way to navigate that than to have the system push the things you are most likely to watch to the top of the pile.
posted by Leon at 7:40 AM on February 3, 2018


“On YouTube, fiction is outperforming reality,” Chaslot says.

In other words, talk radio.


One of the things that suggested videos do is keep the flow going. I don't know if my use patterns are typical, but go back a few years and my interactions with YouTube were one-and-done. It wasn't the sort of thing that, autoplay or not, I'd browse for hours, which I find myself doing on some lazy evenings now (panel shows, cartoon clips, etc.).

Lots of people watch TV that way--as background noise--but that's basically what radio is designed to be. At some point, radio became the thing you could do while doing something else--driving, working a service job, etc.--and it can be engineered to proceed smoothly and constantly, over the course of hours and hours. I used to think that that was because it could be background and not pull attention, but now I'm not sure.
posted by pykrete jungle at 7:43 AM on February 3, 2018 [3 favorites]


I suspect that there's something more here -- the internet has radically changed how we consume information, and we haven't developed systems to deal with that.

Visual input is important to humans. Or at least enough research says so that Facebook and the *chans of the world push taking simple text and either making it into a whole picture or putting it as a border around some other picture.

Visual input seems important enough that people make YouTube channels out of READING you what they are seeing on a web page. No commentary, no analysis. Just reading.

Trusting your eyes will be a hard habit for humans to break, if it is possible at all.
posted by rough ashlar at 8:05 AM on February 3, 2018 [4 favorites]


Many months ago, I watched a video of a Steven Pinker lecture. I guess he's become a darling of MRA and Dawkins/Harris-esque atheist types in recent times, and ever since, my recommendations have been peppered with "WHY FEMINISTS ARE WRONG ABOUT SCIENCE AND BAD AT MATH" type videos.
posted by treepour at 8:30 AM on February 3, 2018 [5 favorites]


Wow, there’s some important stuff to be dug into here, but this article really is the worst possible way to do it. Is the Guardian really gonna try and hang an entire piece on “we talked to a guy who worked at youtube for three months, five years ago, and then got fired for performance issues”?
posted by Itaxpica at 8:57 AM on February 3, 2018 [2 favorites]


Have largely avoided the Echo Chamber by keeping most of my searches technical and arcane.

But that one time I searched a particularly popular Oprah Book of the Month Selection (because I'd just met the author) ... damn, that search-pollution took months to die down, and most of it was for scam counterfeit knock-offs.

Although I still sometimes get these weird sites that seem to have been nonsensically created on the fly out of my search terms.
posted by StickyCarpet at 8:59 AM on February 3, 2018 [2 favorites]


Clear your YouTube play history and search history. Once you see what YouTube recommends to a blank slate, it’s no wonder the whole place is fucked.
posted by Sys Rq at 9:01 AM on February 3, 2018 [10 favorites]


The sample we had looked at suggested Chaslot’s conclusion was correct: YouTube was six times more likely to recommend videos that aided Trump than his adversary.

Though the obvious follow-up question is whether there are six times more pro-Trump videos uploaded to the site. I'm not saying that for sure the correct behaviour is to recommend them just because they're there, but I wonder. Like, is there a certain bias in the sort of person who uploads a video to YouTube to support their political position? The article does later quote some folks who say their Clinton-related videos were surprisingly popular compared to most of their videos. Or the bit with John Kelly (no relation) where he says the videos were promoted by Twitter bots he was already suspicious of.

The bit about videos being taken down by their uploaders is interesting, too. Like, three million people saw this video, and there's no longer any record of what it was?

I admit I kind of want more follow-up with the "part-time conspiracy theorists." "Most of your videos had a handful of viewers, yet your video about Clinton suddenly gets millions of views. How do you explain that? Coincidence?"
posted by RobotHero at 9:44 AM on February 3, 2018 [2 favorites]


This may be just a language issue, but the internet doesn't KNOW what I WANT.
posted by njohnson23 at 10:24 AM on February 3, 2018 [4 favorites]


Hm. Besides the videos I make, I mostly use YouTube for music, and the recommendations are decent. They are often iterations of stuff I've listened to before (different Wagner performances, death metal covers), or reasonable pointers to related sounds, not to mention some straight up repetitions. The "My Mix" is a condensed version of this.
posted by doctornemo at 11:17 AM on February 3, 2018


Has anyone else noticed the really strong correlation between someone peddling a nonsense idea and having only youtube videos as support?

I think it's because being persuasive is orthogonal to being correct, and video is more easily persuasive than writing.
posted by flaterik at 12:34 PM on February 3, 2018 [7 favorites]


Related: The weird world of seriously nightmare-inducing kids' videos, produced en masse for no discernible motive, which have been flooding YouTube and taking advantage of the awful related-videos algorithm.

It's so hard for me to find a place to stand on this whole thing, because for a fucking year at least this was batted away by nearly everyone on the planet as a "conspiracy theory." It's been called "Elsagate," for fuck's sake. Part of the reason conspiracies are being promoted is that we have a world that wants to dismiss anything that doesn't immediately fit in the picture, and people making weird, surreal, dark videos that are aimed at children was something people couldn't fit in their picture, so it took the shit hitting a critical fucking mass of people noticing it before it stopped being called a "conspiracy theory" and journalists started actually writing about it.

Thus the entire problem with knee-jerk labeling anything a "conspiracy theory": sometimes you're dismissing legitimate issues because they don't fit your preconceived notions about the damn world. YouTube isn't deleting these videos en masse over nothing. Something fucky was and is going on with this, and plenty of people wanted to just brush it away as a "nothingburger" for a long fucking time.
posted by deadaluspark at 1:25 PM on February 3, 2018 [4 favorites]


Teegeeack, I also noticed it a lot during the weird resurgence of flat earthers, and the other day with a BTC True Believer who just needed to "find some good tubes" to convince us all that he was right and we were ignorant for not believing him.
posted by flaterik at 1:37 PM on February 3, 2018 [2 favorites]


strong correlation between someone peddling a nonsense idea and having only youtube videos as support?

Yeah, and that's why I'm wondering: how much more likely is it to show you a pro-Trump video because pro-Trump people are the sorts of people who make YouTube videos?
posted by RobotHero at 1:48 PM on February 3, 2018


METAFILTER: so even though, as this article discusses, the data shows the exact opposite, people still think
posted by philip-random at 1:53 PM on February 3, 2018 [1 favorite]


Is the Guardian really gonna try and hang an entire piece on “we talked to a guy who worked at youtube for three months, five years ago, and then got fired for performance issues”?

It's not just "here's what I saw when I worked there." TFA details how the engineer measured the algorithm, using software that a third party said was a straightforward way to do it.

I found it telling that Google was all "No puppet! No puppet! The Guardian's the puppet!" until the Senate hearing. Then they were all "Yeah, we did that."
posted by Johnny Wallflower at 2:50 PM on February 3, 2018 [1 favorite]



I suspect that there's something more here -- the internet has radically changed how we consume information, and we haven't developed systems to deal with that. It's like people have gossiped since forever, but in the 17th C in Europe, cheaper printing codified gossip into news, which changed how people consumed that information. And there was a good 200 years or more of changes in journalism which standardized delivery methods and evaluation criteria, and then the internet poked a hole in it, and we may have another couple of centuries to figure out how to navigate this.


That's exactly my point. YouTube is the new checkout-stand paper rack. Many people who are immune to the Weekly World News Bat Boy headlines are all gee-whiz over this newfangled internet video stuff. Now, it may not be stupidity. But when the result is indistinguishable from stupidity, I'm OK with calling it just that.

It's so hard for me to find a place to stand on this whole thing, because for a fucking year at least this was batted away by nearly everyone on the planet as a "conspiracy theory." It's been called "Elsagate," for fuck's sake. Part of the reason conspiracies are being promoted is that we have a world that wants to dismiss anything that doesn't immediately fit in the picture, and people making weird, surreal, dark videos that are aimed at children was something people couldn't fit in their picture, so it took the shit hitting a critical fucking mass of people noticing it before it stopped being called a "conspiracy theory" and journalists started actually writing about it.

Indeed. It's difficult because something like "Elsagate" really is a nothingburger as far as I can tell. The "you" in YouTube means there's a conspiracy for you and everyone else's tastes, I guess.

Has anyone else noticed the really strong correlation between someone peddling a nonsense idea and having only youtube videos as support?

I have noticed that. I like following some quack and loon trends, and YouTube is the modern version of the third-generation xeroxed typewritten newsletter tract that I used to come across back in the day. In this age, when computers, phones, and cameras are soooo cheap and video is an extremely user-friendly medium to distribute, anyone can be a news source. This has been hashed out many a time here and elsewhere. YouTube is a blessing. The bad thing about YouTube is that it's not really in a position to vet and curate all the videos for accuracy and offensiveness for everyone. The good thing about YouTube is that it's not really in a position to vet and curate all the videos for everyone.

It's a pity; YouTube videos are not nearly as fun to collect as leaflets and newsletters.
posted by 2N2222 at 5:24 PM on February 3, 2018 [2 favorites]


Identifying video funded by governments? I don't see how this can possibly deter any government determined to fuck with Youtube bullshit consumption.

I also remain quite unconvinced that video funded by governments accounts for as much of a share of the issue as Americans organically convincing themselves and others of nutty things.
posted by atoxyl at 6:15 PM on February 3, 2018 [2 favorites]


But I've talked before about how I do find YouTube to be the scariest of the social media sites: it's very much dominated by the Right and frequently used by children!

Why it's dominated by the Right I have only half-formed hypotheses about, but I definitely think it has something to do with what kind of people are able to engage for an extended period with guys filming themselves ranting in cars, you know?
posted by atoxyl at 6:20 PM on February 3, 2018 [2 favorites]


"Has anyone else noticed the really strong correlation between someone peddling a nonsense idea and having only youtube videos as support?"
NO ACTUALLY EINSTEIN KNEW THIS TOO
posted by I'm always feeling, Blue at 7:13 PM on February 3, 2018 [1 favorite]


I've actually started permitting, or at least stopped avoiding, YT recommendations in recent years. The weird, frustrating part is how heavily it re-recommends videos or channels I've already watched. I suspect that has to be related to the fact that I use it heavily for music listening (i.e., dog-returning-to-its-own-vomit-style content), yet even so it's bloody annoying. The very few clicks I do give to suggested content are no doubt directly in line with what the algorithms thought I'd do when they snooped upon my email anyway, so whatayagunnado ¯\_(ツ)_/¯
posted by I'm always feeling, Blue at 7:22 PM on February 3, 2018 [1 favorite]


And because I feel zero shame at commenting three whole times in a row, or at quoting myself:

Metafilter: NO ACTUALLY EINSTEIN KNEW THIS TOO
posted by I'm always feeling, Blue at 7:25 PM on February 3, 2018


Paul Ewald points out in The Evolution of Infectious Disease that the faster a pathogen can jump from one host to the next (i.e., the more contagious it is), the more virulent it can get away with being.

And since the Internet offers faster transmission of ideas than we've ever seen before, it's more rife with pernicious ideas than anything we've seen before.
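A rough way to see why, sketched from the standard trade-off model rather than quoted from Ewald: in a simple SIR-style model, a pathogen keeps spreading as long as its basic reproduction number stays above one,

R₀ = β / (γ + μ + α),

where β is the transmission rate, γ the recovery rate, μ background host mortality, and α the extra mortality the pathogen inflicts (its virulence). Push β up, which is what the Internet does for ideas, and R₀ stays above one even at a much higher α.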
posted by jamjam at 7:31 PM on February 3, 2018 [8 favorites]


I don't know how well this would work for everyone, but a couple of years ago logistics demanded that I hang out in front of the TV, and social media on my smartphone wasn't enough, so I really started using YT for the first time -- not just dipping into one-and-done incidental watching but subscribing to a lot of channels -- Casey Neistat, Mark Kermode, Chris Stuckmann are all favorites, but with a lot of "weird fact of the day" examples (Tom Scott, etc.) and various techy-sciencey channels. It's been a pretty happy experience overall in terms of the recommendations I get, although they come largely from the channels I have already watched, with the more intuitive things from my channels' friends showing up less often than I'd like. The biggest problem was that for a long time, every time I'd watch Anita Sarkeesian, I'd end up with several videos recommended that were from her critics, especially S****n of A***d, and I had to be really careful to avoid watching them, because then my reccos would be seriously fscked for days afterward.

Lately I was annoyed that the app's recommended tabs changed, eliminating LIVE and one other -- TECH or SCIENCE maybe -- and replacing those with BEAUTY and something else pretty trivial from my viewpoint.

I've also been clearly affected in this channel ecosystem by decreased content production due to the adpocalypse -- though the top-shelf brand ads seem to be gradually returning to some places where I had missed them. I'm good about sitting through ads for channels I really like, particularly the one-person-shop type.

The one complaint I have regularly is that -- well, first, that the app no longer lets me browse my own subscribed content in an organized way, but also that the suggested ersatz "channels" like breaking news are usually all just one thing, say the Nunes Memo, and all from at least six to twelve hours back, which is dubious to label "breaking" anyway. I have a few searches I use, but the app on my device isn't very easy to use that way. Ultimately I guess a split between low user-led discoverability (what I have now) and high, weird af discoverability (stuff described above) doesn't leave much middle ground or offer the obvious directability you'd want -- features like the ones the web version of Google News offers, to use an actual in-house example.
posted by dhartung at 12:05 AM on February 4, 2018 [1 favorite]


Labeling videos made by governments is interesting, considering so many people shared videos by RT that were obvious propaganda, knowing they came from the Russian government, and did not care at all. I fought with loads of people peddling pro-Assad propaganda on RT -- lots of stuff about how certain Syrian activists didn't exist, how aid organizations were the real terrorists, etc. It was ridiculous. My favorite was one involving a "Canadian journalist" in a courtroom "destroying" a "mainstream reporter" over the Assad regime (or something along those lines). When I went and looked up who this "Canadian journalist" was, it was just somebody who had a blog at one point in time, along with some articles posted on random websites, none of which were significant news sources.

People simply don't care. You can tell somebody that a video is coming with an obvious slant from the Russian government and they won't see what is wrong with that. It's their underlying beliefs that are the "issue" here, along with the lack of critical thinking skills and investigative abilities. When I see sketchy things online I go and look them up and try to find as much information about them as possible, from legitimate-looking sources. I apparently have a preternatural sense of what is bullshit and what is not, so I can tell which websites I should trust. When it comes to propaganda videos, especially ones created by the Russian government to push a pro-Assad message, created in the same vein as popular meme videos (like what you'd sometimes find on The Dodo), I do some research and figure out what the deal is. I guess other people just don't do this.
posted by gucci mane at 4:05 PM on February 4, 2018 [3 favorites]


I remember that Canadian "journalist" video.

The problem is that a lot of people view not having any relationship to a major news source as a *positive*.
posted by flaterik at 6:02 PM on February 5, 2018




This thread has been archived and is closed to new comments