Peppa Pig Posts To MeFi
November 6, 2017 11:21 AM   Subscribe

James Bridle: Something is wrong on the internet
Someone or something or some combination of people and things is using YouTube to systematically frighten, traumatise, and abuse children, automatically and at scale, and it forces me to question my own beliefs about the internet, at every level.
posted by prefpara (9 comments total)

This post was deleted for the following reason: This does seem pretty closely related to this thread from a couple days ago; how about we let it be consolidated in there. -- cortex

We've somehow created a world in which "it's the algorithm" is a magic phrase that can be used to absolve one of any personal or corporate responsibility for one's actions, especially when paired with "we're making money." We don't accept that answer as an excuse from human intelligence (or we shouldn't, anyway), and it shouldn't suffice for artificial intelligence either.
posted by zachlipton at 11:29 AM on November 6, 2017 [2 favorites]




Everything is terrible, but I'm somehow tickled by the idea that, without any conscious decision on the part of a human, we've built a world where an automated system has been incentivized to show Peppa Pig dental torture videos to small children.

This fucking century.
posted by figurant at 11:33 AM on November 6, 2017 [3 favorites]


zachlipton: "It's the algorithm" is the new version of "The computer says no." A bullshit excuse used to deflect blame and avoid having to do actual work. What makes "It's the algorithm" more insidious is that it's spouted by the people who make the algorithms, while "The computer says no" is usually just some schlub in a customer service role with no ability to make changes.
posted by SansPoint at 11:35 AM on November 6, 2017 [3 favorites]


Couple this with the fact that Sting told us they love their children too, and the most recent revelations on all sorts of shady money, and one foresees another level of evolution for the interconnectivity networks and whatever future version of the WWW that will support. As in within a year or two.
posted by infini at 11:41 AM on November 6, 2017


However, a huge part of my troubled response to this issue is that I have no idea how they can respond without shutting down the service itself, and most systems which resemble it. We have built a world which operates at scale, where human oversight is simply impossible, and no manner of inhuman oversight will counter most of the examples I’ve used in this essay.

I feel like the only workable solution to this is to rely on a gatekeeper model, where only hand-picked content gets in. Which means more corporate content. That depresses me.
posted by capricorn at 11:47 AM on November 6, 2017


otoh, it might give rise to a whole new category of service - hand curated artisanal feeds
posted by infini at 11:49 AM on November 6, 2017


are there actual traumatised kids here or is that question being begged? kids are a lot stronger and weirder than they're given credit for.
posted by Sebmojo at 11:52 AM on November 6, 2017 [2 favorites]


Sebmojo: Some kids can bounce back from seeing traumatizing stuff, but there's a bigger issue at play here. If there are enough people/bots able to exploit YouTube's algorithm on this scale, what else could they be doing? It's already known that YouTube's algorithm has tied video game-related videos with white supremacist videos, so a kid, or an adult, watching someone play Minecraft can very easily slip into videos promoting literal Nazism. Their algorithm is being gamed, hard, and nobody wants to bother fixing it.
posted by SansPoint at 12:05 PM on November 6, 2017 [1 favorite]




This thread has been archived and is closed to new comments