Not Advertiser Friendly
August 29, 2017 5:17 PM

 
I realize it's easier said than done, but it really can't be said enough:

Don't base your livelihood on a platform that someone else controls.
posted by Mr.Encyclopedia at 5:24 PM on August 29, 2017 [24 favorites]


I've noticed YT is getting more aggressive with their ads too. The internet is growing up, no more free stuff.
posted by Bee'sWing at 5:26 PM on August 29, 2017 [1 favorite]


Is there a single article on this I can read, instead of twenty plus YouTube videos?
posted by mephron at 5:30 PM on August 29, 2017 [74 favorites]


Welp, the old adage 'know your customer' never stopped applying, and I imagine most Youtubers have no idea who their customers are other than 'Google?' or 'viewers.'
posted by pwnguin at 5:30 PM on August 29, 2017 [1 favorite]


Given how worthless Google search has become, this development surprises me not one bit.
posted by Thorzdad at 5:38 PM on August 29, 2017 [13 favorites]


Part of the problem is Google's inability to communicate what is going on and why, especially when it comes to specific videos being demonetized. But these articles should provide you with a background:

YouTube ‘demonetization,’ explained for normals

‘I can’t trust YouTube any more’: creators speak out in Google advertising row

New icons are YouTube’s latest way to alert creators of video demonetization
posted by Foci for Analysis at 5:38 PM on August 29, 2017 [5 favorites]


I submit that many of the YouTubers with large subscription bases and plenty of likes and views etc. know their customers very well, which is why they tend to hew to their shtick.
posted by turbid dahlia at 5:38 PM on August 29, 2017 [4 favorites]


Der Wienerschnitzel needs to update their twitter logo, dammit
posted by NoMich at 5:40 PM on August 29, 2017


Mr.Encyclopedia: Don't base your livelihood on a platform that someone else controls.

I'm not sure how any of us could do this, no matter what job we have. A person busking on the sidewalk can be kicked off by a bylaw officer; Bill Gates' platform is Intel. For every one of us, somebody else controls something that's critical to our livelihood. That's just how it is. Like John Donne said, "No man is an island."
posted by clawsoon at 5:54 PM on August 29, 2017 [69 favorites]


This is a consequence of trying to kick white supremacists and other alt-right figures off YouTube. Google hates using actual people to screen video (and given the volume, it's pretty unlikely they could), and the outcry led advertisers to pull their ads from YouTube.
posted by zabuni at 6:06 PM on August 29, 2017 [4 favorites]


Are they actually managing to block Nazis or are they, as I would expect, managing to algorithm a bunch of other content out of existence while leaving fascists untouched?

(are they Twitter?, basically)
posted by Artw at 6:10 PM on August 29, 2017 [11 favorites]


zabuni: given the volume, it's pretty unlikely they could use actual people

That's an argument I'm increasingly less convinced by. They made $20 billion in profit last year. They could hire a lot of content reviewers with some of that money if they wanted to or were forced to.
posted by clawsoon at 6:14 PM on August 29, 2017 [17 favorites]


My perspective on this comes from being a high school teacher, and in the past three years I've had an extraordinary number of students tell me their post-high-school career plan was to start a channel where they would talk about video games or cards or some shit and make millions. Does this mean now they should probably do their homework?
posted by yes I said yes I will Yes at 6:14 PM on August 29, 2017 [52 favorites]


They could hire a lot of content reviewers with some of that money

How many people do you have to hire to screen a billion videos?
posted by straight at 6:18 PM on August 29, 2017 [4 favorites]


Not only a billion videos, but a billion hour long videos. This is not something throwing humans at will solve.
posted by 922257033c4a0f3cecdbd819a46d626999d1af4a at 6:22 PM on August 29, 2017 [5 favorites]


straight: How many people do you have to hire to screen a billion videos?

If they used half their profit, that works out to $10 per video. I suspect that you could find more than enough screeners at that rate of pay.
posted by clawsoon at 6:22 PM on August 29, 2017 [6 favorites]


When your gigantic company buys saturation bulk ads, you don't get to dictate where those ads appear.

If you want your ads in certain content, but not other certain content, just target buys.

"Advertiser Friendly" has never been a thing in bulk buys....
posted by CrowGoat at 6:27 PM on August 29, 2017 [1 favorite]


And how many managers do you have to hire to train and supervise that many screeners? How many middle-managers do you have to hire to train and supervise that army of managers? I'm asking earnestly -- my knee-jerk guess is that this would be a project of unprecedented size. Or are there other companies that have hired enough people to review a billion documents?
posted by straight at 6:28 PM on August 29, 2017 [4 favorites]


I've noticed YT is getting more aggressive with their ads too. The internet is growing up, no more free stuff.
posted by Bee'sWing at 5:26 PM on August 29


the opposite. the internet is regressing, and becoming like television.
posted by eustatic at 6:37 PM on August 29, 2017 [16 favorites]


One of the most frustrating parts of working with Google is that there is never, under any circumstances, a human being you can talk to about anything.

A bit back I was putting CC-BY music from the FMA behind my videos. My videos, in turn, were licensed CC-BY. Nothing was monetized. Suddenly, one day, I discover that ads are appearing in front of some of my videos. It turns out there are a bunch of firms that essentially scam artists into enforcing rights for them, and then turn around and just scan youtube for the music and report every instance as a violation, regardless of whether it's being used properly.

What youtube does in these cases is they automatically change the license that you have put on your video, turn on ads, and funnel the ad revenue to the reporting firm. All without notification or consent.

Of course I could fight this, but while I think Brazil is an amazing movie I don't really want to do it as cosplay. So instead I just blanked the audio track and reverted the license. Now my videos are silent, or use public domain soundtracks, or I autogenerate a soundtrack in bash, or some bullshit.

The point is that Google does shit like this constantly, and it's not because there's no other way to do it. It's because Google has an internal culture that strongly believes that legal and ethical systems can be encoded and automated. Having a valid need to discuss an issue with an actual person, from that perspective, is an admission of failure. And boy howdy do those folks have issues with failure.
posted by phooky at 6:40 PM on August 29, 2017 [88 favorites]


the opposite. the internet is regressing, and becoming like television.

The internet is not YouTube. The internet is not Google.

Big names are big, but they are not all.
posted by JHarris at 6:43 PM on August 29, 2017 [12 favorites]


A more exact calculation: 300 hours of video are uploaded to YouTube every minute. That's an 18,000:1 ratio. Let's say you have 4 six-hour shifts; if I'm doing the math correctly (and please correct me if I'm not), that's a workforce of about 80,000 to watch each and every minute of newly uploaded video.

Google already has 70,000 employees. Walmart and McDonalds both have over 2 million employees. Foxconn has a single factory complex with somewhere between 200,000 and 400,000 employees. An 80,000-person workforce is not ridiculous or impossible. Double that so that they can cover the backlog; still not ridiculous.

And it's already happening. Right now, video content reviewers are mostly tasked with reviewing the most horrible videos that humanity has to offer up, so there's a high turnover rate. But if the workforce was increased so that all videos could be covered, most of the reviewers would spend most of their time watching disembodied hands opening Kinder Eggs and the like. Whether that would increase retention rates... hmm. Dunno.
posted by clawsoon at 6:44 PM on August 29, 2017 [11 favorites]
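The arithmetic in that comment can be checked with a quick script. The 300-hours-per-minute figure and the four-shift structure are taken from the comment; everything else is plain arithmetic, so treat this as a sketch of the estimate rather than an authoritative staffing model:

```python
# Back-of-envelope check of the reviewer-workforce estimate above.
# Assumes 300 hours of video uploaded per minute (the figure cited
# in the thread) and reviewers watching at 1x speed.

UPLOAD_HOURS_PER_MINUTE = 300

# 300 hours = 18,000 minutes uploaded per minute of wall-clock time,
# so 18,000 reviewers must be watching at any given instant.
concurrent_reviewers = UPLOAD_HOURS_PER_MINUTE * 60  # 18,000

# Four six-hour shifts cover a 24-hour day.
shifts_per_day = 4
total_workforce = concurrent_reviewers * shifts_per_day  # 72,000

print(concurrent_reviewers)  # 18000
print(total_workforce)       # 72000, i.e. "about 80,000" as estimated
```

The 72,000 result rounds to the "about 80,000" in the comment once you allow any slack for breaks, sick days, and turnover.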


Moving fucking targets. Mix a cloaked algorithm that no one understands (and is protected IP) with the fickle tweaks the wetware make and you get one gigantic question mark on what actually works, what's appropriate and what is going to get you banished to the lower rung of hell.

Just one more thing you FTEs can thumb your noses at until you get shitcanned and have to depend on the gig economy to scrape by. The next 10 years are going to be full of schadenfreude for those of us who have been getting fucked for the last 15 years by Google, Craigslist and the online ad networks, as the bourgeois fucks struggle to pay their mortgages with clicks.
posted by photoslob at 6:45 PM on August 29, 2017 [4 favorites]


Apparently 300 hours of video are uploaded every minute. That means 225 reviewers needed every minute, or over 32,000 per day. Assuming a nominal $10/hr wage, that's near $1.2B p.a. in gross alone. Call it $2.5 to 3B p.a. to actually do it in a real way.

YouTube payout to creators was around $1B in the same year (see the previous link).

So full-time human monitors might cost anywhere from 100% to 300% of what YouTube pays for the content.
posted by bonehead at 6:46 PM on August 29, 2017 [4 favorites]
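The thread's two estimates use different reviewer counts; as a rough cross-check, here is a version built on the 18,000-concurrent-reviewer figure implied by the 300-hours-per-minute comment, with the $10/hr wage assumption carried over. The doubling factor for overhead is my own illustrative assumption:

```python
# Rough annual wage bill for full human review, using the thread's
# assumptions: 18,000 reviewers watching at any instant, $10/hr.
CONCURRENT_REVIEWERS = 18_000
HOURLY_WAGE = 10
HOURS_PER_YEAR = 24 * 365

gross_wages = CONCURRENT_REVIEWERS * HOURLY_WAGE * HOURS_PER_YEAR
print(f"${gross_wages / 1e9:.2f}B")  # $1.58B in raw wages

# Doubling for management, benefits, facilities, and turnover (an
# illustrative assumption) lands in the $2.5-3B range estimated above.
fully_loaded = gross_wages * 2
print(f"${fully_loaded / 1e9:.2f}B")  # $3.15B
```

Either way the conclusion holds: fully human review would cost on the same order as, or more than, what YouTube reportedly paid out to creators.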


They could just try looking at the Nazis? Mad random suggestion I know.
posted by Artw at 6:49 PM on August 29, 2017 [8 favorites]


This is not something throwing humans at will solve.

Apparently, this is also not something devs throwing nifty algorithms at will solve, either.
posted by Thorzdad at 6:54 PM on August 29, 2017 [16 favorites]


I like the framing of the post, BTW. Perfect, given the topic and platform.
posted by clawsoon at 6:59 PM on August 29, 2017 [4 favorites]


phooky: One of the most frustrating parts of working with Google is that there is never, under any circumstances, a human being you can talk to about anything.

I must offer a slight correction. Apologies in advance if you were deploying hyperbole.

When one subscribes to Google Apps for Business -- or whatever they're calling it these days -- you can call them up (and I have) to get genuine human help with any and every Google product currently in the wild. The help is rapid, courteous and useful.

So, the circumstance under which Google will offer you human support is uncomplicated: it's when you pay them.

It's not that Google is a weird futuristic company that forgot what customer service is; it may be, I suggest without malice, that you're a weird futuristic consumer who forgot what a free sample is.
posted by Construction Concern at 7:05 PM on August 29, 2017 [47 favorites]


Some of the linked videos make an interesting point: By the words of the Youtube policy, a whole lot of extremely popular sexually suggestive/explicit music videos by major pop stars are "not advertiser friendly." Somehow, though, it seems unlikely that Youtube will demonetize the Justin Biebers and Nicki Minajs, despite their policy being clear on that point.
posted by clawsoon at 7:17 PM on August 29, 2017 [8 favorites]


Apparently 300 hours of video are uploaded every minute. That means 225 reviewers needed every minute, or over 32,000 per day. Assuming a nominal $10/hr wage, that's near $1.2B p.a. in gross alone. Call it $2.5 to 3B p.a. to actually do it in a real way.

This whole analysis is focusing on the wrong thing. Having people review every minute of every video is overkill. What Google needs is a much smaller team of people available to deal with problems as they come up. They don't have to throw away their algorithms, just augment them.
posted by panic at 7:19 PM on August 29, 2017 [20 favorites]


Construction Concern: If so, that's an improvement; we used Google Apps for our business six or seven years ago and I remember it being a bit of a mess. (In particular, there was no support for organizational structures, and you were either an admin or not, which meant that if you gave your IT person the ability to provision email for a new hire they could also, say, read the CEO's mail.) To be fair, we never tried to actually call them, but I don't think in the 21st century feature requests should be submitted via voice, or telegraph for that matter
posted by phooky at 7:22 PM on August 29, 2017 [1 favorite]


I mean, honestly, you want some YouTube Nazis to look at, Google? Give me half an hour and I'll find you a dozen, easy.
posted by Artw at 7:28 PM on August 29, 2017 [3 favorites]


panic: agreed, they don't need to review everything.

Better yet: set up something like the $5 Metafilter sign-up fee, only better:

If you don't want to monetize your content, you are exempt. Continue as you are.

If you DO want to monetize your content, you agree to a TOS. In it, you click checkboxes for specific items that are, as of the date of signing, forbidden, so you can indicate whether your content is sexually suggestive, promotes drug use, or includes horrible racist slurs. At the same time, you provide a credit card and agree to pay $LargeAmountOfMoney if it turns out you were lying.

If you claim that your content has no horrible racist slurs, but then someone flags it, and a human reviewing it decides that it DOES have horrible racist slurs, you then pay $LargeAmountOfMoney, which goes to cover the cost of having humans review flagged videos.

Yeah, there would be people who wouldn't be able to pay $LargeAmountOfMoney, but surely a system like this would help. Some. At least a little.
posted by kristi at 7:32 PM on August 29, 2017 [6 favorites]
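kristi's self-certification scheme can be sketched as a simple penalty rule. Everything here is a placeholder for illustration (the category names, the deposit amount, the function name); it reflects the proposal in the comment, not any actual YouTube mechanism:

```python
# Sketch of the self-certification scheme proposed above: creators who
# monetize declare their content's categories up front and post a
# deposit; a confirmed false declaration forfeits it.

DEPOSIT = 500  # stands in for "$LargeAmountOfMoney"

def resolve_flag(declaration: set, reviewer_finding: set) -> int:
    """Return the penalty owed after a human reviews a flagged video."""
    undeclared = reviewer_finding - declaration
    return DEPOSIT if undeclared else 0

# Creator declared "suggestive" but a reviewer also found slurs:
print(resolve_flag({"suggestive"}, {"suggestive", "racist_slurs"}))  # 500
# Declaration matched the finding: no penalty.
print(resolve_flag({"suggestive"}, {"suggestive"}))  # 0
```

The forfeited deposits funding the human review queue is the part that makes the scheme self-sustaining, at least on paper.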


It turns out there are a bunch of firms that essentially scam artists into enforcing rights for them, and then turn around and just scan youtube for the music and report every instance as a violation, regardless of whether it's being used properly.

I remember when Youtube was just getting traction, some guy demonstrated composing an original soundtrack for his video, and having the infringement detection bots take it down at once. It wasn't even a song, it was just meandering between some chords on the piano.
posted by thelonius at 7:32 PM on August 29, 2017 [6 favorites]


Are they actually managing to block Nazis or are they, as I would expect, managing to algorithm a bunch of other content out of existence while leaving fascists untouched?

Well, I don't follow any Nazis, so I can't say whether this is affecting them or not. I know for a fact it's affecting several of the very-much-non-fascist channels I subscribe to, though.

(My personal response to this has been to make sure I'm contributing to the Patreons of the channels I watch regularly. YouTube Red is also a thing, apparently? That does monetisation without ads? But Patreon is way more direct, so I just do that.)
posted by tobascodagama at 8:15 PM on August 29, 2017 [1 favorite]


Google is a global company. It can't really have and enforce standards based upon what American liberals in 2017 consider to be beyond the pale, especially considering what a fast moving target that has become. Content that valorizes the local majority and their political/cultural ascendancy is utterly common and non-controversial in the vast majority of the world -- pretty hard to censor the alt-right in the US without creating a standard that censors ANC supporters in South Africa or ethnically Japanese people in Japan or Muslims in Egypt.
posted by MattD at 8:28 PM on August 29, 2017 [8 favorites]


(hit send too early) but of course it can't also ignore its American advertisers or regulators nor can its executives ignore their neighbors or the other parents on the lacrosse sidelines. So a broad brush gets rolled out.
posted by MattD at 8:30 PM on August 29, 2017


Or are there other companies that have hired enough people to review a billion documents?

MetaFilter. (Approx.) a billion comments, (approx.) a dozen mods.
posted by sexyrobot at 8:46 PM on August 29, 2017 [3 favorites]


Would I be missing the point if I were to note that I didn't even make it to the first video without seeing an ad? Does this mean that videos about advertiser-unfriendly videos are not themselves advertiser-unfriendly? Or does it mean that advertiser-unfriendly videos are now sufficiently viral to attract advertisers?

Or am I just completely missing the point.

Because the chances are always above zero that such is the case with me.
posted by janey47 at 9:17 PM on August 29, 2017


It's perfectly fine for YouTubers to sell their own ads to run in front of, or during, their videos.

What's that - selling ad space is hard work? Huh.
posted by GuyZero at 9:48 PM on August 29, 2017 [4 favorites]


MetaFilter. (Approx) a billion comments, (approx.) a dozen mods.

My own inability to even conceive of the vast difference between one unthinkably huge number (say the number of MetaFilter comments) and another unthinkably huge number (the number of videos on YouTube) was kinda my point.
posted by straight at 9:49 PM on August 29, 2017



It's perfectly fine for YouTubers to sell their own ads to run in front of, or during, their videos.


Apologies for being totally ignorant here, is that a thing that is OK on Youtube? So you can splice in your Pepsi ad, with Pepsico paying you direct?

Seems like if the Google system is sucking there could be space for a middleman, an ad agency working direct with content creators?
posted by Meatbomb at 9:55 PM on August 29, 2017




Although I should add that per that policy creators can't insert videos that use the same ad formats that YouTube itself offers. So no prerolls but sponsored content is ok. I apparently misunderstood the policy myself until reading the help article just now, sorry.
posted by GuyZero at 10:07 PM on August 29, 2017 [2 favorites]


Yeah, there would be people who wouldn't be able to pay $LargeAmountOfMoney, but surely a system like this would help. Some. At least a little.

Just listen to the other people around here: status quo is God, status quo is God, status quo is God.
Nothing can be done, nothing should be done, nothing will be done.
Lie back. Watch the ads. Status quo is God.
posted by happyroach at 10:42 PM on August 29, 2017 [3 favorites]


But there already is a huge screening system in place - the crowdsourced voting. Human screeners would mostly just need to focus on content with a high proportion of negative votes. I really don't understand why Google does not tap into this.

And advertising could very easily be linked only to those videos which are generally liked (say, 95% positive votes) - that would protect the treasured brand identities and still leave the freaks to produce bad videos if they wanted to. I have yet to see a good, informative or well-made video which was overwhelmingly voted down.
posted by Laotic at 10:46 PM on August 29, 2017 [2 favorites]


I guess, but do not know, that dog whistling nazi videos gets lots of thumbs up and few negatives because who goes looking for that shit? Crowd rating data has its own selection biases.
posted by GuyZero at 11:04 PM on August 29, 2017 [7 favorites]


They could just try looking at the Nazis? Mad random suggestion I know.

Have to find them first. Then you need to determine if they are really a Nazi, or just being ironic...

What if you had to watch and tag some random 1 hour video in order to upload your video?


I only have a couple of videos I keep on there for archival purposes, but, if I did more, are you going to pay my bandwidth bills? I do have a cap...
posted by Samizdata at 11:10 PM on August 29, 2017


I guess, but do not know, that dog whistling nazi videos gets lots of thumbs up and few negatives because who goes looking for that shit? Crowd rating data has its own selection biases.

GuyZero, yes, one of the obstacles is closed communities - but I'm sure an algorithm could easily spot those and analyze who rates what. Also, as soon as a video breaks out, you'll have the broader public rating it and I trust massive statistics to do their magic.

I'm afraid the time will come when YouTube has to differentiate between viewer segments to please the advertisers, if it hasn't come yet. After all, Google's AdSense is all about that - not peddling mascara to an offroad enthusiast.
posted by Laotic at 11:42 PM on August 29, 2017 [1 favorite]


Linked indirectly above: “YouTube De-Monetization Explained”, by The Internet Creators Guild.

(The article includes some screen caps of graphs showing ad revenue fall off that look convincing but have no values on the Y-axis, so kind of lose their punch.)
posted by Going To Maine at 12:12 AM on August 30, 2017 [2 favorites]


Have to find them first. Then you need to determine if they are really a Nazi, or just being ironic...

I don't have much of a problem with treating "ironic Nazism" (sexism, racism, etc) as "actual Nazism" (sexism, racism, etc). It's a pernicious mode of discourse, and we could definitely stand to develop new ways of criticizing these issues that don't just become the things themselves.
posted by GenjiandProust at 12:41 AM on August 30, 2017 [4 favorites]


Last night I installed Lbry just so I can have a backup way to watch videos.
posted by MikeWarot at 1:45 AM on August 30, 2017


Why do I see ads on the not-advertiser-friendly channels/videos? I feel duped.

I thought I would go to each link and see a refreshing no-ad video each time, but NOOOO.
posted by filtergik at 2:39 AM on August 30, 2017 [2 favorites]


Is there a single article on this I can read, instead of twenty plus YouTube videos?

What? Posts like this and the recent article about pies keep me coming back to MetaFilter! If someone linked every letter of a paragraph to separate destinations, I would probably lose it, but few sites do this multi-word link-out style of article writing while still keeping it worth reading on its own.
posted by filtergik at 2:49 AM on August 30, 2017


Have to find them first. Then you need to determine if they are really a Nazi, or just being ironic...

I don't have much of a problem with treating "ironic Nazism" (sexism, racism, etc) as "actual Nazism" (sexism, racism, etc). It's a pernicious mode of discourse, and we could definitely stand to develop new ways of criticizing these issues that don't just become the things themselves.


I grok, even if I don't completely agree. I was just mocking some YouTubers and reminding Artw it's not always that easy. If it WAS, we would have a functional code tool for it.

And, I know everybody is complaining the YT algorithm is closed source, but, you have to admit that if they opened it up, there'd be 9 million ways to game it by next Tuesday. Just look at the SEO fight...
posted by Samizdata at 3:07 AM on August 30, 2017 [1 favorite]


It can't really have and enforce standards based upon what American liberals in 2017 consider to be beyond the pale, especially considering what a fast moving target that has become.

Overlooking what looks suspiciously like a dogwhistle...
Insofar as that's what large multinational advertisers demand - and it seems it is - they can't really do anything else. Their revenue stream depends on hitting acceptably close to the target of 'corporate clean' to keep the execs writing them cheques happy.
posted by Dysk at 3:22 AM on August 30, 2017 [1 favorite]


I guess, but do not know, that dog whistling nazi videos gets lots of thumbs up and few negatives because who goes looking for that shit? Crowd rating data has its own selection biases.

Just ignoring all videos with say, less than 1000 or 10,000 views would cut a significant portion of the 300 hours per minute deluge. If a video gets above a certain level of "engagement" then it gets flagged for review. Let the robots handle the constant stream of boring videos that makes up the rest.
posted by Mr.Encyclopedia at 4:09 AM on August 30, 2017 [1 favorite]
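The triage rule in that comment is simple enough to sketch directly. The threshold and field names are made up for illustration; the point is just that human review only kicks in once a video has real reach:

```python
# Sketch of the engagement-threshold triage proposed above: only queue
# videos for human review once they cross a view threshold; leave the
# long tail of low-traffic uploads to automated checks.

VIEW_THRESHOLD = 10_000

def needs_human_review(video: dict) -> bool:
    """Flag a video for a human only once it has real reach."""
    return video["views"] >= VIEW_THRESHOLD

videos = [
    {"id": "a", "views": 37},
    {"id": "b", "views": 2_400_000},
    {"id": "c", "views": 9_999},
]
queue = [v["id"] for v in videos if needs_human_review(v)]
print(queue)  # ['b']
```

Since view counts follow a steep power law, a cutoff like this discards the overwhelming majority of the 300-hours-per-minute deluge while still covering nearly all actual ad impressions.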


> But there already is a huge screening system in place - the crowdsourced voting. Human screeners would mostly just need to focus on content with a high proportion of negative votes. I really don't understand why Google does not tap into this.

This can only sound like a plausible solution if you've never heard of Digg nor Reddit nor the term "brigading".
posted by at by at 4:30 AM on August 30, 2017 [9 favorites]


That's why you need humans at the end of it. It doesn't matter if a video is reported sixty gajillion times; if it's already been reviewed (maybe even two or three times) and found clean, then future reports can be ignored. Hell, that gives you good data: if the same people are putting in bogus reports consistently (such as someone who's part of a 'report this user's videos' brigade), you can just silently shitcan all their reports in future.
posted by Dysk at 4:37 AM on August 30, 2017 [4 favorites]


That wouldn't work either. Trolls and creeps will create new burner accounts in bulk, just like they've been doing on Twitter for years.

If Google makes the account creation process onerous enough, all they've accomplished is creating a business opportunity for somebody in Asia to have a room full of people doing nothing but creating placeholder accounts to sell in lots of 1000 to Americans and Europeans.
posted by at by at 4:54 AM on August 30, 2017 [1 favorite]


Google is killing Millennials!!!!!1!!!!!!!!!!
posted by phunniemee at 6:19 AM on August 30, 2017 [1 favorite]


That wouldn't work either. Trolls and creeps will create new burner accounts in bulk, just like they've been doing on Twitter for years.

...so ignore reports from new accounts and accounts with little or no activity, let the trolls sit through a few hours of video for each burner. Slows them the hell down at least.
posted by Dysk at 7:12 AM on August 30, 2017 [6 favorites]


And obviously do all this silently - let the user enter a report as if everything was normal, and then just junk it if the account doesn't meet specs. Also, it doesn't matter how many thousands of burner accounts are reporting things if they're targeting the same content - once you've reviewed the brigading target's content and decided it's not a problem, just discard all future reports on that content (and, bonus, start discarding all reports from accounts reporting it). Like, unless the brigaders want to go find and report close to everything on the platform, with a separate burner for each, that's not really going to do much other than perhaps grow your queue a small amount.
posted by Dysk at 7:20 AM on August 30, 2017 [2 favorites]
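Dysk's report-handling rules can be sketched as a single gatekeeping function. All names and thresholds here are illustrative assumptions, not anything YouTube actually does:

```python
# Sketch of the report-handling rules suggested above: silently drop
# reports from new or inactive accounts, stop re-reviewing content a
# human has already cleared, and flag accounts that report cleared
# content so their future reports can be junked too.

MIN_ACCOUNT_AGE_DAYS = 30
MIN_ACTIVITY = 10

cleared_videos = set()      # videos a human reviewed and passed
flagged_reporters = set()   # accounts with a bogus-report history

def accept_report(reporter: dict, video_id: str) -> bool:
    """Return True if this report should reach the human review queue."""
    if reporter["age_days"] < MIN_ACCOUNT_AGE_DAYS:
        return False                    # likely burner account
    if reporter["activity"] < MIN_ACTIVITY:
        return False                    # no real usage history
    if reporter["id"] in flagged_reporters:
        return False                    # known bad-faith reporter
    if video_id in cleared_videos:
        flagged_reporters.add(reporter["id"])  # reporting cleared content
        return False
    return True

cleared_videos.add("v1")
troll = {"id": "t", "age_days": 90, "activity": 50}
print(accept_report(troll, "v1"))  # False: v1 already cleared
print(accept_report(troll, "v2"))  # False: now flagged as bad-faith
```

Note the feedback loop: one report against cleared content silently poisons all of that account's future reports, which is exactly the "shitcan all their reports" behaviour described above.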


As long as it helps wipe out the big money industry of douchebags yelling about women, trans people, and POC, then I will not lament if someone's craft invention or makeup tutorial was accidentally demonetized.
I hope it improves and the hiccups get ironed out, but fuck all the high-school level regressive clickbait assholes who want to "red pill" every tween playing minecraft.
posted by Theta States at 7:44 AM on August 30, 2017 [3 favorites]


To me it's always appeared that it's obviously possible for Google to get better at monitoring and filtering content. But the problem is that they don't really seem to have any real incentive to do so. For the most part they get the eyeballs they need, and if they demonetize content it seems to me that they just end up keeping the money they'd have paid to the creators.

Unless the creators or the advertisers leave en masse, there really just doesn't seem to be a good reason for them to change. There would have to be a good alternative platform first. We're well beyond the days where "don't be evil" had even a hint of being true.
posted by cirhosis at 7:48 AM on August 30, 2017 [2 favorites]


As long as it helps wipe out the big money industry of douchebags yelling about women, trans people, and POC, then I will not lament if someone's craft invention or makeup tutorial was accidentally demonetized.

There's no evidence that these waves of demonetisation are affecting those people at all, though.
posted by tobascodagama at 8:09 AM on August 30, 2017


So, the circumstance under which Google will offer you human support is uncomplicated: it's when you pay them.
posted by Construction Concern at 9:05 PM on August 29


Although sometimes not even then. For a while I offered my books through Apple, for which they kept 40% or so of the cover price, but you could never get a human being to explain what was happening when the platform was wonky (which happened all the time). I ended up giving up on them because their seller support was worse (if you can imagine this) than Amazon's. And Amazon is awful.
posted by joannemerriam at 8:18 AM on August 30, 2017 [1 favorite]


One thing I'm still not clear on, though it may have been explained already: If a video is demonetized, are ads still played on it?

Because if ads are still played on it, that's surely unethical.
posted by clawsoon at 9:04 AM on August 30, 2017 [2 favorites]


To me it's always appeared that it's obviously possible for Google to get better at monitoring and filtering content. But the problem is that they don't really seem to have any real incentive to do so. For the most part they get the eyeballs they need, and if they demonetize content it seems to me that they just end up keeping the money they'd have paid to the creators.
posted by cirhosis at 9:48 AM on August 30


It's my understanding that YouTube can't "keep" the money if content is demonetized. They only get money if an advertiser bids to place an ad on the video, which they keep a portion of. Without ads, the videos are only costing them money.
posted by boostergold at 10:29 AM on August 30, 2017


It's my understanding that YouTube can't "keep" the money if content is demonetized. They only get money if an advertiser bids to place an ad on the video, which they keep a portion of. Without ads, the videos are only costing them money.

You are likely right about that and I'm probably assuming more evil of them than they actually are... but still my general point stands. There just isn't an incentive for them to fix things as they stand. They have that huge stream of videos being uploaded, lots of those videos have ads and they make piles and piles of money on the current system. Why bother fixing things....

I mean besides the fact that people are starting to see that their "management" of this content is a huge joke.
posted by cirhosis at 10:53 AM on August 30, 2017 [1 favorite]


> This can only sound like a plausible solution if you've never heard of Digg nor Reddit nor the term "brigading".

FWIW, Google has an alternative tool for this: I just got an Opinion Rewards task for the first time in a long long time, and it's asking me to rate YouTube recommendations. It's pretty much impossible to brigade on Opinion because the tasks are diverse and randomly assigned. They only pay out in Google Play money, and I'm pretty sure they limit your fiscal upside to prevent scripts / botnets from gaming the system. I'm guessing they also do some demographic normalization to avoid biased samples. Total payment for 2 'is this video recommendation appropriate' tasks: 17 cents.

TL;DR: Google has a fairly robust means of farming out human labor for content flagging, and it scales with money.
posted by pwnguin at 3:31 PM on August 30, 2017 [3 favorites]


This can only sound like a plausible solution if you've never heard of Digg nor Reddit nor the term "brigading".

Well, where there is no will, there is no way. YouTube comments are particularly toxic and vicious, and the reason, I believe, is that there is little policy. As for Twitter, we discussed that recently here. Point being, once there is a firm decision, there is a way, and even people here are suggesting workable solutions.
posted by Laotic at 12:39 AM on August 31, 2017 [1 favorite]




This thread has been archived and is closed to new comments