YouTube Shifts Another Burden To Creators
November 24, 2019 9:42 AM   Subscribe

This past September, the FTC reached a $170M settlement with YouTube over their knowing and intentional violation of COPPA. In response, YouTube has implemented policies well beyond what the FTC required which are having major negative impacts on content creators, as explained by Dan Olson (a.k.a. Folding Ideas) in his video, What's Gone Wrong With the FTC's COPPA Agreement With YouTube.

COPPA (the Children's Online Privacy Protection Act) is, in summary, a law limiting data collection from children under 13 years of age. To quote the FTC:
“YouTube touted its popularity with children to prospective corporate clients,” said FTC Chairman Joe Simons. “Yet when it came to complying with COPPA, the company refused to acknowledge that portions of its platform were clearly directed to kids. There’s no excuse for YouTube’s violations of the law.”
Unfortunately, YouTube's handling after the fact has been chaotic at best, leaving content creators scrambling to understand their responsibilities, and worried about legal liabilities being passed on to them by YouTube.

For a specific example, Zee Bashew (previously on MeFi) has removed most of his animated D&D videos from the platform, stating:
Youtube basically used creators to attract 2-12 year olds and instead of making policies to avoid collecting their data.... YT used them to turn a massive profit on unauthorized data collection. So, they got fined. Massively massively fined.

...

As time goes on, and we see how this whole thing shakes out, maybe this was a huge overreaction, but I'm not somebody who takes risks on this kind of stuff.

If I were a more optimistic less anxious man, probably it would have been four or five vids set taken down but I'm a basketcase. So it's many more than that.
Folding Ideas previously on MeFi, though likely there are others.
posted by tocts (47 comments total) 22 users marked this as a favorite
 
Man youtube is a mess
posted by Homo neanderthalensis at 10:22 AM on November 24, 2019 [7 favorites]


Well, of course: if YouTube deliberately botches the implementation of a regulation they would rather ignore, so that their frightened users complain about the unfairness of the evil regulating agency instead of YouTube's rapacity and bad-faith implementation of the rule, that's a win for them.
posted by sukeban at 10:22 AM on November 24, 2019 [12 favorites]


I don't understand the implication here - how can YouTube know what the intended audience for a video is unless the creator clearly communicates that? Or is the implication that each YT video should be audited and tagged by YT itself?
posted by mavrik at 11:45 AM on November 24, 2019


It looks like YT is expecting creators to flag videos -- but providing no guidance on what should be flagged. Videos that are flagged make creators less money, incentivizing them to not flag stuff. And there is a LOT of content that is clearly in a gray area: Gaming stuff, toy reviews, family friendly shorts and on and on.

Basically 'if you're not paying you're the product' is running full speed into 'kids can't consent to be products.'
posted by Frayed Knot at 11:52 AM on November 24, 2019 [24 favorites]


Well, the rules are pretty open on what they can label :
In determining whether a Web site or online service, or a portion thereof, is directed to children, the Commission will consider its subject matter, visual content, use of animated characters or child-oriented activities and incentives, music or other audio content, age of models, presence of child celebrities or celebrities who appeal to children, language or other characteristics of the Web site or online service, as well as whether advertising promoting or appearing on the Web site or online service is directed to children. The Commission will also consider competent and reliable empirical evidence regarding audience composition, and evidence regarding the intended audience.
I don't know that I'm super sad that the bar to profitability on kids-YouTube (which is weird) will be slightly higher, and that the GoogleOverlord may nuke some kid-like content that doesn't self-declare, or learn to identify what's imitating kids-YouTube. Unless they make a logged-in kids' account (which parents might like the convenience of being able to review) with consent to track, then they have to go back to less invasive ad targeting. The ad revenue might not be that much worse, especially if they have a large enough sample of yes-track kids to say who is statistically watching videos, even if they don't know exactly who they're serving the ad to.

Have other child-centered online presences been clobbered over cookies, or is this just the commission going after the deepest pockets first? E.g., does Disney ignore cookie-like data on their empire?
posted by a robot made out of meat at 12:29 PM on November 24, 2019 [1 favorite]


Of course YT can't provide any guidance on what should be flagged. They don't make that decision, the FTC does.
posted by pingu at 12:30 PM on November 24, 2019


It seems like there are two options for YT here:

1. They algorithmically determine what's kid's content (obviously bad)
2. The people who make the content determine if it's kid's content

I'm not sure why option 2 is worse.
posted by No One Ever Does at 12:35 PM on November 24, 2019 [2 favorites]


If YouTube were a pure hosting service, like a white-label web host or server provider, I could see the logic behind putting COPPA liability entirely on creators. We (mostly) don't expect a simple web host to exert proactive editorial control over the content they host, in part because each site is clearly independent of the platform and could plausibly migrate to another one.

YouTube isn't a pure hosting provider, though. YouTube itself advertises on every video, and drives a lot (most?) of the traffic to every video via its recommendation algorithm. Every video you watch on YouTube is prominently branded as part of YouTube. They have explicit rules about what content is acceptable and how you're allowed to monetize your videos.

AFAICT, YouTube's claim to be a "platform" is that they are open to user-generated videos without prior permission: everything else about their behavior screams "publisher", and IMO should make them liable for the content on their website. It's possible that the review requirements needed to manage that liability would make YouTube unsustainable to operate, but that's their problem.

(None of this is legal commentary, btw. I don't have a good enough understanding of CDA 230, COPPA, and the other law here to say what the actual legal burdens are. This is just how I'd like things to work. :P)
posted by a device for making your enemy change his mind at 12:35 PM on November 24, 2019 [13 favorites]


I'm not sure why option 2 is worse.

I would prefer option 1. YouTube knows better than me whether my videos are being watched by children or not - how am I supposed to make the call? I may make videos intended for adults, but I can't control or know that children are attracted to it.

I would have preferred a hypothetical option 3, where instead of tagging videos as "for children" you could tag videos as "for adults" and let YouTube put up the fence and stop children from viewing it. I have no interest in getting children to watch my videos!

Also I'm not clear about whether the FTC has any jurisdiction over foreign users. Can they actually fine individual users from Europe for non-compliance with COPPA, and how would that work? Send them an invoice for $42,000 per video? Ban them from coming to the US if they don't pay up?

As an aside, I find it funny that games with too much gun violence and gore were getting demonetized for being not family friendly enough, and now family friendly games like Minecraft and Roblox will likely get demonetized as well because they attract kids. I am rather sure I have videos in my channel that could potentially be in violation of both simultaneously!
posted by xdvesper at 1:38 PM on November 24, 2019 [5 favorites]


As someone who follows a lot of YouTube creators and has heard about this decision through them, I think it’s a shitty way of shifting responsibility onto creators without the clear guidance and support they need to be successful, which seems to sum up basically all YouTube decisions.
posted by the thorn bushes have roses at 1:48 PM on November 24, 2019 [1 favorite]


There's been this weird situation for years now where YouTube offers significant to life-changing amounts of money in exchange for posting videos on the internet, to the point that people have organized their lives around posting videos on the internet and have come to interpret receiving money in exchange for posting videos on the internet as their god-given birthright, while YouTube just sort of shrugs and is all "maybe there's money, maybe there isn't, who knows, we'll see what happens tomorrow."

But that means that given YouTube's intentionally-confusing attitude and shifting policies, any change like this is cause for a level-1 panic that the whole artifice could come crashing down tomorrow, and here we are. And then that intersects with YouTube culture, generating a hundred thousand insufferable "my inalienable right to magic internet money has been threatened" videos, which would be a lot more funny if all these people weren't being treated as toys by one of the largest companies in the world.

I do think the FTC shares a lot of blame though. The FTC's position is that individuals who upload videos to YouTube are individually responsible for compliance with COPPA:
So how does COPPA apply to channel owners who upload their content to YouTube or another third-party platform? COPPA applies in the same way it would if the channel owner had its own website or app. If a channel owner uploads content to a platform like YouTube, the channel might meet the definition of a “website or online service” covered by COPPA, depending on the nature of the content and the information collected.
How convenient for YouTube! The person who clicked the upload button is legally responsible for all the tracking and ad serving that's actually being done by YouTube. The FTC somehow believes that Google's entire advertising infrastructure is simply an agent working on behalf of the channel owner, even as the channel owner has no control over any of it.
posted by zachlipton at 1:51 PM on November 24, 2019 [18 favorites]


I would prefer option 1. YouTube knows better than me whether my videos are being watched by children or not - how am I supposed to make the call? I may make videos intended for adults, but I can't control or know that children are attracted to it.

In the past, when youtube has taken option 1, they've received a great deal of flak for it—no algorithm is going to be perfect, and it can cause a great deal of harm to channel creators if they do it incorrectly. By allowing the creator to set it, they're preventing the algorithm from being able to harm a channel. On a second look at the documentation they provide, it seems that they will use an algorithm if the value isn't set.

I would have preferred a hypothetical option 3, where instead of tagging videos as "for children" you could tag videos as "for adults" and let YouTube put up the fence and stop children from viewing it. I have no interest in getting children to watch my videos!

You can enable an 18+ age restriction on videos, at least in the new, terrible UI. Under the setting for made for kids, you can restrict the video to an adult audience.
posted by No One Ever Does at 2:00 PM on November 24, 2019


Did any of you watch the FPP video? Apparently they're doing both options 1 and 2 - allowing creators to tag videos as child-oriented, and also roving around algorithmically tagging other things as child-oriented, a tag which fucks up their monetization efforts and may even get them fined, without notice.

Anyway the ultimate issue is not whether you post child-oriented content online or not, but rather whether and how you collect personal information of children 12 and under. As noted by Zee Bashew and quoted below the fold:
Youtube basically used creators to attract 2-12 year olds and instead of making policies to avoid collecting their data.... YT used them to turn a massive profit on unauthorized data collection.
So basically YouTube could have stopped collecting personal data on videos where it reasonably believed, by express notice or algorithmic detection, that the target audience was 12 and under. Instead they do the algorithmic detection, but instead of just obeying the rule they kick it back to the creator and say "whoa there! You better do something!"

... on preview I see zachlipton has noted the same issue (at least partially). We can all stipulate that the intent of COPPA is right and good, and still point out that YouTube's response is fucked up.
posted by Joey Buttafoucault at 2:00 PM on November 24, 2019 [5 favorites]


Would YouTube potentially be able to do something like Firefox does with the Pocket content on its home screen? The displayed content is personalized, but the logic is on the client side so it doesn't require submitting personal information to Mozilla.
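
As a rough illustration of the "Pocket approach" described above, here's a minimal sketch of client-side personalization. The server ships the same generic candidate list to everyone, and ranking against the user's interest profile happens entirely on the device, so no viewing history is ever uploaded. All names and data here are hypothetical, invented for illustration; this is not any real YouTube or Mozilla API.

```python
# Hypothetical client-side personalization sketch: the server sends
# non-personalized candidates, and the client ranks them locally
# against an interest profile that never leaves the device.

def rank_locally(candidates, local_profile):
    """Score candidates by dot product with the on-device interest profile."""
    def score(item):
        return sum(local_profile.get(topic, 0.0) * weight
                   for topic, weight in item["topics"].items())
    return sorted(candidates, key=score, reverse=True)

# Server-supplied candidate list (identical for every client).
candidates = [
    {"id": "vid1", "topics": {"gaming": 0.9, "music": 0.1}},
    {"id": "vid2", "topics": {"cooking": 0.8}},
    {"id": "vid3", "topics": {"gaming": 0.4, "cooking": 0.5}},
]

# Interest profile built and stored only on the client.
local_profile = {"cooking": 1.0, "gaming": 0.2}

ranked = rank_locally(candidates, local_profile)
print([item["id"] for item in ranked])  # → ['vid2', 'vid3', 'vid1']
```

The trade-off, as the comment goes on to note, is that the operator never sees the profile, which is exactly why an ad-funded platform would resist it.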

This of course ignores the fact that YouTube wants this data. Even if the Pocket approach would be good enough to sustain YouTube (and I don't know if it would), it's likely it would be less profitable than being able to collect and analyze your data on the server side. Given a choice between less profit and more profit, it's obvious what a for-profit company is going to do.

But just because they want more profit doesn't mean they have a right to it above all other concerns. Fines are supposed to rebalance that equation: make it unprofitable to do things that we consider harmful (e.g. collecting personal data about children).

YouTube is instead passing the burden onto individual creators on the platform, which is ingenious and pretty evil, because the policies and fines are calibrated for businesses doing direct data collection and directly profiting off of it in some way. They're an outsized punishment for a one-person shop doing Let's Play Mario Maker, who doesn't collect the data themselves and only indirectly and marginally profits off of it.

It makes the original policy seem overly burdensome and cruel, even though it was a pretty reasonable request of a business with YouTube's resources.
posted by Riki tiki at 2:01 PM on November 24, 2019 [3 favorites]


It's interesting to compare this with the recent CGPGrey video about being suspended from YouTube (spoilers: he was flagged for impersonating... CGPGrey.)

If YouTube's automated systems are going to pull crap like that, channels are going to get videos flagged that should not be flagged, and the appeal process will not be easy.
posted by SansPoint at 2:19 PM on November 24, 2019 [4 favorites]


How convenient for YouTube! The person who clicked the upload button is legally responsible for all the tracking and ad serving that's actually being done by YouTube. The FTC somehow believes that Google's entire advertising infrastructure is simply an agent working on behalf of the channel owner, even as the channel owner has no control over any of it.

I looked up the text of COPPA to see how it defines an "operator," someone who's responsible (and liable) for complying with the rule.
Operator means any person who operates a Web site located on the Internet or an online service and who collects or maintains personal information from or about the users of or visitors to such Web site or online service, or on whose behalf such information is collected or maintained... Personal information is collected or maintained on behalf of an operator when:
  1. It is collected or maintained by an agent or service provider of the operator; or
  2. The operator benefits by allowing another person to collect personal information directly from users of such Web site or online service.
YouTube's position appears to be that channel owners are considered operators under COPPA because they benefit from data collection via higher advertising payouts and traffic from the recommendation system. I can see how they arrived there by a lawyerly reading of the rule. The FTC could solve this with an amendment exempting ordinary users who have no authority over data-collection practices from operator status. Or YouTube could simply handle traffic from users who aren't logged in according to COPPA rules. Either way works.

Also, the FTC is currently accepting comments from the public for potential revisions to COPPA.

Let me post that link again: if anyone wants to tell the FTC they need to fix COPPA, you have until December 9.
posted by skymt at 2:20 PM on November 24, 2019 [3 favorites]


It's particularly annoying for those of us outside the USA, as we're still being threatened with COPPA compliance and yet there's nothing we can do to influence the COPPA process.
posted by scruss at 2:51 PM on November 24, 2019


we're still being threatened with COPPA compliance and yet there's nothing we can do to influence the COPPA process.

When the EU implemented its data-privacy/cookie regulations, US Web sites suddenly had to deal with something they likely didn't see coming or know how to handle. Some news sites reacted by blocking users coming in from European IP addresses. Operators of some smaller sites (raises hand) tried at first to comply, got tired of complaints from their majority-American audiences and just disabled those cookie-warning popups.
posted by adamg at 5:37 PM on November 24, 2019


This Game Theory video explains the categories of video deemed as "for children" - which potentially includes anything that is animated and anything referring to characters or celebs liked by kids even if the content itself isn't directed at kids (eg analysis videos of cartoons like Steven Universe).
posted by divabat at 5:56 PM on November 24, 2019


As long as people have trouble understanding the difference between "something kids might watch" and "created to appeal to children," they will be upset about having to identify whether their videos are targeted at children.
posted by wierdo at 5:57 PM on November 24, 2019 [1 favorite]


I mean, in this case it doesn't matter because the distinction between the two is meaningless for COPPA compliance - whether your videos are for kids or for all ages, you can't collect information on children 12 or under without their parents' explicit consent.
posted by Merus at 6:45 PM on November 24, 2019


What Joey Buttafoucault said. YouTube isn't being forced to decide between Option 1 (algorithmic identification) and Option 2 (creators self-identifying content as kid-directed or not). The FTC only mandates the second option, and while one dissenting opinion from the board says there should be a technological backstop to prevent abuse of self-identification (basically Option 1), the majority opinion notes the high possibility of undesirable outcomes as a result of such a system.

The money quote from the video, which is similar to the sentiment from Zee Bashew posted above:
While YouTube could simply remove child-directed advertising from their behavioral ad model entirely or use their clearly demonstrated ability to discern under-13 viewers and disable behavioral advertising selectively at the user level, they have instead chosen to structure their system to place the burden and consequences overwhelmingly on channel operators. YouTube is, in effect, trying to build a scenario where they are still allowed to gather and utilize children's data in serving targeted, behavioral ads with the blame falling on channel owners for allowing them to gather that information and serve those ads.
The Folding Ideas video goes on to explain that channel operators don't even have the tools in many cases to turn off child-directed ads on their channels themselves, which would theoretically allow channels to opt out completely of this whole mess. The basic AdSense controls don't allow you to turn off targeting to a certain age group. Your only option in that case is to turn off behaviorally targeted advertising entirely, which results in a direct hit to revenue.

Which isn't to say that you shouldn't do it anyways; I think behaviorally targeted advertising is a potential ethical nightmare and the fact that it's so tightly woven into the fabric of the modern internet bothers me, even as I make a living off of it (as millions of people do every day). But you can see the incredibly obvious reason why this would be a non-starter for a lot of channels.

If YouTube is hoping that this operational confusion will lead to YouTubers railing against regulatory bodies that have suddenly cut into their revenue or made their business more unstable, I think this will backfire. Instead, you'll probably just find that more YouTubers will abandon the platform or scale back their activities to stuff that they know will 100% be safe under the new regime, and then hope that the system doesn't change again in the future. And because this is a tech platform we're talking about, it will change again, and eventually people will weigh the stability of the platform's policies itself when deciding whether this is the sort of thing they want to do for a living.
posted by chrominance at 6:50 PM on November 24, 2019 [4 favorites]


I'm a little disappointed that so many are so easily pooling into the easy side-channel distraction over what IS kid's content after all? when the main issue is elsewhere. At the risk of speaking Twitter on MeFi:
IT'S THE DATA COLLECTION, NOT THE CONTENT OR ALGORITHMS
IT'S THE DATA COLLECTION, NOT THE CONTENT OR ALGORITHMS
IT'S THE DATA COLLECTION, NOT THE CONTENT OR ALGORITHMS
posted by bartleby at 8:22 PM on November 24, 2019 [12 favorites]


Actually, I'm curious: Google works out that a viewer is under 13 by looking at what they watch. How do they do that if they're not supposed to be collecting data on them? Does turning off data collection on accounts identified as under-13 mean that Google loses the ability to identify under-13s?
posted by Merus at 8:52 PM on November 24, 2019 [1 favorite]


Semi-related gripe: Roblox. I have gotten multiple emails from Roblox asking me to approve my kid’s account. (My kid isn’t on Roblox at all, but my email address is a Gmail that is apparently a magnet for misdirected mail.)

Every single time I have denied the account, stating (quite honestly) that I did not give my child permission to set up an account. (I don’t want my kid on there, and I don’t want someone else’s kid tied to my email address!)

After digging a little, it turns out that (1) they collect the full name and birthdate of the kid during account setup, among other demographic data, and (2) they don’t ask for parental permission until AFTER the kid’s account is active and in use. When I complained to them that this seems to be a clear violation of COPPA - e.g. if they need age they can simply ask if the kid is under age X, they don’t need an exact age, ESPECIALLY not without parental consent first - the low-level tech support person assured me that this was totally fine because they don’t share the data.

My immediate response was “I don’t trust you, or anyone else, to properly secure any data whatsoever, just given the track record of security and data breaches at so many websites, please blacklist my email address and ban anyone from using it with any service associated with your company.”

Not a week after the last time this happened, a friend was using her computer at a meeting and was hit with multiple pop ups from Roblox. She had let her kid try it, the software was set by default to auto-launch every time the computer booted up, it was difficult to make it stop and even harder for her to just kill it and uninstall it so her computer was usable. Yep. Clearly a company I can trust with my kid’s demographic info.

Having said all that: I applaud YouTube for taking any steps toward keeping kids and their data safer online, but rather than put the onus on content producers, they should actually take responsibility. Update the TOS with clear rules, ban accounts that don’t follow them, set up a feedback tool for end users to report problems, and use some sort of AI to flag suspect videos. You know, like they do for the RIAA/MPAA. They have the tech to do it, and Google has deep enough pockets to fund it. They know damn well there is a problem here. It isn’t our job to fix their failure to do some minimal amount of screening.
posted by caution live frogs at 5:57 AM on November 25, 2019 [1 favorite]


My 12 yo son has had a Google account through his school for several years. I know he looks at YT at school and at home when on his school-issued Chromebook.

It seems to me that YT is forcing an either/or situation on content creators when it is entirely within their means to offer a both/and solution.

I wonder how Twitch handles this issue?
posted by Big Al 8000 at 9:32 AM on November 25, 2019


I'm not sure how exactly Google's bots are supposed to infer intent? Even Google's most... optimistic... marketing has yet to claim their magic AI is able to do that. And yes, under current law and FTC rules, intent matters here. If your video is intended to appeal to children it must be handled differently by the platform than videos not intended for children.

Well, YouTube could disable comments, cookie tracking, and all that by default on all videos, but it's fairly obvious why they aren't going to do that. Even if they were willing to take the hit to ad rates, YouTubers would (quite understandably) flip their shit over the loss of income. Literally anything they do here will result in people shouting at them.

Anyone know what the earning rates for views by YT Premium subscribers are these days? At one point they were said to be higher than the rates for views with ads, so I'm wondering if that remains true.
posted by wierdo at 10:53 AM on November 25, 2019


Just got my channel across the monetization threshold and now this. I’ve got no idea what to do.
posted by interogative mood at 11:51 AM on November 25, 2019


Literally anything they do here will result in people shouting at them.

There's a strong argument that this particular requirement was foreseeable and that YT has been inviting publishers to play regulatory arbitrage up until now, however.
posted by PMdixon at 11:55 AM on November 25, 2019 [1 favorite]


generating a hundred thousand insufferable "my inalienable right to magic internet money has been threatened" videos

I just want to point out that this is a really amazingly uncharitable take on YouTube content creators (note: I am not a content creator).

We don't refer to the money paid to e.g. TV/film writers or producers as "magic movie money" as if they should be happy to work for free and take whatever pennies make it to them. Producing video or audio content from which you then derive your living selling it directly or via advertisement-driven compensation is a legitimate profession, though like all creative professions there are amateurs interested to get their stuff out there too. Talking about creatives on YouTube as entitled brats, literally calling them "insufferable" for thinking they should be paid for their work, is super fucking gross.
posted by tocts at 11:58 AM on November 25, 2019 [10 favorites]


Just got my channel across the monetization threshold and now this. I’ve got no idea what to do.

If you made a video for preteens with them as your intended audience, check the box. If you didn't intentionally create the video to appeal to children, and it didn't through some confluence of events end up looking almost exactly like one that is intended for children (the FTC's fact sheet has examples if you aren't clear on what children's TV looks like), you can confidently mark it as not being intended for children.

That's it, in terms of what YouTube is asking you to do.
posted by wierdo at 12:10 PM on November 25, 2019


I do think creative professionals should get paid for their work and certainly deserve to be treated a hell of a lot better than they are by YouTube. But I don't think we've quite grappled with the shift from "getting paid to make video is a thing that happens after actual humans have made a decision to hire you and you've negotiated a contract for your work" to "YouTube is a capricious beast that may or may not deliver life-changing amounts of money in exchange for video sight-unseen." The set of incentives that system produces, the kinds of content it rewards and punishes, the ways creators can be boosted or shunned by an algorithm, these are not inherently good things. The world all of a sudden has a massive new Disney Channel or Nickelodeon that pays producers for whatever captivates the eyeballs of children without even looking at it first or regard for quality or safety.

That's what I mean by "magic internet money." Nobody's negotiated some fair agreement to exchange money for creative labor here; it's just money spewing forth from a black box fountain, and everyone is terrified of anything that might make it stop. And I don't blame them; creators have upended their lives, quit their jobs or dropped out of school to make a living from a company they can't even call on the phone based on a vague arrangement that uploading video leads to unspecified amounts of money. But there are creators on YouTube who are suddenly worried that they don't create videos for children yet a lot of their audience seem to be young children and they're not sure how this will impact them. And I sympathize with, say, model train enthusiasts or toy collectors where there's ambiguity, but for the PewDiePies or MrBeasts of the world, I don't know, maybe some creators have to take some responsibility for their audience of children themselves? If millions of kids under 13 are really watching your videos and idolizing you, isn't it at least a little bit on you to act in accordance with that fact, legally and morally?

Google is a $900B corporation; it should pay people for the content on which it makes its money. But somehow in the last few years, making a living by uploading videos of yourself eating breakfast has gone from a laughable idea to a reality for some people. Some creators have taken the approach that they've entered into a horribly one-sided business relationship with Google and want to join together to negotiate a fairer deal. That is, after all, what TV/film writers and producers have done for decades, and are doing right now, to fight for what's theirs.
posted by zachlipton at 2:54 PM on November 25, 2019 [6 favorites]


I do think creative professionals should get paid for their work

...

making a living by uploading videos of yourself eating breakfast has gone from a laughable idea to a reality for some people


Yeah, no. It's rather hard to take seriously your claim of sympathy for creators when your concept of what people are largely doing on YouTube is what amounts to an Instagram joke from 5 years ago.
posted by tocts at 4:24 PM on November 25, 2019


There are many absolutely incredible creators on YouTube who are not making magical internet money. They are working incredibly hard to create compelling content that is way more than just some kind of Casey Neistat style vlog (though I do think Casey has some interesting content). There are makers who might create something like, say, a Minecraft-themed project that might appeal to children yet nonetheless is being made by adults for adults. Should they have to pay $40k+ for violating COPPA in an ambiguous case like this when they weren't given any guidance on whether they need to demonetize a video like this? It's the smaller creators who are doing the best they can and working really hard more than the mega-popular bros who are being screwed over here. I don't think it's fair to equate PewDiePie with those smaller maker channels.

I really strongly back up what tocts is saying here: Producing video or audio content from which you then derive your living, selling it directly or via advertisement-driven compensation, is a legitimate profession. YouTube depends on these makers to survive and yet consistently makes decisions that throw them under the bus.

One of my favorite channels, Evan and Katelyn, the channel I was thinking about with that Minecraft example, talked at length on their podcast (non-YouTube ways to listen are at the bottom) about their worries over this decision. It's also interesting to hear about YouTube's P-Score during the same episode.
posted by the thorn bushes have roses at 4:42 PM on November 25, 2019 [1 favorite]


There seems to be a belief that there's some obvious, straightforward solution to this problem that Google is just refusing to implement out of spite or laziness. But I don't think there is.

Take the idea that uploaders should be able to classify their videos as not-for-children to protect them from the algorithm, and then Google should just not show them to children (or not collect data if it does). But how does Google do that? A lot of children watch YouTube on shared family devices without logging in, or while logged in as some other member of the family, and while Google may have a good enough idea who's watching to be able to sell advertising, that doesn't mean that they have enough confidence to convince the regulator that they aren't collecting data on children when they collect data on the household in general. Especially because uploaders have an obvious incentive to game the system.

Judging videos by their content is probably the least-worst compromise solution, and it seems to be consistent with the regulator's general approach. On a Youtube scale that, unfortunately, means algorithms.

There's also the idea that Google is somehow shifting responsibility for compliance to uploaders, when what seems to actually have happened is that Google and uploaders have always been independently responsible for compliance, and now that Google has been forced to comply themselves they've started reminding uploaders of their own responsibilities. Google can't give legal advice about what a video for children is, because that's a matter for the regulator and if the regulator decides to start suing people Google's advice will not protect them. And they can't reveal how their algorithm works without telling hundreds of thousands of uploaders how to make videos that are attractive to the vast audience of children but aren't caught. So here we are.
posted by A Thousand Baited Hooks at 5:23 PM on November 25, 2019 [1 favorite]


Producing video or audio content from which you then derive your living selling it directly or via advertisement-driven compensation is a legitimate profession

I suppose. Or you could view much of it like the internet equivalent of busking, except you have to work the same corner every day and at any point the cops might chase you off.

But I'm old, cranky, and apparently unpopular, at least based on my adsense earnings.

Anybody got figures on how many people are actually "making a living" at this? I always figured it was a handful of outliers making more than a couple grand a year.

Of course, none of this is particularly germane to the topic at hand, which is that Alphabet continues to try to avoid any kind of regulation by whatever means possible, while gobbling up and data-farming what's rapidly become the lion's share of content creation online.
posted by aspersioncast at 5:25 PM on November 25, 2019 [1 favorite]


The number of people making a living is probably larger than you expect; but their earnings are meager. That’s the channels with 50k-100k subscribers and enough of a community to get some patreon and other support. Mostly those folks are moonlighting though and the money supplements their primary income.
posted by interogative mood at 6:29 PM on November 25, 2019 [1 favorite]


Yeah, no. It's rather hard to take seriously your claim of sympathy for creators when your concept of what people are largely doing on YouTube is what amounts to an Instagram joke from 5 years ago.

I'm not sure I understand what point you're trying to make besides upholding the honor and hard work of YouTube creators. YouTube has, in the space of a few years, introduced a model for compensating people for creative work that has attributes that seem to me to be unique in the history of paying for art. That's produced fantastic results from people who have poured their lives into their work: amazing science and educational videos; singers and performers; tutorials on how to do or make just about anything, from people who didn't have to convince the usual set of biased gatekeepers to let them create. It's produced a lot of what I'll be open about dismissively calling content for the sake of content: "24 Hours I Only Ate Foods The WRONG WAY!" and "25 FOODS YOU'RE EATING WRONG," which come from a mix of individuals trying to build their own channels and content farms churning out a dozen "top N X's that Y!" videos a week, which are fine and often fun and surely work for the people making them, and also often generate more views than many TV shows and are the kind of thing where readers of Neil Postman might ask whether it's a net good for society to have an economic system that encourages the seemingly endless production of such videos. And it's produced some questionable outcomes: massively popular stars who have built up armies of children who idolize them yet are fairly unconstrained by the commercial realities that normally prevent such stars from using, say, homophobic epithets; family vloggers who have turned their children's lives into reality shows in ways that are sometimes uncomfortable; algorithmically generated weird children's programming. Just like display advertising on the web and the rise and fall in AdSense rates, YouTube's model has produced some strange incentives, some fantastic and some pretty dark, and I don't think it's unreasonable to be somewhat critical of what those are.

I've been clear that this is mainly on YouTube. They led creators to believe they could profit off this system, to the extent a number of them organized their lives around it, and now are leaving them holding the bag with unclear guidance. That's been how YouTube has always operated, as opaquely and mysteriously as possible. I entirely agree that producing video or audio content from which you then derive your living selling it directly or via advertisement-driven compensation is a legitimate profession; my point is that nothing about the way YouTube works gives the impression that they're treating it as a legitimate profession; they want it to just be a black box that may or may not produce a 55% split of something under unspecified conditions when people have organized their lives around this work. And when the obvious problems with this model of paying creative people for their work become apparent, YouTube's answer is to add more algorithms and more vague policies, making the problem worse. Nothing about the way YouTube operates the system screams "legitimate profession" the same way a WGA member getting paid to write for television does; it looks a lot more like busking.

Creators on YouTube have thrown themselves into a fickle advertising market over which they have no control, often due to lack of any other option, and much like what's happened with display ads (the bust of which caused harm to content mills and amazing writers alike), it's not clear to me that the future market for paying people to produce Minecraft videos sight unseen is looking incredibly bright. We've jumped right to worrying about whether a Minecraft video creator's earnings will decrease without stopping to ponder the system under which people are getting paid by an algorithm to produce Minecraft videos in the first place and whether that's sustainable.

I guess there's two ways to look at it. One is that a lot of creative and talented people deserve to get paid for their amazing work that wouldn't exist under any other system and are locked in a fight with very little leverage against one of the world's largest corporations for fair treatment. The other is that YouTube can be a ruthless amoral machine for converting eyeballs into dollars in ways that have no real regard for humanity and can cause societal harm. And I think both are true.
posted by zachlipton at 7:09 PM on November 25, 2019 [2 favorites]


I'm not sure I understand what point you're trying to make besides upholding the honor and hard work of YouTube creators.

The point I'm making is that you are being super dismissive of the impact this is going to have on real people by using the most condescending examples possible, and that despite you claiming to give a shit about creators you clearly would rather steer the conversation towards "oh well but really the whole thing should be torn down anyways".

The reality is that what Google is doing here isn't going to hurt PewDiePie -- he's got enough followers that he can make whatever money he wants with direct promotional support from companies (which totally sidesteps COPPA, being un-targeted), patreon or patreon-like funding, etc. It's also not going to affect people making $5 a month posting randomly recorded Minecraft videos -- they aren't putting a lot of effort in, don't need the money, and aren't going to miss it if they end up algorithmically stuck into the "for kids" bucket, even if they didn't intend to be there.

The people who are going to be affected by this are those who are doing this for a living -- whether as a second job making up a part of the rent, or as a full time job. These are people who are putting in hard work to produce good content and who have been working under a good faith assumption that YouTube was already doing the right thing by COPPA, and now they're being told that actually no, YouTube was breaking the law this whole time in a way they had no visibility into and by the way now it's gonna be put on them to deal with it. And your comments, which amount to "well yeah but YouTube has always been bad to creators so let's focus on destroying the system instead and who cares about what happens to creators", frankly suck.
posted by tocts at 3:41 AM on November 26, 2019 [1 favorite]


who have been working under a good faith assumption that YouTube was already doing the right thing by COPPA

Really? I kinda doubt most of them had any idea what COPPA is, on the basis that most Americans don't know what it is. I really don't think it's appropriate to suggest that people publishing on YouTube have been very concerned about fulfilling their regulatory mandates this whole time, relying on a good faith belief in some assurance by YouTube that they were taken care of. Is there some other reading I should be applying here?
posted by PMdixon at 4:51 AM on November 26, 2019


My position is more "well yeah but YouTube has always been bad to creators so let's focus on destroying the system instead and finding a better way to fund creators" (one that hopefully involves slightly fewer perverse incentives).

Although I dunno, frankly I do feel kinda dismissive about it. I love how easy it is to find tutorials on YouTube, but I'm really not sure steering vast amounts of labor and electricity into server farms that are largely providing global distribution for people's phone videos (and apparently helping to steer budding MRA/conspiracy theorist-types toward content) is exactly something to strive for or defend.

I really am sorry that people are affected by this, but as someone who already lost a little revenue stream to the death of the music industry over a decade ago, I'd have to say that there was plenty of warning. "Creatives" who put all their eggs in this basket were ill-advised, and people who are making supplementary income this way can hopefully afford to find different platforms. If there were a mass exodus from YouTube, maybe the company would actually change its behavior.

Again, I know that sounds harsh, but none of this is meant to be a referendum on whether making video content for a social media platform is a worthwhile endeavor. YouTube's bad behavior is going to hurt people, and the company will have to be forced to reckon with it.
posted by aspersioncast at 5:27 AM on November 26, 2019 [1 favorite]


Really? I kinda doubt most of them had any idea what COPPA is on the basis that most Americans don't know what it is.

For the people paying attention, those who actually looked at what information was available to them from YouTube's ad platform, the only data about audience demographics they were given was for people 13 and older (i.e., not subject to COPPA). Additionally, YouTube officially claimed people had to be 13+ to have an account. So, for the individual content creators, there was literally no information at their disposal that would tell them that actually, no, YouTube was tracking kids under 13 years old in violation of COPPA, supposedly at their behest but without telling them or giving them any way to say "no seriously, don't do that".
posted by tocts at 5:54 AM on November 26, 2019 [3 favorites]


YouTube's bad behavior is going to hurt people, and the company will have to be forced to reckon with it.

How? Being part of the Google kraken means that you can't force their hand financially, they've done everything they can to cut off legal remedies, and as we see over and over, their response to regulation is to whip up FUD so that the very people that they're hurting defend them.

This is the whole problem with Silicon Valley as a whole - they have turned "socialize harm, privatize profit" into an ethos.
posted by NoxAeternum at 7:16 AM on November 26, 2019 [3 favorites]


So, for the individual content creators, there was literally no information at their disposal that would tell them that actually, no, YouTube was tracking kids under 13 years old in violation of COPPA, supposedly at their behest but without telling them or giving them any way to say "no seriously, don't do that".

And now that Google got their hand caught in the data cookie jar, their response is to blame creators, saying "why didn't you stop me!?"
posted by NoxAeternum at 7:18 AM on November 26, 2019 [2 favorites]


the company will have to be forced to reckon with it.

How?


As in, the only way they'll reckon with it is under duress.
posted by aspersioncast at 1:53 PM on November 26, 2019


people who are making supplementary income this way can hopefully afford to find different platforms.

A lot of people turn to YouTube specifically because no other platforms exist. Even regular jobs are hard to come by, especially if you're disabled or a minority.
posted by divabat at 2:40 PM on November 26, 2019


This is also why Alphabet needs to be broken up: because it has funding from Alphabet, YouTube has been able to beat out its opponents easily, becoming pretty much the only game in town. Without that flow of cash, not only would YouTube be more dependent on their creators, but space would open up for competition.
posted by NoxAeternum at 3:57 PM on November 26, 2019 [2 favorites]




This thread has been archived and is closed to new comments