"..if you have small children, keep them the hell away from YouTube."
July 13, 2018 11:53 AM

Thank you for sharing this. My daughter is 1.5, and so far only watching a handful of things on Netflix, and generally while supervised. Things like this video, and some great discussions on Ask Metafilter, are a potent reminder to stay involved and knowledgeable about what she's consuming.
posted by Monster_Zero at 12:17 PM on July 13, 2018 [2 favorites]

I got rid of my TV when I was pregnant and when my son becomes aware enough for this to be an issue, I plan on literally blocking Youtube on my wifi router. I will then spend the next 18 years shrugging about how strange it is that Youtube doesn't seem to work on our devices.
posted by If only I had a penguin... at 12:36 PM on July 13, 2018 [13 favorites]

I plan on literally blocking Youtube on my wifi router.

This is unfortunate, because there is a lot of excellent content on Youtube (educational, music and art instruction, science, etc), but, if I didn't have a filtering solution that I could rely on, I could see blocking at the router.
posted by thelonius at 12:41 PM on July 13, 2018 [17 favorites]

I've done exactly that a couple times. Youtube is surprisingly hard to block effectively at the router level, because the app directly connects to several different CDN domains that you never see in the web interface. It's not impossible, but it's a lot trickier than just blocking youtube.com

At this point, my policy is to raise my kids to be questioning and discerning enough about these videos and the dopamine cycle he references early on, because it's not going away any time soon. I'd rather my kids have experience with and know how to recognize these manipulative systems. It's like the computer equivalent to sex education. I want my kids to understand the benefits and risks involved and go in prepared rather than pretend abstinence is the right answer.
posted by Mr.Encyclopedia at 12:43 PM on July 13, 2018 [22 favorites]
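For anyone who does try the router-level block Mr.Encyclopedia describes, here's one hedged sketch of the approach, assuming the router (or a box on the network) runs dnsmasq. The domain list is illustrative and incomplete — the apps also stream from CDN hosts like *.googlevideo.com, and the full set changes over time:

```python
# Generate dnsmasq config lines that sinkhole YouTube-related domains.
# dnsmasq's "address=/example.com/0.0.0.0" form covers the domain and
# every subdomain, which matters because the apps rarely touch
# youtube.com itself.
YOUTUBE_DOMAINS = [
    "youtube.com",
    "youtu.be",
    "ytimg.com",            # thumbnails
    "googlevideo.com",      # the video CDN the apps stream from
    "youtube-nocookie.com",
]

def dnsmasq_blocklist(domains):
    """Return one sinkhole line per domain, ready to drop into
    a dnsmasq config file on the router."""
    return ["address=/%s/0.0.0.0" % d for d in domains]

if __name__ == "__main__":
    print("\n".join(dnsmasq_blocklist(YOUTUBE_DOMAINS)))
```

Apps that hard-code their own DNS servers will sail right past this, which is part of why blocking at the router is trickier than it sounds.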

posted by We had a deal, Kyle at 12:53 PM on July 13, 2018 [19 favorites]

If you want to go down a rabbit hole of strangeness, just google Spiderman+Elsa. Deeply strange.
posted by theorique at 12:54 PM on July 13, 2018 [2 favorites]

It's not impossible, but it's a lot trickier than just blocking youtube.com

Does that mean blocking youtube.com won't keep a person off the site? Because in the case of filtering children's content, blocking the domain should be enough; you're probably not concerned about whether they're going to find YouTube vids on Tumblr (which you are presumably also blocking, omg, so much porn, so much gore, so many memes that you don't want to explain to a child).

Blocking youtube.com should keep them from browsing YouTube; being able to access YouTube-hosted TED Talks elsewhere isn't the problem.
posted by ErisLordFreedom at 12:56 PM on July 13, 2018 [1 favorite]

For me, the key point of that was in the 2nd half, when he started talking about how we are now throwing "big data" at things like predictive policing, with no understanding of how years of prejudice might be encoded into that data and thus impacting the outputs.
posted by COD at 12:59 PM on July 13, 2018 [8 favorites]

Isn't the solution to be aware of the media your kids are consuming and set limits on what they can watch? We don't have cable, but we do have Internet, and YouTube is how our kids consume visual entertainment. When they were younger I kept tabs on what they were watching at all times, and there were also time limits when viewing YouTube. We never had a problem.

I think what's annoying and increasingly alarming to me is how YouTube cheerfully suggests far-right viral content in my own feed. It's really weird, almost as though all of the big social media companies have some sort of white nationalist DNA that must express itself in their algorithms.

In this media environment, however, as a parent I'm not sure if it's enough to "cut the cord" entirely on YouTube and that sort of thing. It's unrealistic. The solution is communication and connection with one's own kids as a parent. That's all any parent can ever do. Plus engage with the issues and commend platforms like YouTube when they inadvertently do something right.
posted by JamesBay at 1:02 PM on July 13, 2018 [3 favorites]

My daughter is mostly interested in those toy opening/playing videos and the autoplay has never taken her to anything I'd consider sketchy, that I've seen. The toy stuff is a bit annoying (capitalism, rargh), but really no worse than the Saturday morning cartoons + commercials I used to watch when I was her age. And she's definitely not watching it for "hours and hours" though I'm sure some kids are. And that's really sad, but not really because of what they're watching.

And once again, this impossibility of figuring out who's making this stuff -- like, this is a bot? Is this a person? Is this a troll? What does it mean that we can't tell the difference between these things anymore?

Haven't people, you know, gone and done the work on this? I feel like the answer is maybe in one of our previous FPPs.
posted by ODiV at 1:02 PM on July 13, 2018 [3 favorites]

Machine-learning algorithms are difficult to understand when one is a grown-ass adult. Children who are young enough to be hypnotized by surprise egg or toy reveal videos are definitely not capable of understanding it. And media literacy is a fine thing; I agree with people who want their children to experience the internet and be taught how to navigate it carefully; but preschoolers cannot be media literate.
posted by Hypatia at 1:11 PM on July 13, 2018 [9 favorites]

It's really weird, almost as though all of the big social media companies have some sort of white nationalist DNA that must express itself in their algorithms.
It's not weird: this is exactly what is going on. Techbro culture is white supremacy culture. The book Algorithms of Oppression: How Search Engines Reinforce Racism is a great and accessible text on exactly this.
posted by sockermom at 1:11 PM on July 13, 2018 [26 favorites]

Huh....thanks for the tip on difficulty of blocking YouTube. I see the point about teaching kids to be discerning, but I see this more like "not buying him that toy." Youtube is a toy I don't intend to buy my baby, just like I don't intend to buy him toy guns or a trampoline or have him play football.

I don't feel like it makes sense to start from the default of "he has youtube" but instead think "he doesn't have youtube (cause he's too little), would adding youtube to his life make it better or worse?" I vote worse. There may be educational programming on there, I suppose, but there's nothing he can learn on Youtube that he can only learn on youtube and probably not much that he can learn on youtube better than he can learn other ways. At least in his younger years. Maybe there will be some hobby he takes up when he's older where he can access tutorials or something and youtube will somehow start working again before he turns 18.
posted by If only I had a penguin... at 1:19 PM on July 13, 2018 [3 favorites]

For those of you who prefer writing to video, Bridle published Something is wrong on the internet back in Nov 2017 on the same theme. Also some previous Metafilter discussion centered on Brian Koerber's writeup of these horror videos.
posted by Nelson at 1:28 PM on July 13, 2018 [9 favorites]

And media literacy is a fine thing; I agree with people who want their children to experience the internet and be taught how to navigate it carefully; but preschoolers cannot be media literate.

The last few years make a pretty good case for almost no one being media literate, honestly.
posted by ODiV at 1:28 PM on July 13, 2018 [22 favorites]

When my 9yo was younger I generally selected his YouTube entertainment for him and did not give him access to a tablet.
posted by JamesBay at 1:32 PM on July 13, 2018 [2 favorites]

This doesn't seem to be a problem for us with Tiny Croft. She shows no interest in YouTube or most other children's content. Basically, as far as she's concerned, if it isn't Bubble Guppies, it isn't shit.

I can recite most Bubble Guppies episodes by heart at this point, and can expound at considerable length on the logical inconsistencies in their presentation of life under water.

They have a Fire Department...
posted by Naberius at 1:37 PM on July 13, 2018 [28 favorites]

FWIW, I know a family whose homeschooled teenaged daughter is allowed to arrange her life online, but not allowed to live it online, so her access to the Internet has been restricted on a coarse-grained basis all her life. Her parents manage to do it because their own lives are more luddite than what they demand of her, so no hypocrisy there. And since they're not really poking their noses when she has hers glued to a device, and not censoring by content, there's not much resentment (that I've seen). She seems to be doing fine.
posted by ocschwar at 1:44 PM on July 13, 2018

Dan Olson did a bit of a deep dive on the Elsa/Superman end of things which gets into Google’s probable culpability and the degree to which some of these channels weaponize DMCAs.

The use of algorithms and perception of them as “objective” is what scares me the most. Like misuses of science, there is an undercurrent of devaluing other people, their individuality, and their relationships which seems to be growing worse because of an emphasis on the value of objectivity and the dismissal of everything else. It seems to be at the core of “new atheism” and “gamer” culture, and a whole host of other white male dominated cultural movements which trend toward fascism. They aren’t objective, of course, but they believe they are and it serves as a disturbingly effective smokescreen even in theoretically Left movements.
posted by Deoridhe at 1:52 PM on July 13, 2018 [16 favorites]

Having just watched the Mr. Rogers documentary, I really, really wish he were still around. His entire life's work was about making sure television was good and helpful for the children. He was enraged, absolutely livid, about the state of television, and how he felt it was exploiting and misleading young children.

He would have something useful to say about these sorts of YouTube videos, which are far worse than bad children's cartoons ever were. And people at YouTube would actually do something, I bet, if Mr. Rogers told them they were doing a bad job and treating children badly.

Right now, all they care about is maximum time on device, and everything about their algorithms and design works to achieve that goal with no thought or concern for other consequences. Some individual employees are starting to be aware of problems but it doesn't seem to matter all that much.

I don't know, prescriptively, what the solution should be. But morally speaking, if they can put tons of optimization into informing people about and then selling YouTube Red, they can do the same to inform parents about parental control options and nudge them into enabling them. If they can identify complex demographic categories to sell ads, then they should be able to tell when a user may be a small child, and start showing them safer, non-exploitative videos.
posted by vogon_poet at 2:10 PM on July 13, 2018 [11 favorites]

you're going to have underpaid, precarious contract workers without proper mental health support being damaged by it as well.

my first technology job was searching user generated content that violated company policies and federal laws for www.excite.com/communities

I saw a lot of very disturbing and traumatic stuff, a lot of it that required contacting the FBI and collecting IP's and getting people sent to prison for a very long time.

I've looked into the eyes of people suffering online being forced to perform acts that made me want to lay down and die.

It fucked me up and broke something inside me. I don't know that I'm over it 20 years later.
posted by nikaspark at 2:15 PM on July 13, 2018 [74 favorites]

The whole genre of toy-unboxing-videos-for-kids is super creepy imo. Gives me the willies and makes me despair for the future of humanity at the same time.
posted by Anticipation Of A New Lover's Arrival, The at 2:50 PM on July 13, 2018 [12 favorites]

I cut out at 9:00, even at 1.5x speed. Maybe that made it worse because it made the patterns of the talk stand out when pitched at their proper levels of mania...don't know who is behind it...don't know how we got here...so few links...young children should not watch youtube (applause).

I dragged myself back for another 3.5 compressed minutes but this...I dunno. I guess TED brings FUD to the normies? It's like having People magazine explain it. Like, he's right and bright but an artist and media consumption is a complicated nut, youtube being its least part.
posted by Ogre Lawless at 3:02 PM on July 13, 2018 [3 favorites]

The takeaway from this is a bit subtle and very important.. It's not just youtube.. it's everything.

Youtube is warping the minds of small children.

If you wanted to build a machine to identify and capture the most vulnerable teenagers, and then drive them toward suicide, you couldn't do any better than Tumblr.

Facebook and Twitter were largely responsible for hijacking an election and dramatically polarizing the country.

It's everything. Who knows what Metafilter is doing? Making us all sign up for the DSA? [in-joke for the megathread consumers]

How do we even fix that? Is it something that even should be fixed, or just gotten rid of entirely?
posted by Xyanthilous P. Harrierstick at 3:29 PM on July 13, 2018 [18 favorites]

I'm about a quarter into Bridle's book, which is good, but not as good as the recent interview he did with Novara Media which got straight to the computational management ideas (important) and then talked about the Youtube thing (obviously a problem, but not as important in the long run) http://novaramedia.com/2018/06/29/a-new-dark-age/
posted by The River Ivel at 3:46 PM on July 13, 2018 [1 favorite]

In all honesty, one of the main reasons why Metafilter is the only social-media-ish thing that I really hang out on is that it doesn't do all of the applied-sociology behavior-modification stuff. I know enough to know when sites and apps are trying to control my behavior and get me hooked on them, and I run the other way when that happens. It's turning me into a bit of a Luddite, but I think it's worth it to preserve my sanity.

I do do Instagram, but on my own terms. I scroll through my feed for a few seconds when I'm a bit bored, occasionally leave positive comments on other people's photos, thank people for making positive comments on my own photos, and about once a day I post a thing. That's about it. I can tell though that its owners at Facebook are trying to make it more and more "engaging" (at the cost of coarsening its core experience) and I already have my eyes on the exits.
posted by Anticipation Of A New Lover's Arrival, The at 4:00 PM on July 13, 2018 [9 favorites]

If you zoom out far enough, all forms of social interaction will modify the behavior of the people involved. It's peer pressure, it's being exposed to new ideas, it's enforced norms. Metafilter absolutely does that, if you don't believe me read the Site Activity Metatalk thread. That's not unusual, every community online or off does this, it's not inherently a bad thing.

Now, what modern social media platforms have done is use this basic aspect of human interaction as an engine to generate profits. At the end of the day all Twitter, Facebook, Tumblr, Youtube, etc care about is views. Views lead to ads which lead to profits. People use these services, they generate views, and the owners get rich. Twitter doesn't care if it has a white supremacy problem, because white supremacists generate more views. Facebook doesn't care if they have a fake news problem, because fake news generates more views. Tumblr doesn't care if they have a suicide problem, Youtube doesn't care if it has a content problem, all because making an effort to fix these problems is going to cost views and cost money. None of these companies want to spend the money it would take to seriously police their content because they've done the math and shitty content makes more money than quality content.

The only way out is to understand what it is you're doing when you engage with these networks, understand what they want to do to you, and protect yourself as best you can. Facebook and Twitter are feeding you a facile list of every stupid thing your "friends" are doing and saying. Youtube is feeding you a list of videos it thinks will keep you watching. Don't let them decide what's important to you.
posted by Mr.Encyclopedia at 5:35 PM on July 13, 2018 [6 favorites]

I probably say this every time this comes up but I'm far less concerned about kids running across disturbing cartoons on YouTube than I am about them running across flesh-and-blood far-right ideologues and conspiracy theorists.
posted by atoxyl at 6:08 PM on July 13, 2018 [4 favorites]

If the parents just selected and downloaded a bunch of non-creepy videos from Youtube with a stream downloader and ran them in a normal video player on random shuffle, would that work for the kids and the parents? Do we just need an easier way of doing that?

Even before algorithmic curation there's the whole premise of the modern internet where instead of just viewing documents or images or videos created by someone else, the standard thing is that some company's web server runs some software on your computer (in the form of a web page that's frequently 100% javascript) which maybe does some things you want or maybe doesn't, but definitely sends out a whole bunch of monetizable surveillance data about you and tries to force you to view ads. All of the meager benefits video-in-a-browser web sites provide to the user, like scaling pre-rendered video to the screen size or buffering the video stream in response to Comcast's crappy flaky network service, are rendered worthless if you're pre-selecting what to look at and watching it repeatedly.
posted by XMLicious at 6:36 PM on July 13, 2018 [6 favorites]
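XMLicious's download-and-shuffle idea can be sketched in a few lines of Python. Everything here is an assumption about setup: the videos have already been fetched into a folder (with a stream downloader of your choice), and `mpv` stands in for whatever ordinary video player is installed:

```python
import os
import random
import subprocess

def shuffled_playlist(folder, exts=(".mp4", ".webm", ".mkv")):
    """Collect already-downloaded video files from `folder` and
    return them in random order -- the 'random shuffle' part."""
    files = [
        os.path.join(folder, name)
        for name in sorted(os.listdir(folder))
        if name.lower().endswith(exts)
    ]
    random.shuffle(files)
    return files

def play_shuffled(folder, player="mpv"):
    """Hand the shuffled list to a normal media player; any player
    that accepts a list of files on its command line works."""
    playlist = shuffled_playlist(folder)
    if playlist:
        subprocess.run([player] + playlist)
```

No web page, no recommendations, no surveillance beacons: the player only ever sees files a parent chose.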

> If the parents just selected and downloaded a bunch of non-creepy videos from Youtube with a stream downloader and ran them in a normal video player on random shuffle, would that work for the kids and the parents? Do we just need an easier way of doing that?

Thinking about everything involved there makes *me* stressed out, and I don't even have kids.
posted by smelendez at 9:06 PM on July 13, 2018 [2 favorites]

We're super stingy with TV time but my kid likes those slow videos of cars being assembled. He gets 20 min a week max.
posted by St. Peepsburg at 9:28 PM on July 13, 2018 [1 favorite]

I probably say this every time this comes up but I'm far less concerned about kids running across disturbing cartoons on YouTube than I am about them running across flesh-and-blood far-right ideologues and conspiracy theorists.

You're in luck!
posted by Jpfed at 9:43 PM on July 13, 2018 [1 favorite]

It's everything. Who knows what Metafilter is doing?

It’s an echo chamber for pessimism and political gloom. If you spend enough time here, you start thinking the world is fucked up and everyone is undergoing therapy or taking anti-depressants. Which is. just. not. true.
posted by Kwadeng at 10:24 PM on July 13, 2018 [17 favorites]

Thank you for this video. It's a much more immediate distillation of Bridle's Medium post (also linked upthread) and Bridle's work should be shared far and wide.

What I find really spooky about all of this is that machines (AI) are permanently altering the psychology of at least two generations of human beings with these videos. Thinking about Facebook, Google, Twitter, YouTube, etc. I am reminded of Tim O'Reilly's quote of Tristan Harris's chilling summary: "We have pointed supercomputers at our brains".

Information technology has brought us to a game of Russian roulette and AI is the gun. At the end of the barrel is human social psychology.

Ante up.
posted by mistersquid at 12:01 AM on July 14, 2018 [3 favorites]

IDK, I'm not sure I agree. I thought the TED talk sounded kind of over-reactive and hysterical - especially when he used phrases like 'hacking young children's brains', as if that's not a clickbait phrase. Is he angry that people other than big corporations get to make media now? Or does he think that everyone on YouTube becomes an evil tool, bent on corrupting children, the second they upload a video?

For the record, I personally like unboxing videos. I find them relaxing in an ASMR kind of way.
posted by HypotheticalWoman at 2:10 AM on July 14, 2018

I think his initial thrust is that the most bizarre videos aren't really the result of someone deciding to express something to another human, they're the result of the uploader pursuing rewards, in the form of ad revenue, in response to signals sent by the system. That's one way you get the combination of design elements from media targeted towards children—bright colors, characters and keywords from corporate franchises—mixed with adult themes like sex and violence and quasi-sexual quasi-violent things that seem nonsensical.

It's a sort of cognitive hypertrophism developed through feedback loops, like the absurdly-proportioned sword-billed hummingbird and the flower it evolved to drink nectar from. The "up next" part of the algorithm gives the kids more of whatever they respond to, but at higher intensity, and the number of views and accompanying revenue prompts the creators to dial up the stimulation in subsequent videos they make, which the algorithm and selection behavior funnel more views to.

Basically he's arguing that we're probably going to get/have already gotten a similar effect to fashions for women's high-heeled shoes getting taller and taller in the 20th century until in some cases you get permanent deformity of the feet, vascular problems in the lower legs, and spinal injuries. That, but for children's brains instead of women's feet, with much faster and shorter cycles of feedback, and driven by recommendation algorithms and automated "monetization of content" heuristics rather than even just the lackadaisical degree of conscious thought put in by Mad Men type marketing and fashion executives.
posted by XMLicious at 4:32 AM on July 14, 2018 [8 favorites]

Just quietly, most public broadcasters (The ABC here in Australia, and the BBC in the UK) do amazing, well moderated, free streaming content for kids. My five year old son only ever watches Youtube with an adult present and engaged in what he's watching, due to his fondness for heavy equipment videos, and even then with an adult hand we come to weird as fuck shit. After a Malaysian video of a live action Joker running over toy cars snuck past us and made him sob horribly over the destruction of some perfectly good toys we dumped Youtube from his tablet altogether and have throttled data to everything other than ABC for Kids. Even then we turn off the wifi access to the tablet every now and then if he's getting too hooked on just sitting watching videos. The tablet was a gift from his grandmother and has a bunch of educational games that he loves so it's generally of net benefit to him.
posted by Jilder at 5:32 AM on July 14, 2018 [7 favorites]

My two and a half year old watches Youtube Kids. I really hate the interface and the lack of parental options. Though I finally set the 4 digit parental code to 0000 so I can block channels quickly. And I do block a lot of channels. Maybe 5 a day or more. I'm merciless. As far as I know there's no way to reverse a block once I set it. But 0000 because have you ever tried to block some pervy Elsa video from your 2 year old while they throw a tantrum?

My pet peeve is videos of kids' educational games, you know, where they record themselves playing some software like a tablet coloring book and then their entire channel is 500 videos of just that. Watch one and Youtube Kids recommends them all.

In fact the whole algorithm for Youtube Kids is terrible like this. Watch one video of Masha and the Bear in Russian (perhaps not from the official channel, but someone posting their content) and suddenly you start getting Cyrillic alphabet videos a day later.

What Youtube Kids needs is a whitelist. Subjects that I, the parent, can set and have the videos skewed towards. I'd also love the ability to share blocklists and adopt other parent's blocklists.

I think about this subject a lot, my kid is over on the couch watching Peppa Pig while I type this. I'd like to share a few of the channels I like, if I may. They are the ones I consistently haven't blocked. The only toy channel I like is Tiny Treasures, as she never does unboxing and it's never about what toys she bought, and also because she's got a very Mr. Rogers tone to her videos. We had been doing Teletubbies when my kid was younger, but now we've gotten into In The Night Garden as our preferred surreal English children's program. And my kid really likes Blippi, (though he sounds like a hyperactive Kinko the Clown to me.)
posted by Catblack at 7:25 AM on July 14, 2018 [1 favorite]
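Catblack's wish to share and adopt other parents' blocklists would be easy to support: a blocklist is just a set of channel names, and adopting someone else's is a set union. A minimal sketch (the channel names are made up for illustration):

```python
def merge_blocklists(*lists):
    """Union any number of blocklists -- one channel name per entry --
    into a single de-duplicated, sorted list that's easy to share
    as a plain text file."""
    merged = set()
    for blocklist in lists:
        merged.update(name.strip() for name in blocklist if name.strip())
    return sorted(merged)

# Hypothetical example: my blocks merged with another parent's.
mine = ["Pervy Elsa Videos", "Surprise Eggs 4 Ever"]
theirs = ["Surprise Eggs 4 Ever", "Endless Unboxing TV"]
combined = merge_blocklists(mine, theirs)
# -> ['Endless Unboxing TV', 'Pervy Elsa Videos', 'Surprise Eggs 4 Ever']
```

Since the merge is a pure union, adopting a stranger's list never removes your own blocks; it can only add to them.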

Catblack, the lack of a whitelist on any of the major services makes me want all of them to go diaf, because it actually wouldn’t touch their precious numbers at all. My kid will happily watch the same thing 20 times in a row, there is zero need to barf up something random from the barrel and hawk it at him just to keep his eyeballs on the screen.

It really goes beyond a whitelist, though. I’ve been kind of floored (but probably shouldn’t be) by how these services just exude a sense of indifference, pretty much through and through. It’s like they want to make it very clear that they Do. Not. Care. One. Bit about either kids or their parents.
posted by bjrubble at 2:20 PM on July 14, 2018 [4 favorites]

Downloading the videos and just having a folder full of video files that you play in a normal media player application or app is effectively a method of whitelisting... and although the files take up space which can be scarce on phones and tablets, there's the advantage that no internet connection is necessary to watch once they've been downloaded. Wikipedia has a "comparison of YouTube downloaders" article, though of course it looks as though spam is constantly added and deleted from that article so make very sure you don't install malware.
posted by XMLicious at 4:39 PM on July 14, 2018 [1 favorite]

What was the "teacher tube" site posted a while back? It had endless educational content, including things like crash course, etc, but with zero white nationalist content. Sounds like a site to promote, let me just look through my favorites...
posted by eustatic at 9:11 AM on July 15, 2018

Here it is. It is actually called "Teachertube."
posted by eustatic at 9:14 AM on July 15, 2018

This guy's larger point isn't terribly dramatic nor specific. Framing it from the angle of kids youtube videos perks up everyone's concern, but as I said last time someone posted about the subject, if you can't be bothered to moderate your kids' media consumption, then that's on you, not youtube. I mean, woe is me, I can't plop my kid down in front of youtube and let him/her click on any link that comes up for the next 8 hours without seeing some weird shit? A lot of the concern that surprises me isn't so much harmful content, but stuff that's simply inexplicably weird. And really, I can think of far more serious concerns than my kids being exposed to weird things.
posted by 2N2222 at 5:36 AM on July 16, 2018

I think the "it's up to the parents" argument makes sense around a lot of media, but not YouTube.

It really is thoughtlessly designed, from autoplay to the recommended suggestions. I mean as people have complained about there's not even a way to whitelist. This seems like a basic feature for a concerned parent.

Even for the best-intentioned parent with unlimited time and patience, it would take inordinate effort to be sure they could avoid their kid seeing a disturbing video. The solutions people have proposed above are basically "don't use YouTube". Their design, their recommendation algorithms, and the incentives they've set for video uploaders completely undermine anything parents can reasonably do.
posted by vogon_poet at 11:19 AM on July 16, 2018

2N2222, that's exactly the kind of mischaracterization that I'm sure is rife among the teams of 20-somethings building these apps, which is the only believable explanation I have for how bad they are.

In reality, what happens is:

* The kid wants to watch a 2 minute video
* At 1:50 in the video, a "suggested" next video pops up on the screen
* 10 seconds later, that video automatically starts playing

So at 1:50, you need to have hands on the screen. If the popup ad for the next video works (which everyone on the other side of the pipe is sincerely trying to make happen) you might need to fight off a toddler who now wants to see it. You then have to wait out the conclusion of the current video, because even though the kid has been distracted from it by your intervention and the subsequent battle, he will still be heartbroken and furious if you turn it off early.

Whew, you made it to the end of the video! Okay, what next? Obviously you don't just show them the suggested video -- you don't know what's in it, and clearly neither does YouTube. Maybe you'll find something in the list of suggestions; they're also terrible but at least there's more options. Usually, though, you go back through the front door and start looking from scratch.

Okay, all set! Enjoy your 1:50 of free time before it all starts again, you lazy slacker.
posted by bjrubble at 11:26 AM on July 16, 2018 [6 favorites]

Related as to how and why this is happening: Folding Ideas: Weird Kids' Videos and Gaming the Algorithm.
posted by koucha at 1:29 PM on July 18, 2018

I think YouTube may have changed its Chromecast app to autoplay the next algorithm-selected video rather than return to a home screen? Although perhaps I've changed the way I use Youtube/my Chromecast in some subtle way.

In any case, switching between HDMI inputs on my TV a few days ago I discovered YouTube still apparently running since the preceding day when I'd last used it and have confirmed experimentally that it no longer stops at the end of a single video. So I guess I'll always be downloading before playing something now...
posted by XMLicious at 4:57 PM on August 4, 2018


This thread has been archived and is closed to new comments