Democracy suffers if our news environment incentivizes bullshit
November 14, 2016 8:53 PM   Subscribe

Hyperpartisan Facebook Pages Are Publishing False And Misleading Information At An Alarming Rate. The rapid growth of these pages combines with BuzzFeed News’ findings to suggest a troubling conclusion: The best way to attract and grow an audience for political content on the world’s biggest social network is to eschew factual reporting and instead play to partisan biases using false or misleading information that simply tells people what they want to hear. This approach has precursors in partisan print and television media, but has gained a new scale of distribution on Facebook...

...The reality is that people who frequent these hyperpartisan pages on the right and on the left exist in completely different segments of the online world, rarely interacting with or seeing what the other side is seeing. The more they rely on these pages for information, the more polarized they will likely become — and the more their worldviews will be based on information that is misleading or completely false.

Inside Facebook’s (Totally Insane, Unintentionally Gigantic, Hyperpartisan) Political-Media Machine
Such news exists primarily within users’ feeds, its authorship obscured, its provenance unclear, its veracity questionable. It exists so far outside the normal channels of news production and distribution that its claims will go unchallenged.

The “They Had Their Minds Made Up Anyway” Excuse
If Facebook was a tool for confirmation bias, that would kind of suck. It would. But that is not the claim. The claim is that Facebook is quite literally training us to be conspiracy theorists. And given the history of what happens when conspiracy theory and white supremacy mix, that should scare the hell out of you.

The forces that drove this election’s media failure are likely to get worse
In a column just before the election, The New York Times’ Jim Rutenberg argued that “the cure for fake journalism is an overwhelming dose of good journalism.” I wish that were true, but I think the evidence shows that it’s not. There was an enormous amount of good journalism done on Trump and this entire election cycle, from both old-line giants like the Times and The Washington Post and digital natives like BuzzFeed and The Daily Beast. (There were plenty of good broadcast reporters on the beat as well, though what appeared on air left a lot to be desired.) For anyone who wanted to take it in, the pickings were rich.

The problem is that not enough people sought it out. And of those who did, not enough of them trusted it to inform their political decisions. And even for many of those, the good journalism was crowded out by the fragmentary glimpses of nonsense.


Renegade Facebook Employees Form Task Force To Battle Fake News
“Facebook, by design, by algorithm, and by policy, has created a platform that amplifies misinformation,” said Zeynep Tufekci, an associate professor at the University of North Carolina at Chapel Hill, who has been vocal about the need for social media companies to consider their role in spreading fake news.

Google to Bar Fake-News Websites From Using Its Ad-Selling Software
Google said Monday that it is updating its policies to ban Google ads being placed “on pages that misrepresent, misstate, or conceal information about the publisher, the publisher’s content, or the primary purpose” of the website. The policy would include sites that distribute false news, a Google spokeswoman said.

Meanwhile, Mark Zuckerberg Continues to Defend Facebook Against Criticism It May Have Swayed Election

Google Docs: False, Misleading, Clickbait-y, and Satirical “News” Sources
Tips for analyzing news sources:
- Avoid websites whose names end in “lo,” e.g., Newslo. These sites specialize in taking a piece of accurate information and then packaging that information with other false or misleading “facts.”

- Watch out for websites that end in “.com.co” as they are often fake versions of real news sources.

- Watch out if known/reputable news sites are not also reporting on the story. Sometimes lack of coverage is the result of corporate media bias and other factors, but there should typically be more than one source reporting on a topic or event.

- Odd domain names generally equal odd and rarely truthful news.

- Lack of author attribution may, but does not always, signify that the news story is suspect and requires verification.

- Check the “About Us” tab on websites or look up the website on Snopes or Wikipedia for more information about the source.

- If the story makes you REALLY ANGRY it’s probably a good idea to keep reading about the topic via other sources to make sure the story you read wasn’t purposefully trying to make you angry (with potentially misleading or false information) in order to generate shares and ad revenue.

- It’s always best to read multiple sources of information to get a variety of viewpoints and media frames. Some sources not specifically included in this list (although their practices at times may qualify them for addition), such as The Daily Kos, The Huffington Post, and Fox News, vacillate between providing legitimate, problematic, and/or hyperbolic news coverage, requiring readers and viewers to verify and contextualize information with other sources.
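As a toy illustration of the domain tips above, a suffix check like the following catches the “lo” and “.com.co” patterns (the suffix list here is an assumption for illustration, not an exhaustive or authoritative list; real vetting needs a curated source list and human judgment):

```python
# Illustrative only: suffixes drawn from the tips above, not a complete list.
SUSPECT_SUFFIXES = ("lo.com", ".com.co")

def looks_suspect(domain):
    """Flag domains matching the 'ends in lo' / '.com.co' patterns."""
    return domain.endswith(SUSPECT_SUFFIXES)

print(looks_suspect("abcnews.com.co"))  # True  (fake version of a real outlet)
print(looks_suspect("newslo.com"))      # True  ("lo" pattern)
print(looks_suspect("nytimes.com"))     # False
```

A check like this can only ever be a first-pass filter, which is why the tips go on to recommend cross-checking coverage and authorship.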


Conspiracy theories: How to be a smarter news consumer.

This isn't an election thread. Current election thread is here.
posted by triggerfinger (117 comments total) 106 users marked this as a favorite
 


This is a perfectly timed post. I've just started blocking all of the hyper partisan (on either side) 'news' sites from my Facebook, and replacing them with reputable journalists. I never cared before, but the bullshit that people hear and believe on Facebook, including myself sometimes, is too much after this election cycle. If I'm the first person in my friends that starts reading the real news again, at least I can hopefully convince someone else to pay attention, too.
posted by motioncityshakespeare at 9:02 PM on November 14, 2016 [8 favorites]


Over the weekend I ended up on the Fox News site reading a story about the NYT letter re: reporting accuracy, truth, etc. After making the mistake of scrolling down to the comments I noticed a significant number of people bemoaning that the "liberal rag" in question was in fact STILL lying, because they were reporting Clinton as leading in the popular vote while that was no longer the case. Trump was now leading by 500k-600k votes, they insisted.

In my skeptical confusion I scrolled just to the top of the page to the live election counter on the Fox News website itself, which was accurately reporting the roughly 700k and growing lead Clinton still indeed had.

Earlier today I told my friend this story, and even telling it gave me extreme anxiety about HOW to get through to people who are so willfully ignorant that they've completely abandoned all notions of critical thinking, research, analysis, etc. How do you deal with someone who won't accept, or won't SEE, the readily available information (from a source they trust) that clarifies their own misinformation?

Many on the left have made the point that you have to begin crafting arguments against Trump and figuring out ways to get the correct information to the categorically misinformed, but goddamn does it seem like a Herculean task at this point.
posted by dreamlanding at 9:16 PM on November 14, 2016 [50 favorites]


This tidbit from the NYT article is the most 2016 thing possible:
Then, of course, there’s the content, which, at a few dozen posts a day, Nicoloff is far too busy to produce himself. “I have two people in the Philippines who post for me,” Nicoloff said, “a husband-and-wife combo.” From 9 a.m. Eastern time to midnight, the contractors scour the internet for viral political stories, many explicitly pro-Trump.
posted by theodolite at 9:18 PM on November 14, 2016 [6 favorites]


I've been thinking about this a lot. I hope there are smart well-resourced people who'll start really taking this seriously; I wish I knew what the solution is.
posted by LobsterMitten at 9:18 PM on November 14, 2016 [7 favorites]


Hmm, maybe a crowd sourced Snopes.com?

Wiki-fact?

There's definitely a need.
posted by notyou at 9:26 PM on November 14, 2016 [4 favorites]


Forgot this one: The ‘Filter Bubble’ Explains Why Trump Won and You Didn’t See It Coming

In a 2015 study run by Facebook data scientists and published in Nature, researchers set out to test the filter-bubble hypothesis by looking at ten million de-identified Facebook users who self-reported their ideological affiliation over a six-month period. They found that users only clicked on 7 percent of “hard” content (politics, national news) in their feeds, as opposed to “soft” content like entertainment, sports, or travel. The researchers found that conservatives see about 5 percent less ideologically diverse content than their more moderate friends, with liberals at 8 percent. The Facebook algorithm, they concluded, makes it 1 percent less likely that people are exposed to cross-cutting content. More than anything, it’s the friends you have: “We show that the composition of our social networks is the most important factor limiting the mix of content encountered in social media.” While “news feed” is clever, sticky branding, it’s more “’my friend’s opinions’ feed.” Notably, the study with the largest data set on Facebook virality, out earlier this year, found that feelings of dominance predicted sharing, while arousal — getting angry or upset — predicted commenting.
posted by triggerfinger at 9:28 PM on November 14, 2016 [3 favorites]


goddamn does it seem like a Herculean task at this point.

Some of that is even just from the lack of basic civics knowledge. You have a conversation with someone, and they say something ...odd. So you probe a little bit and you find out they have no idea how Congress works, or something basic about the Constitution, or what have you. So you end up going back to step one and they argue with you about really basic factual matters and it's exhausting.
posted by Blue Jello Elf at 9:32 PM on November 14, 2016 [46 favorites]


In a column just before the election, The New York Times’ Jim Rutenberg argued that “the cure for fake journalism is an overwhelming dose of good journalism.”

Shortly after the election:

To Our Readers, From the Publisher and Executive Editor

As we reflect on the momentous result, and the months of reporting and polling that preceded it, we aim to rededicate ourselves to the fundamental mission of Times journalism.

Followed by all the things they wish they'd done, and an exhortation to pay them for good journalism. Phooey. That's the same paper that unflinchingly published bullshit that marched us to war in 2003.
posted by adept256 at 9:44 PM on November 14, 2016 [41 favorites]


I've been thinking about this a lot. I hope there are smart well-resourced people who'll start really taking this seriously; I wish I knew what the solution is.

I think that unless Facebook (and Twitter and Google) is on board, any effort that anyone else makes is like spitting into the wind. This kind of confirmation bias is a powerful thing. There are tools already in place; a few ideas thrown out in some of the articles above:

Buzzfeed: “There is a lot more we could be doing using tools already built and in use across Facebook to stop other offensive or harmful content,” said a second Facebook employee who has been a longtime engineer there. “We do a lot to stop people from posting nudity or violence, from automatically flagging certain sites to warning people who post content that doesn’t meet the community guidelines,” the employee said. He added that while Facebook users were encouraged to flag fake news, the guidelines for removing that sort of content were not clear. “If someone posts a fake news article, which claims that the Clintons are employing illegal immigrants, and that incites people to violence against illegal immigrants, isn’t that dangerous, doesn’t that also violate our community standards?”

Nieman Lab: What can Facebook do to fix this problem? There are ideas out there, many of them problematic in their own ways. One simple one would be to hire editors to manage what shows up in its Trending section — one major way misinformation gets spread. Facebook canned its Trending editors after it got pushback from conservatives; that was an act of cowardice, and since then, fake news stories have been algorithmically pushed out to millions with alarming frequency.

Another idea would be to hire a team of journalists and charge them with separating at least the worst of the fake news from the stream. Not the polemics (from either side) that sometimes twist facts like balloon animals — I’m talking about the outright fakery. Stories known to be false could be downweighted in Facebook’s algorithm, and users trying to share them could get a notice telling them that the story is fake. Sites that publish too much fraudulent material could be downweighted further or kicked out entirely.

Would this or other ideas raise freedom of speech or other thorny issues? Sure. This would be easy to screw up — which is I’m sure why Facebook threw up its hands at the pushback to a human-edited Trending section and why it positions itself as a neutral connector of its users to content it thinks they will find pleasing. I don’t know what the right solution would be — but I know that getting Mark Zuckerberg to care about the problem is absolutely key to the health of our information ecosystem.

posted by triggerfinger at 9:45 PM on November 14, 2016 [16 favorites]


This is going to continue to be one of the most serious threats to democratic society. Right now, I can only foresee it getting worse.
posted by Vic Morrow's Personal Vietnam at 9:48 PM on November 14, 2016 [19 favorites]


Cite your source.
Cite your source.
Look for first-person sources. Check the timing of the article (old news may often be inaccurate).
Is it measurable (objective vs subjective, fact vs opinion)?
Who is paying for this information? Local news, advertisers, public relations groups? Are there ties with other, more biased groups or organizations?
Just because it is in print, does not make it valid.
Just because it is repeated in several resources (cut and paste), does not make it valid.
Information, entertainment, influence: which is the driving force for this particular news outlet?

Anybody can put anything on the internet. No accountability, no fact-checking, no consequences for sloppy or biased reporting.
A few well-run news outlets with high standards and both internal and external checks for accuracy are better resources. The problems are finding those resources, and staying alert for shifts in editorial policy.
posted by TrishaU at 9:52 PM on November 14, 2016 [7 favorites]


Oh look, we're catching on now to the glorious internet revolution and its freedom of boundless paperless expression that will surely bring on utopia.
posted by Miko at 9:58 PM on November 14, 2016 [24 favorites]


On source evaluation: something I picked up from university librarians is the CRAP Test. There are various versions out there to Google.

Currency
How recent is the information?
How recently has the website been updated?
Is it current enough for your topic?

Reliability
What kind of information is included in the resource?
Is content of the resource primarily opinion? Is it balanced?
Does the creator provide references or sources for data or quotations?

Authority
Who is the creator or author?
What are their credentials?
Who is the publisher or sponsor?
Are they reputable?
What is the publisher’s interest (if any) in this information?
Are there advertisements on the website?

Purpose/Point of View
Is this fact or opinion?
Is the creator/author trying to sell you something?
Is it biased?
posted by Miko at 10:01 PM on November 14, 2016 [72 favorites]


So I was reading the Buzzfeed article and got to this paragraph:

Left-wing pages wrongly claimed Putin’s online troll factory was responsible for rigging online polls to show Trump won the first debate, falsely said that Trump wants to expel all Muslims from the US and said US women in the military should expect to be raped, claimed that TV networks would “not be fact-checking Donald Trump in any way” at the first debate, and completely misrepresented a quote from the pope to claim that he “flat out called Fox News type journalism ‘terrorism.’”

And I'm like "uh... what?" I thought all of these things were true. I wish that Buzzfeed had shared their doc with their research because when I went to see if I could find out about the thing about the online troll factory (which I could have sworn was true) I found the original Occupy Democrats post and a couple of copies but nothing that seemed particularly trustworthy.
posted by bleep at 10:07 PM on November 14, 2016 [7 favorites]


Facebook pages are tabloids, pure and simple. Get something outrageous on an image, post it, and hope it brings enough new users to like, visit, or whatever the end game is. When people only consume cheerleading from their own echo chamber and lock out everyone who disagrees on some things, even those on the "same side," they're missing that there's more beyond those echoes. A lot more. I get that it is extremely taxing to deal with so much bullshit, but the alternative is thinking our own social circles are fully representative of the population.

But the problem is where to start cleaning up this mess - It's the post-factual age of dank memes and hot takes, after all. Everyone is so addicted to 140 char. opinions, video takedowns and easily digestible infographics, there's no room for nuance.

Maybe listing resources, journalists and personalities who can say "yes, but" without coming up with some unsourced bs or a straw man to look balanced could be a start. Or maybe force Facebook to have an editorial review board, where pages drawing an abnormal number of fact-checking reports carry a "this page is deemed untrustworthy" notice between the header and every post until valid sources are presented.

But that's worth nothing if people aren't willing to change themselves or learn to recognize bullshit when they read it.

(Apologies if something does not make sense, writing on mobile)
posted by lmfsilva at 10:12 PM on November 14, 2016 [2 favorites]


Facebook's Fight Against Fake News Was Undercut by Fear of Conservative Backlash

Makes a kind of sense. Zuckerberg can't not know how confirmation bias works, or why people aren't engaging with content they say they're presenting. Publicly supporting Trump also makes sense, as far as protecting FB goes. He's apparently been appeasing conservatives for a while, since they got irritated when he expressed some more humane views (which seem more aligned with stuff he's doing with his spouse, like that Cure Everything charity).

On a related note, until recently, I could access some NYT articles without subscribing or logging in, but now you have to log in to most news sites that do actual reporting using FB or Google. I'm not sure that's great for access and spread of information (while preserving anonymity). Although actually funding reporting somehow is also critical. Dilemma.
posted by cotton dress sock at 10:16 PM on November 14, 2016 [1 favorite]


The researchers found that conservatives see about 5 percent less ideologically diverse content than their more moderate friends, with liberals at 8 percent.

What does this sentence mean? Liberals at "8 percent less (?)"... compared to that "5 percent less" for conservatives? Less than what? Average? People grouped as being in the middle of the political spectrum? Is this supposed to be seen as a strong effect? It sounds weaker than I would have expected, actually (and I think the study authors agreed with that interpretation).
posted by atoxyl at 10:20 PM on November 14, 2016 [7 favorites]


This is going to continue to be one of the most serious threats to democratic society.

I agree. And I am deeply concerned -- this issue has probably shaken me more than anything else during this election period. It is a cultural issue, one of values and integrity... things that I used to think were widely held American values that made me proud to be an American. If it can't be fixed, it will be dark times. But how does one effect cultural change? That is what's needed.
posted by brambleboy at 10:24 PM on November 14, 2016 [4 favorites]


1980s: Tech will make you smart
1990s: Tech will make you free
2000s: Tech will bring people together
2016: oops shit sorry guys
posted by RobotVoodooPower at 10:28 PM on November 14, 2016 [143 favorites]


I think what makes me despair about this is that I feel like you can treat the symptom but not the disease. Maybe Facebook can stop fake news from distributing this way. I definitely think they should be making the effort. But the truth is that a lot of people in my life do not seem to care what the truth is. They don't care about the factual truth of news stories any more than they care about the factual truth of climate change or evolution or whether gay people are decent parents or anything else along those lines. They share stories like this like they share glurgy stories of Christmas miracles.

They were things people shared with each other to make themselves feel better--in this case, about supporting a candidate like Trump, but it's not like that's new. It's just the continuation of a trend of people sharing these sorts of stories to make themselves feel better about attending anti-gay churches, or to make themselves feel better about being scared of black people, or whatever. It's every story about the kid who almost got aborted who grew up to be a missionary brain surgeon, every story about a white person being a hero for doing some incidental bit of work in Haiti or whatever. I didn't think there were that many people like this in this country, but I know enough people who were like this long before the election that I'm not at all surprised by people behaving this way in this context. Whatever stories they have to invent to justify their prejudices, they'll keep doing it. It's easier with Facebook, but god knows they were doing it easily enough with email forwards before that.
posted by Sequence at 10:30 PM on November 14, 2016 [40 favorites]


Yeah, but as you say, Sequence, it's easier with Facebook. It lets id run riot.

(There was some kind of common ground when everyone was tethered to one of three 9pm broadcasts. People might have spun off into their own orbit after the fact, but they at least watched roughly the same things.)
posted by cotton dress sock at 10:40 PM on November 14, 2016 [8 favorites]


I'd be very interested to see Google let an independent nonprofit do a deep dive into their indexed pages to trace attribution back to original sources and pair that with information on who owns those sources, information on trustworthiness from reputable curators, users flagging false stories*, things like that to provide some kind of Context API other services can use to give users a nice red/yellow/green reputability system with a click through for more information on how that particular story was rated.

*If you could build a system that identifies a good amount of known reputable stories that can basically be used as test cases and you constantly audit more random stories by hand to fact check and find more known reputable stories, then reviewers who consistently flag reputable sources as false could be filtered out as noise and you might have a workable flagging system. And maybe the system is weighted more towards flags from longer term users with consistently good flagging, with brand new users' flags not being worth much, to discourage brigading to game the system. You could probably also spot brigading patterns as they emerge and pump the brakes.
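A minimal sketch of that reputation-weighted flagging idea (every weight, threshold, and function name here is a made-up assumption, just to show the shape of it):

```python
# Toy model: a reviewer's flag weight grows with their track record on
# hand-audited "known reputable" test stories; fresh accounts start near zero.

def reviewer_weight(correct_flags, total_flags):
    """Weight a reviewer by accuracy on audited stories and by tenure."""
    if total_flags == 0:
        return 0.1  # brand-new users' flags aren't worth much
    accuracy = correct_flags / total_flags
    return max(0.1, accuracy) * min(1.0, total_flags / 20)

def story_flag_score(flags):
    """Sum weighted 'this is false' flags for one story.

    `flags` is a list of (correct_flags, total_flags) per flagging reviewer.
    """
    return sum(reviewer_weight(c, t) for c, t in flags)

# Three long-term, accurate reviewers outweigh a brigade of ten fresh accounts.
veterans = [(18, 20), (19, 20), (15, 20)]
brigade = [(0, 0)] * 10
print(story_flag_score(veterans) > story_flag_score(brigade))  # True
```

The point of the tenure term is exactly the anti-brigading idea above: a wave of new accounts flagging in unison barely moves the score, and their pattern is itself a signal you could watch for.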

idk, I don't know what the solutions are but having easy access to lots and lots of data to figure it out would be a good first step towards something that might help.
posted by jason_steakums at 10:42 PM on November 14, 2016 [3 favorites]


And hey, I don't know if this would be a huge nightmare for the mods but maybe we could get a flag on Metafilter for false news? It's not like we're immune here, especially when the comments start flying fast. I figure if you flag that, you'd better come prepared with sources to back you up, since it shouldn't entail sending the mods out on a research project. I know I'd rather have a comment of mine with an inadvertent link to false news scrubbed than letting it stand, I'd rather not contribute to shitty discourse because I got hoodwinked myself.
posted by jason_steakums at 11:00 PM on November 14, 2016 [19 favorites]


Gabler said it best, in his election post-mortem (emphasis mine):
He ran against what he regarded as media elitism and bias, and he ran on the idea that the press disdained working-class white America. Among the many now-widening divides in the country, this is a big one, the divide between the media and working-class whites, because it creates a Wild West of information – a media ecology in which nothing can be believed except what you already believe.
I simply compare this to sports reporting. It's some kind of tribalism, reinforced with each cycle, to log in and check the scores. We want to see that our team is doing well. I instinctively doubt reports that my favorite players are using PEDs, even though I tend to believe reports our rivals are. Are you ever going to convince someone not to root for their team?

For a long time, political reporting seemed different - more policy details, less about the horserace. It was ok to disagree with some aspects of the platform, or to learn something new in a discussion and change your thinking. But that might have been an aberration, and what we have now is the natural state.
posted by borborygmi at 11:02 PM on November 14, 2016 [9 favorites]


It's not like we're immune here, especially when the comments start flying fast.

Agreed. Some very smart friends of mine were sending around a plausible-but-false story about Mike Pence this afternoon. We're all vulnerable.
posted by Blue Jello Elf at 11:04 PM on November 14, 2016 [3 favorites]


Agreed. Some very smart friends of mine were sending around a plausible-but-false story about Mike Pence this afternoon. We're all vulnerable.

His real name is Race Bannon.

Go ahead, Snopes it! Nothing there = totally true.
posted by Sys Rq at 11:20 PM on November 14, 2016 [8 favorites]


FWIW, here's a pretty good rundown of Trump on Muslims (NBC News), in particular banning Muslims from entering the country (but not expelling all Muslims from the US).

(And stick around to the bottom for Trump's 'evolution' on immigration, with some reflecting back to when Trump was comparing Ben Carson to a child molester. sigh.)
posted by kaibutsu at 12:06 AM on November 15, 2016 [3 favorites]


"Hyperpartisan"

One word in and it's already bullshit. Say what you will about the left, what they said Trump said, Trump said.

Enough with the false equivalence. The two sides absolutely have different patterns, different failure modes even, but false? Misleading? That's not equally distributed at all.
posted by effugas at 2:01 AM on November 15, 2016 [46 favorites]


*If you could build a system that identifies a good amount of known reputable stories that can basically be used as test cases and you constantly audit more random stories by hand to fact check and find more known reputable stories, then reviewers who consistently flag reputable sources as false could be filtered out as noise and you might have a workable flagging system. And maybe the system is weighted more towards flags from longer term users with consistently good flagging, with brand new users' flags not being worth much, to discourage brigading to game the system. You could probably also spot brigading patterns as they emerge and pump the brakes.

A Trusted Ministry of Information®

What did I read upthread? "...who has been a longtime engineer there..." the full ten years, huh? And his knowledge of informational content and a product, this is a discrimination, sensibility and education of an engineer...hmmm. I'm glad those librarians have learned their place.
These fictions of ownership of technological convergence, these convenient King Davids, biased to conservative view...noooooo.
posted by lazycomputerkids at 2:04 AM on November 15, 2016 [1 favorite]


It's not like we're immune here, especially when the comments start flying fast.

Yeah, if MetaFilter readers (present company included) were especially shocked by the win, even more than regular people, well... yeah. That's the Filter part.

An echo chamber isn't necessarily wrong in its specific groupthink. Some such places must be very right, indeed, as a matter of probability. But when challenges to that consensus, whatever it is, become unwelcome or overwhelmed, then the situation becomes unhealthy for everyone, and a kind of oblivious brain-rot sets in. You can only agree and be-agreed-with so much before you lose track of the wider world.

In other words, we done fucked ourselves.
posted by rokusan at 2:33 AM on November 15, 2016 [15 favorites]


I'm starting to think we need something like a licensing board and the equivalent of a bar exam for any organization that claims to be promoting news. Individuals can write what they want, but if you group together to promote, analyze, or report on world events and want to publish that information, then meet some basic ethical standards or be denied accreditation and the ability to put out that information under any banner beyond that of the individual who wrote it.

Yeah, I know there's problems with that too, but at this point I'm not sure those are worse than the chaos we have now.
posted by gusottertrout at 2:55 AM on November 15, 2016 [3 favorites]


Chaos: Better than the crystal.
posted by lazycomputerkids at 3:03 AM on November 15, 2016


Nthing that this is a symptom and not a cause.

Do y'all remember the build-up to the vote on the 2003 invasion of Iraq? It was part of the conversation in these elections.

Did anyone, anywhere – and I am genuinely interested, because I didn't see it apart from myself here on MeFi – bring up France?

You know, the major ally with a long history in the Middle East (not a happy one, but a long one) telling us "do not go to war in Iraq, you will cause a chain of events you cannot predict and that will not be good?" You know, the major ally who helped us win our revolution and wrote the Declaration of Human Rights? That ally? The one pretty much every American either ridiculed or ignored or just shrugged and said "well whatever happens, happens"?

You know, Freedom Fries? Cheese-eating surrender monkeys? That ally? The nuclear power with anti-terrorism experts based in their own home countries who were giving us factual information in a desperate bid to get us to avoid an endless war?

And you think this happened recently and is the fault of fucking Facebook?

If people want to believe something, and ESPECIALLY if they get to play beat-the-dead-horse with their favorite stereotypes ("wheee! cheese-eating surrender monkeys! it's factual too lololol"), well, fuck, they do it. I mean look, people here know we France members and there are still those who don't take us seriously when we post about France (we won't EVEN talk about Charlie Hebdo, DO NOT EVEN). But people will post unattributed drivel if it says "France is stupid and dumb and surrendered in almost every war LOL also they are racist LOL but they make good baguettes and French women are skinny." Thanks for conveniently forgetting that French people who aren't white exist.

Symptom. Not a cause. You want the cause, we have to take an uncomfortable look at ourselves and our human weaknesses. That is precisely what disinformation plays on.

You want a better world? Start listening to people who don't tempt you with facile, fact-free stereotypes. They're a lot less fun to read because you might feel uncomfortable with yourself.
posted by fraula at 3:11 AM on November 15, 2016 [76 favorites]


Symptom. Not a cause. You want the cause, we have to take an uncomfortable look at ourselves and our human weaknesses. That is precisely what disinformation plays on.

Sure, but changing human nature is a much more involved project, and we just might be running out of time for anything that's likely to come to nothing, and so we hope instead there are other ways to mitigate the problem.
posted by gusottertrout at 3:19 AM on November 15, 2016 [1 favorite]


I agree with fraula. It isn't Zuckerberg's fault that a lot of people are incapable of critical thought. There really is a sucker born every minute.
posted by fshgrl at 3:48 AM on November 15, 2016 [5 favorites]


Speaking of easily-digested infographics, could/has someone put the "Tips for analyzing news sources" or the CRAP test into an easily-digestible, Facebook-friendly graphic that could be shared widely?
posted by Gin and Broadband at 3:51 AM on November 15, 2016 [8 favorites]


About twenty years ago I remember sitting in my Politics & Civics class in high school. We discussed media and our teacher talked about information gatekeepers. We were outraged: he meant to tell us that people out there decided for us what news we should hear about and what our opinion should be?

My teacher waited for us to calm down: "Without gatekeepers, you'd be flooded with stories and no way to tell if they were true or even relevant to you. You would hear about a new shoe shop in a remote region in China and cute cats in Argentina, and somewhere in between all these stories there'd be a warning about poisonous air in our backyard. You need gatekeepers."

These days I wonder if my teacher was a time traveller.
posted by kariebookish at 4:27 AM on November 15, 2016 [59 favorites]


I'm starting to think we need something like a licensing board and the equivalent of a bar exam for any organization that claims to be promoting news.
Something like printing press licensing, or perhaps "An Act for preventing the frequent Abuses in printing seditious treasonable and unlicensed Bookes and Pamphlets and for regulating of Printing and Printing Presses"?

Everything old truly is new again.
posted by Sonny Jim at 4:28 AM on November 15, 2016 [6 favorites]


Here's the thing though.

Show of hands - how many people in here actually read a newspaper? And now how many would say that they only get their news from The Daily Show?

It ain't Facebook that's the problem.
posted by EmpressCallipygos at 4:42 AM on November 15, 2016 [11 favorites]


Isn't the phenomenon something like a more corrosive version of The Onion humor stories?
posted by xtian at 4:53 AM on November 15, 2016


It's worse than this. They are using text analytics to get rapid feedback on the effectiveness of their tweets and posts. Facebook and Twitter are both perfect for this.

https://www.technologyreview.com/s/602817/how-the-bot-y-politic-influenced-this-election/
posted by Goofyy at 4:59 AM on November 15, 2016 [1 favorite]


Facebook, in itself, is definitely not the problem. Nthing the others who have brought this up. It's the news culture in this country. It's the complacency with being wrong that allows people to even use Facebook as a news source. People don't actually want to learn, they want to be confirmed and assured in what they already know. It's the satisfaction of learning without any of the effort.
posted by FirstMateKate at 5:07 AM on November 15, 2016 [6 favorites]


Something like printing press licensing, or perhaps "An Act for preventing the frequent Abuses in printing seditious treasonable and unlicensed Bookes and Pamphlets and for regulating of Printing and Printing Presses"?

No, nothing like that, but, yes, that's part of what I was referring to as the problems with the idea.
posted by gusottertrout at 5:14 AM on November 15, 2016 [2 favorites]


I instinctively doubt reports that my favorite players are using PEDs, even though I tend to believe reports our rivals are.

I doubt my team are using PEDs because they are the Chicago Bears and there is zero evidence of enhanced performance.
posted by srboisvert at 5:20 AM on November 15, 2016 [12 favorites]


Also note the Great Lie that Hillary Clinton "stole" the primary election from Bernie Sanders. That festered and took root in the same way as these conservative stories, and did just as much to prevent her winning. How many Bernie supporters in WI, MI, and PA stayed home, wrote his name in, or voted third party because of this alleged "crime"?

Welcome to the Age of Great Lies.
posted by (Arsenio) Hall and (Warren) Oates at 5:31 AM on November 15, 2016 [30 favorites]


The only success I had this season arguing against the "Hillary has TBI/Parkinsons" fabrication with a fellow church-goer was to compare it to spreading a junior high rumor that Mary Kate was pregnant.

Say what you will about a Catholic grade school education, but damned if those nuns didn't hammer the concepts of gossip, rumor and innuendo into our little heads. Shame was definitely in their arsenal, and misogynistic imagery was definitely used. (gossiping crone, old washer-woman, et al). Honestly, I don't remember exactly how they did it, but the experience leaves me believing that there was some effective, old-school framework out there for teaching little ones how to deal with the reception and spread of misinformation. Or maybe they were just using fear and shame to reinforce social mores against gossip. Or maybe they were actually introducing us to critical thinking.
posted by klarck at 5:35 AM on November 15, 2016 [7 favorites]


Oh look, we're catching on now to the glorious internet revolution and its freedom of boundless paperless expression that will surely bring on utopia.
posted by Miko at 9:58 PM on November 14 [4 favorites]


it was going pretty well until everything went IPO and capitalism took over. RIP Google Reader.
posted by eustatic at 5:48 AM on November 15, 2016 [5 favorites]


Should footnotes and bibliographies be dismissed as elitist pedantry? Perhaps we should be training our students in the art of constructing compelling internet memes founded on fantasies? Or forceful slogans that combine emotive power with a strategic absence of content?

If we aspire to educate policymakers of the future, are these not the skills demanded by our age? For £9k a year, might the subtle art of articulating effectual nonsense be preferable to the ineffectual tools of argumentation?

For any academic this should be, of course, no less than a vision of hell. But with 2016 and its strangest of political events, such visions might drift across our consciousness.

The alternative vision is a world in which a renewed sense of cause-and-effect asserts itself. That is, a situation in which it is realised that ignoring peer-reviewed expertise, evidence, and critical arguments does have consequences.

Getting just anyone to fix your plumbing leads to leaks. Ignoring experts in international relations leads to wars. DIY dentistry leads to painful tooth-loss. Dismissing the predictions of climate scientists increases the potential for disaster. Saying simply “X means X” means nothing at all.
David Tollerton, In the age of Trump, why bother teaching students to argue logically?, The Guardian (15 November 2016).
posted by Sonny Jim at 5:54 AM on November 15, 2016 [5 favorites]


It seems just plainly true that this phenomenon is worse in conservative circles than liberal ones. But if you think Bernie Sanders hasn't also aided in the erosion of trust of the mainstream media, then you haven't been paying attention. Right now he's trying to amass power by hammering the same bogeymen as the alt-right -- Elites + Wall Street + Media + Establishment Democrats. Lump those shadowy forces together, and waddya know, the media becomes something not to trust, and Great Lies sprout. The new Purity Tests on the left are disconcerting, and we're going to have a fractured and weak party in 2018 and 2020, likely beyond.
posted by (Arsenio) Hall and (Warren) Oates at 6:04 AM on November 15, 2016 [25 favorites]


These days I wonder if my teacher was a time traveller.

kariebookish, this is how I'm feeling about my high school teacher who, in 2000, assigned us Postman's Amusing Ourselves To Death, a whole bunch of early psychology excerpts, Darkness At Noon, Catch-22, plus a smattering of Conrad and Eliot and Vonnegut. Same teacher also had us bring in any family members who'd lived through important historical events, so we got an hour with a Holocaust surviving grandparent and another with my friend's dad who got out of Russia when the USSR fell.

I'm not sure how this is replicable on a broader scale, but it sure as hell provided a great foundation for bullshit detection.
posted by deludingmyself at 6:30 AM on November 15, 2016 [6 favorites]


> I doubt my team are using PEDs because they are the Chicago Bears and there is zero evidence of enhanced performance.

Lulz.

Bears WR Alshon Jeffery earns four-game suspension for PED violation

Which just makes it even worse. :(
posted by jammer at 6:40 AM on November 15, 2016 [3 favorites]


Related: Twitter finally steps up its abuse game. We'll see how good the follow-through is, but this could literally save lives in the current political climate.
posted by Andrhia at 6:50 AM on November 15, 2016 [2 favorites]


Mike Caufield made a series of posts on this topic that were insightful for me. Usually an instructional design / learning / LMS / open books blogger.

Quote: The great challenge of our age is to graduate students who can either thrive in existing information environments or design better ones. We give our students four years' practice doing library research and yet do not educate them about the environment in which they will gain much of their civic and personal knowledge. We must critique these environments at a level deeper than “Facebook is a corporation and therefore bad.”

Facebook's underlying model is problem
They had their minds made up anyway excuse
Facebook and Twitter are probably making Google a liar as well
Fake news does better on Facebook than real news
Facebook broke democracy but the fix is harder than people realize
posted by typecloud at 6:53 AM on November 15, 2016 [9 favorites]


David Tollerton, In the age of Trump, why bother teaching students to argue logically?, The Guardian (15 November 2016).

All this made me think of is a couple of convos I've had or witnessed over the past couple of days. One person was for the most part arguing logically, using citations, posting links etc to back up what they were saying. The problem wasn't the lack of logic but that there is enough bullshit out there to back up what they were saying. So you end up having what on the surface matches the pattern of 'logical' argument but with one side using information from Infowars or Breitbart as citations.

If someone is discussing climate change and argues climate change is a hoax, BS articles exist, like the one I saw today ('Over 30,000 scientists are now claiming that it is a man-made hoax'), and they're there to support their arguments. This person is having a 'logical' argument in their mind. And no amount of saying it's a bad source seems to matter.

There's a whole eco-system of bullshit out there that provides support for logical arguing. It's a huge problem and I don't have any idea about what to do about it. The only thing you can do is repeat over and over 'not a good source', 'not a good source' or something similar.
posted by Jalliah at 7:04 AM on November 15, 2016 [21 favorites]


This is a really complicated problem, but I think part of it lies in the decline of newspapers and journalism, and in the way "new media" is incentivised. While a lot of bullshit sites exist to promote a particular ideology, they're helped by being able to make a lot of money doing so. Funding journalism through advertising is probably a bad idea that we need to move away from somehow.
posted by destrius at 7:14 AM on November 15, 2016 [4 favorites]


Tim O'Reilly (yes, THAT Tim O'Reilly) had a great post paralleling the current fake news issue and the content farm issue of 2007. He explained how Facebook could kill fake news with an algorithm update like Google Panda, which killed most content farms. (Including the one I had worked at.)

Note: Google Panda actually caused Google's stock price to drop temporarily. They were willing to take the hit to resolve the problem. Let's see if Facebook feels the same.
posted by rednikki at 7:46 AM on November 15, 2016 [14 favorites]


Facebook broke democracy but the fix is harder than people realize

Kill Facebook. Problem solved.

The first step is quitting.
posted by Sys Rq at 7:48 AM on November 15, 2016 [10 favorites]


Like fraula, I find it strange that this seems like a new phenomenon to anyone, regardless of whether Facebook may have facilitated or exacerbated it.

The Oxford Don[? I don't want to leave this window to check but I think that's what he was] complaining that scholarly standards and citations aren't getting through to people? and that his time would be better spent teaching students to produce soundbites? blew my mind in particular. What kind of bubble has this guy been living in? Has he ever actually met anyone, in his entire life, outside of an elite academic institution?

YOU HAVE TO GET USED TO THE WAY THE WORLD WORKS. This is the level of discourse at which the average person operates, including intelligent people. I spent enough years sharing offices with brogrammers who "liked to debate ideas" at the top of their lungs all day, every day.

They honestly didn't seem to get that it's not legitimate to just make stuff up and present it as fact, in part because they seemingly didn't understand that that's what they were doing.

They were *tolerant* of scholarly standards, sure; but the way they put it was: I'm just as clever as if I did have a degree, so why act like this source is better than my list of newspaper articles and webpages just because the other guy has a piece of paper and I don't? The ones with degrees wouldn't put it this way, they'd just keep insisting on their point *as if* it were factually true.

And these are the people who *were* smart. They *were* insightful. Some of them introduced me to new fields of knowledge that permanently changed the way I think. But they weren't consistent. One of them wrote an impassioned article claiming "And Cited Author said X" when in reality, Cited Author had said Very Definitely Not-X. And when I pointed this out, he got upset and defensive and said that's not fair I've done more to promote the cause of Cited Author than anyone else, I've argued online for hours every day, I have followers, people listen to me, plenty of other experts are pro-X, a lot of people think Cited Author is completely dispensable to the cause of CitedAuthorism, so why are you mocking my efforts just because I'm not in an ivory tower like you?

And I said, you may be a great advocate, and experts in the field may be pro-X, and Cited Author may be dispensable, but you claimed here that Cited Author SAID X, and the FACT is that Cited Author said NOT-X. And I think I got through to him, but he would have preferred I didn't.

And guys like these were thinking way way above the level of Joe or Jane Average, they could easily have understood scholarly standards if they'd wanted to. They just didn't want to. If they didn't want to, why should Joe and Jane Average, who don't perceive themselves as Great Thinkers, and don't Like To Debate Ideas, want to learn scholarly standards of discourse? Of course they don't. Why should they, because not being scholars, that's actually not their job. But if it is your job, you have to accept that that's how the world works, and some of it is because people honestly don't know how to tell truth from appearances, but some of it is also because they don't care.

By insisting on what they want to be true, they may be exercising the only power they have to live in the world they prefer. It's not that hard to understand, because there is nobody who *doesn't* do this! Even those of us who try harder are still doing it, just in subtler forms, because it's an inescapable fact of human nature.

One way for [?Oxford Don] to deal with the shocking new world he lives in, might be to learn and teach rhetoric, which seems to be one of the most used but least taught skills of our time. Then he might have a chance of communicating more effectively without making himself feel dirty in the process.
posted by tel3path at 7:59 AM on November 15, 2016 [19 favorites]


I've found that following actual journalists on Twitter is the best way to get news. But my criteria for "actual journalist" might be different than another person.
posted by RobotVoodooPower at 8:00 AM on November 15, 2016 [6 favorites]


Fake news is clearly not a new phenomenon, but, to draw a tech comparison from a previous age, the difference between the modern information climate and ones of ages past is akin to the difference between an individual ping-flooding a game server and a worldwide botnet DDoSing most of the internet. The new order is the Gish Gallop meets Grey Goo.
posted by tobascodagama at 8:04 AM on November 15, 2016 [8 favorites]


And I don't think it's coincidental that we actually saw botnet DDoS attacks bring down several major sites this year, either. A lot of the propagation of fake news can be automated in similar ways, and automated spam detection systems can no longer keep up with some of the more sophisticated social media bots.
posted by tobascodagama at 8:06 AM on November 15, 2016 [3 favorites]


Twitter finally steps up its abuse game.

Upon review of the link - that feels like literally the least thing they could have done.
posted by EmpressCallipygos at 8:10 AM on November 15, 2016 [4 favorites]


The new order is the Gish Gallop meets Grey Goo.

Yeah, it reminds me of when I first encountered IRC -- "Man, this is some noisy chaotic shit." Now the entire world seems like a great IRC channel.
posted by RobotVoodooPower at 8:12 AM on November 15, 2016 [2 favorites]



kariebookish, this is how I'm feeling about my high school teacher who, in 2000, assigned us Postman's Amusing Ourselves To Death, a whole bunch of early psychology excerpts, Darkness At Noon, Catch-22, plus a smattering of Conrad and Eliot and Vonnegut. Same teacher also had us bring in any family members who'd lived through important historical events, so we got an hour with a Holocaust surviving grandparent and another with my friend's dad who got out of Russia when the USSR fell.


I first got on the Internet in 1993.

In the last 23 years, I've seen the Internet evolve from primarily a text based interface, which encouraged long form writing, and line-by-line-dissection of long form writing, to a medium that is primarily pictorial, and increasingly video based, where close reading of long form stuff is discouraged by code and design.

Postman himself declined even to look at the web in 1999, when I started reading his books, and he passed away in 2004. In 1999, I thought the Internet was part of the solution to the problem he was writing about. Today, the Internet is a bigger problem than television and the movies.

And the shift towards pictorial communication is rapidly approaching levels where it's a threat to civilization. Fark and Failblog are good amusements, but even the briefest look at 4chan will show you, we're in deep shit.

The thing is, the Internet isn't just the Internet.

Twitter can be criticized not just for how well or how badly the company polices its users, but also in a medium-is-the-message way. Twitter is a cesspool of hate because the 140 character limit makes it useless for more than playground name calling.

Facebook is part of the problem not just because of Facebook's policies but because it's an interface that pushes blogging into microblogging.

Compare to LiveJournal, Wordpress, or dare I say it, Metafilter, all of which are better by design. (But don't get too smug. There's room for improvement. And it's still all stuff you're looking at with a tabbed browser, and it's always tempting to tab away.)

Or compare to Instapaper. I've removed Facebook and Twitter from my phone, and now when I'm out and about and being a phone zombie, I'm at least reading 1000-2000 word essays that I loaded onto the Instapaper app previously. It's made me a lot happier. But I am sad to report that I still have not regained the long form writing ability I had 15 years ago.

So, Mefites, a lot of us built this here Interweb. A lot of you are the bearded Unix gurus who built the backends in the 90s. I'm only lacking the beard. We have to do something about all this.
posted by ocschwar at 8:19 AM on November 15, 2016 [26 favorites]


But, back in the days before the internet, people either got their preferred wrong information from their preferred newspaper (anyone who thinks the legit press doesn't misreport things is wrong) or else just pulled it out of their arse.

I'm not saying that spreading identifiable pieces of misinformation isn't worse, but take away Facebook and people would just do the same thing in other forms.

Also, xenophobia in Britain didn't start when we joined the EU. Astonishingly, people in Britain hated foreigners before that too. For taking their jobs, housing, &c. And they spread rumors to support their opinions, if only by word of mouth. Did you hear the one about the Pakistani family who were found wandering along the central reservation of the motorway? When they were picked up by police they could only say one word of English: "House. House." And so, just like that, the council gave them a five bedroomed house, and that's true. It is, it's true. I heard that more times than I care to count, some years before the word Internet was in anyone's active vocabulary, and some years before 1992.

Now I take the point that if spreading misinformation weren't useful, propagandists wouldn't bother to do it; and if word of mouth were enough, they wouldn't be automating it. I don't doubt that a lot of the stuff I'm seeing online is coming from propagandists foreign and domestic. I think now is a very good time to be suspicious of the divide-and-conquer infighting that appears to be spreading in liberal circles right now, and to be more venomous than ever: it's working, and it's very possible that the calls are not coming from inside the house.

But it's all leveraging human behaviours that have been the norm for as long as I can remember, and I'm as old as my nose but a little older than my teeth. Just as xenophobia isn't a new phenomenon since globalization, wilful ignorance isn't the snake in some Garden of Eden where scholarly discourse has always held sway up to now. People have always been pig-ignorant, they just have more opportunities to flaunt it.
posted by tel3path at 8:20 AM on November 15, 2016 [7 favorites]


There needs to be some legal remedy for publishing falsehoods, without infringing on freedom of speech or the press. It's a hard line to walk, but it would be good to start getting closer to it.
posted by jetsetsc at 8:22 AM on November 15, 2016 [2 favorites]


Show of hands - how many people in here acutally read a news paper? And now how many would say that they only get their news from The Daily Show?

How often is the Daily Show actually factually inaccurate? I do have a shortage of media in my life that represent conservative viewpoints--but I get a lot of conservative viewpoints daily from people like coworkers, I don't necessarily need them in my news media. Metafilter didn't get an inaccurate picture of the election from the media we consume; that media and Metafilter both got an inaccurate picture of the election from the polling results. I knew full well just how horrible people out there were, even if I wasn't reading their media. I just didn't know how many of them there were, which is different. I don't need to consistently read conservative media to know their general take on things. The echo chamber can potentially be a problem if it becomes an absolute, but this is a false equivalence, I think.
posted by Sequence at 8:23 AM on November 15, 2016 [14 favorites]


I have come to the same conclusion about misinformation that I have about racism: That I don't care what people's intentions are. It is just as impossible for me to tell a misinformed right winger from a liar as it is for me to tell a racist from someone who is being ironically racist, or is just ignorant. Because in the end it doesn't matter -- the results are the same.

And with misinformation, the behavior is the same. Someone will make a false claim on Facebook -- for example, that the Muslim Brotherhood has a plan to overthrow the US government, or that Obama is a secret Muslim, or whatever. But if you link to sites that demonstrate that these are not facts, the person will become abusive, say the other side does it too, move the goalposts, the whole range of dissembling, dishonest behavior that liars engage in.

So I just proceed from the assumption that these people are liars, and are lying for propagandist purposes, because they do not share the liberal notion that we can have different opinions but not different facts.

And I think the comparison to MetaFilter was unfair. We weren't in a bubble created by liars. We were working with the best information we had, which had been shown to be credible in the past. We tend to be pretty good at sussing out falsehoods and we stop repeating them when they are shown to be false. And we were largely correct: Not only did Clinton win the popular vote, but did so by a number of voters that is well on its way toward being more than any male candidate has ever gotten, but for Obama.

I am not sure how to deal with liars, except to fact check them and not engage them further. But their bubble is one of deliberate lies, which may be why they went with the world's biggest liar as their candidate. And I think they know he is a liar, and do not care, because they are liars too, and don't mind lies that are in service of their ideology.
posted by maxsparber at 8:32 AM on November 15, 2016 [26 favorites]


And the shift towards pictorial communication is rapidly approaching levels where it's a threat to civilization.
Yes. And part of watching that evolution in communication modes is a realisation that the (formerly?) text-based and reading-based nature of the web was simply an accident of limited bandwidth. Images, live video, and other, more augmented forms of "live-ness" are the future of web communication, and that's a future in which reading and text have themselves become essentially deprecated by the falling cost of bandwidth. Which leads me to ask further: are literate cultures themselves simply accidents of a low bandwidth world?

I do think about the famous Tor Star Wars illiteracy article rather a lot.
posted by Sonny Jim at 8:36 AM on November 15, 2016 [11 favorites]


I was saying to my 8th grader's history teacher how glad I was that they were using reliable and unreliable primary and secondary sources, with part of the assignment being not just "what was the Triangle Trade" but also "what do you think about this journal by a doctor on a slave ship, or this Portuguese textbook?"

Not only as a historian (how many of us will be historians?) but as a way to begin to critically appraise the news, posts, and unsourced anecdotes which she will be saturated with as an adult.
posted by shothotbot at 8:55 AM on November 15, 2016 [7 favorites]


I have come to the same conclusion about misinformation that I have about racism: That I don't care what people's intentions are.

I sort of disagree with both parts. The effects are absolutely the same regardless of the intentions, but the way one fights it may be different. A hardcore racist or propagandist knows what they're doing and can't be reasoned with. Someone who is propagating racism or misinformation more thoughtlessly might be steered to a better place.
posted by Blue Jello Elf at 9:07 AM on November 15, 2016 [4 favorites]


I keep hearing that, but nobody has offered any concrete tips as to how to steer, and literally nothing I have tried has worked.
posted by maxsparber at 9:10 AM on November 15, 2016 [3 favorites]


I mean, I don't even think anyone has a clue how to tell one from the other.

If these are to be our tactics, we'd better come up with a real plan besides constantly telling people they need to work harder at reaching out to people who may or may not be racists and liars.
posted by maxsparber at 9:11 AM on November 15, 2016 [6 favorites]


"Kill Facebook. Problem solved."

Wrong. This is equivalent to trying to stop global climate change by encouraging people to take shorter showers.

We need more systems thinking, leading to bigger solutions than individuals can effect. We're trained not to think this way in the US, to think that government is ineffective, and that therefore all large, system level actions are as well. But it's simply not true; otherwise big corporations wouldn't exist and wouldn't have the power that they do.
posted by kaibutsu at 9:13 AM on November 15, 2016 [12 favorites]


"But if you link to sites that demonstrate that these are not facts, the person will become abusive, say the other side does it too, change the goalposts, the whole range of dissembling, dishonest behavior that liars engage in."

Two big factors here. First is defensiveness (white fragility), brought on because almost everyone knows racism is bad, and getting called out on racism is therefore really hard for people. It's much easier to deflect and defend than to accept the criticism.

Second, this is going to be exacerbated if the conversation is on a public Facebook wall.

My own plan going forward is to talk on the side with people, and relate my own personal experience with being called on racist statements. Pointing out the racist statement is only the first and smallest step of helping to change minds. As SURJ puts it, we need to call people in, not out.
posted by kaibutsu at 9:19 AM on November 15, 2016 [2 favorites]


The CRAP and other frameworks are kind of helpful, but I think there is something more central to bad vs good news that we should be able to pick up on and help other people pick up on if we can put our finger on it. It's hard to distinguish sources making good-faith efforts to analyze and present coherent ideas from those that are cynically manipulating their audience. And of course, even those sources generally making a good-faith effort are not perfect, and they sometimes make intentional or unintentional mistakes that manipulative sources eagerly point to as proof that they're all just as bad.

A good-faith journalistic effort isn't something we necessarily look for, and it doesn't always easily fit into the CRAP or other frameworks mentioned above. But I think it's a useful concept to frame the discussion. Just look at the Buzzfeed article and the response from Right Wing News:

1. The RWN response makes lots of threats and just generally shows a lot of anger.
2. I checked up on its claim that the Clinton Foundation barely made any charitable donations, which the RWN article cites as an example of news they 'reported' that is actually true. It turns out the claim IS factually true, but it's completely misleading (because, as an operating charity, the Foundation spends its money doing charitable work directly rather than making grants). Of course, it's not as if RWN asked the Clinton Foundation for a response.
3. CharityNavigator.org now has hundreds of comments on the Clinton Foundation page from people who are very mad at the Foundation.

So maybe here are some things to look out for:

Does the author:
- have an obvious 'enemy' or presume ill intent on the part of another?
- consider alternative viewpoints or explanations?
- actually tell the truth and use sources that seem reputable? (But then again, what does that even mean in this day and age, other than being reasonably well informed and making a good-faith effort?)
- (what else?)
posted by ropeladder at 9:20 AM on November 15, 2016 [3 favorites]


But, back in the days before the internet, people either got their preferred wrong information from their preferred newspaper (anyone who thinks the legit press doesn't misreport things is wrong) or else just pulled it out of their arse.

but also, back in the days before the internet, people were forced to share their preferred wrong information verbally, or by mailing clippings to each other. And the act of cutting something out of the paper, stuffing it in an envelope, finding a stamp, and sending it to your niece in college was insurmountable for some, and they didn't bother. But now you can just click a couple of buttons and it's on her Facebook wall.

It isn't so much Facebook alone that is causing the issue - it is the convenience which Facebook offers when it comes to sharing that information.
posted by EmpressCallipygos at 9:27 AM on November 15, 2016 [10 favorites]


I keep hearing that, but nobody has offered any concrete tips as to how to steer, and literally nothing I have tried has worked.

My relatives have come around on biased policing, but it took fifteen years of bringing it up gently over and over and over.
posted by Blue Jello Elf at 9:28 AM on November 15, 2016 [7 favorites]


My relatives have come around on biased policing, but it took fifteen years of bringing it up gently over and over and over.

On a personal level, I applaud your dedication, and am glad for your success. As far as its real-world application goes -- yikes.
posted by maxsparber at 9:36 AM on November 15, 2016 [8 favorites]


The way to tell the difference is, when you point them to new information, they take it into account and seem to reconsider even if they don't change their minds on the spot.

Some people really do have the wrong information. I know someone whose only real source of information is the mainstream press (I mean, in theory they could go to a library and start requesting books and studying up, but they're not going to) so of course what they read there seems like fact to them. They don't have anything else.

And actually, even some racists are not completely closed minded and may be acting in part out of genuine error; it can be possible to get through to them, at least for a moment. Here's an example by anecdote - I'm going to focus on an anecdote that doesn't contain racism so it's easier to stomach.

Someone was really weirded out and perturbed by the way her teenage nephew spent his time. From her point of view, he was locking himself away for hours in his room playing a computer game, and talking to the computer, and the computer was talking back. She found that utterly disturbing and freakish. I mean, given that that's what she thought was going on, I don't blame her. She thought he should get out and socialize.

I explained to her that it wasn't the computer talking to her nephew, it was other players, who were all real people. The way you play these games is, you turn yourself into a cartoon in the game, and you interact with all these other people who are also representing themselves as cartoons. So the voices talking back to him are all other people who are just as real as he is.

So all the time he's playing this game, he *is* socializing. I get why you're concerned that he should be socializing in person with people his own age, but teenagers can be a bit menacing with the peer pressure - you know how kids try to force each other to drink, or maybe they make fun of his appearance and stuff. The other players in the game can't force him to do anything, because they're not in the room with him. Maybe playing this game lets him make friends without getting in over his head.

There were a few other points of discussion that came up during the brief time we were thrown together, and I had a few openings to politely question her assumptions. Was she misinformed? Yes, in many ways. Was she a racist? Yes, that was pretty clear. Is it likely that I changed that? I doubt it, especially since she was actually kind of mean-spirited in general. But at least when I brought up something she hadn't thought of, she was open enough to go "huh! I never thought of that!"

People are complex.
posted by tel3path at 9:38 AM on November 15, 2016 [7 favorites]


The problem with The Daily Show and indeed any other mass-marketed news/entertainment source is that it can never be too challenging, too confrontational, too discomfiting. The Daily Show has a target demographic, and the producers and network execs know that. So they may be loath to include content that makes that demographic uncomfortable. News shows, radio shows, websites are all businesses with bottom lines, and they must put out content that (to varying degrees) makes their audiences feel vindicated and righteous. At some point, profit margin must supersede nuance; appeals to emotion must supersede sober discourse.
posted by Vic Morrow's Personal Vietnam at 10:54 AM on November 15, 2016 [2 favorites]



Shows like The Daily Show are pretty clear about their bias and what they're about, including their appeals to emotion. That's what they are and are meant to be. Daily Show-type programs are not part of the problem, because they aren't any more than what they profess to be.

If they were professing to be challenging, unbiased news shows, that would be different. They're not.

Now, if people watching them treat them as a be-all and end-all 'news source' and as more than what they profess to be, that's not the show's fault; that falls on the viewer.
posted by Jalliah at 11:03 AM on November 15, 2016 [2 favorites]


It's unworkable to think that we can self-govern huge populations this way, but social psychologists often talk about the value of face-to-face contact in negotiations. We don't get face-to-face accountability with the internet (yet?), and our opinions are free to run as wild as an angry driver chasing and honking at you over some perceived slight on the road. Typing in little comment boxes and interacting in this abstract way will always carry an undertone of inhumanity.

I'm a strong proponent of trying to figure out how to convince everyone to get involved in hyperlocal governance, administration, and the politics that bridge them. It's more straightforward to digest information in a room of 20 of your peers--and then send out the report of your agreements and disagreements to the larger population--than to expect to have a real conversation of substance in a format like social media. I mean, look at threads like this. I have no idea how to make it clear that I'm responding directly to someone without linking their user name or italicizing their comment, and, what, hoping they read down this far? Is that how we engage one another? It's not, it doesn't work. This is a blizzard of opinion and thought, nary a narrative structure to be seen. It's more convenient than scheduling face-to-face contact, though, so it'll stay a de facto part of our lives.

My commitment to my neighborhood, to my city, is that I'll spend the next year going to more local hearings and events. I want to know my representatives better. I want to talk to them about this situation. And I'll come back here to report on it, I suppose, but I don't expect the internet or journalism to save us. I expect us to put vested interest in helping ourselves at great expense to our sense of control over our free time. I see no other way around it.
posted by late afternoon dreaming hotel at 11:24 AM on November 15, 2016 [5 favorites]


In addition to the Google doc in the front page post, there is this list of fake news websites from all around the world. It's a couple of years old, but I guess none of those websites have become a bastion of truth.
posted by spheniscus at 11:26 AM on November 15, 2016 [1 favorite]


I'm as pessimistic as anyone about the inherent constraints of the human condition. It's not hard to notice that we've had one bloodbath after another, with some pauses in between, bright spots here and there.

But I think history also tells us that our better moments have depended, at least in part, on people fighting for particular vehicles of civilization.

There's not going to be a permanent cure, it's a question of symptom management.
posted by cotton dress sock at 11:55 AM on November 15, 2016 [2 favorites]


The public radio show On The Media has a handbooks section that has useful sharable summaries. They're infographic sized.
posted by ZeusHumms at 12:23 PM on November 15, 2016 [5 favorites]


We need more systems thinking, leading to bigger solutions than individuals can effect. We're trained not to think this way in the US, to think that government is ineffective, and that therefore all large, system level actions are as well. But it's simply not true; otherwise big corporations wouldn't exist and wouldn't have the power that they do.

They exist and have power because people keep giving them money. That is something that is very, very easy to stop doing.

Just stop using Facebook. Failing that, block their ads.
posted by Sys Rq at 12:53 PM on November 15, 2016 [4 favorites]


We are drowning in a sea of bullshit but we were all alone in the middle of nowhere with no idea which direction to run well before the bullshit flood ever came.
posted by srboisvert at 2:21 PM on November 15, 2016 [1 favorite]


Just stop using Facebook. Failing that, block their ads.

If you use just basic Adblock Plus, FB has actually tweaked its CSS so that it's hard for ABP to know which element is an ad. Luckily there's an add-on for Firefox called Element Hiding Helper (there are others on Chrome; one that works is ContentBlockHelper). It's simple enough to use: after installation, you can click on the ABP icon for the menu and choose "Select an element to hide", then select the FB ads. However, to get around their sneaky CSS, which changes that specific element name every time FB refreshes your newsfeed/timeline, you can have it block the entire container where all the ads sit. What you end up adding as a filter is "facebook.com##.ego_section". That's right, they call their ad container the "ego_section".
posted by numaner at 2:36 PM on November 15, 2016 [5 favorites]


So here's what we do. Create a system that gives every website a veracity score. Google has PageRank; we'll call this TruthRank. The problem everybody's trying to solve right now is affinity: figuring out who will click on this, who will like this, who will share this, who will buy this, etc. That is a hard problem. By comparison, computing a veracity score should be easy. After all, fact checking has existed almost as long as the printed word.

So here's how we do it. Hire a staff of 20-40 people familiar with research and information science. Retired librarians, adjunct professors, unemployed fact checkers from now-defunct media outlets, that sort of thing. These people will be your fact checkers. They will probably work remotely. You should hire people from diverse ideological backgrounds, walks of life, geographic locations, ages, ethnicities, political preferences, gender identities, personality types, etc. Be as diverse as possible.

Select the 1,000 most popular sources from Google News and have your checkers compute a veracity score for each site. This score will be based entirely on how many provably false statements the site contains. Obviously it won't be possible to examine every page of these sites, but your checkers should be diligent and do a good job. A site with many lies gets a very negative veracity score. A site with very few lies gets either a neutral score or a very slightly negative one. Note that we are only looking for provably false statements: a grossly offensive conservative screed that contains no provably false statements should get a neutral score.

The checkers' judgements and citations will be visible to the public and open for discussion through a community similar to the wikipedia editors' community. Maybe it will take them six months to score 1,000 sites. Probably not even that long. These scores will be recomputed periodically.

Now you build your spider. It crawls the web and assigns veracity scores to sites the fact checkers have not examined. When it examines a site, it looks at what sites link to it and what sites it links to. A site that links to many other sites with negative veracity scores gets a negative veracity score. Likewise, a site that is linked to by sites with negative veracity scores gets a negative score. These scores will be recomputed periodically.
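To make the propagation idea concrete, here's a rough sketch of that step in Python. Everything here is invented for illustration (the site names, the seed scores, and the damping factor); a real crawler would obviously work over millions of pages, but the core loop is this simple:

```python
# Rough sketch of the proposed "TruthRank" propagation step:
# sites hand-scored by fact checkers keep their scores fixed, and
# unscored sites inherit a damped average of the scores of the sites
# they link to and are linked from. All names and numbers below are
# hypothetical, for illustration only.

DAMPING = 0.85  # how strongly a neighbor's score transfers (assumed)

# seed scores from the human fact checkers (negative = many provable lies)
seed_scores = {"reliable-news.example": 0.0, "lie-factory.example": -1.0}

# directed link graph: site -> sites it links to
links = {
    "unknown-blog.example": ["lie-factory.example", "reliable-news.example"],
    "lie-factory.example": ["unknown-blog.example"],
    "reliable-news.example": [],
}

def propagate(seed_scores, links, iterations=20):
    scores = {site: seed_scores.get(site, 0.0) for site in links}
    for _ in range(iterations):
        new_scores = {}
        for site in links:
            if site in seed_scores:  # human-assigned scores stay fixed
                new_scores[site] = seed_scores[site]
                continue
            # neighbors: sites it links out to, plus sites linking in
            neighbors = set(links[site])
            neighbors |= {s for s, outs in links.items() if site in outs}
            if neighbors:
                avg = sum(scores[n] for n in neighbors) / len(neighbors)
                new_scores[site] = DAMPING * avg
            else:
                new_scores[site] = 0.0
        scores = new_scores
    return scores

scores = propagate(seed_scores, links)
# the unscored blog inherits a negative score from the lie factory it links to
print(scores["unknown-blog.example"])  # -0.425
```

The point of the damping factor is that guilt-by-association should decay: a site two hops away from a lie factory gets a much weaker penalty than one linking to it directly.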

I'm not saying it will be a perfect system or impossible to game, but it will be a start. Some combination of algorithms and human fact checking could do the job. I believe this is eminently doable. In fact, I am working on this.

If you are interested, please PM me.
posted by panama joe at 3:19 PM on November 15, 2016 [10 favorites]


Thank you, ZeusHumms! Those look good. Maybe we need a "Citation needed?" emoji, too.
posted by Gin and Broadband at 3:58 PM on November 15, 2016 [2 favorites]


Here's an interview with Melissa Zimdars, the professor who created the google doc:

Meet the Professor Calling Out the Fake and Misleading News Sites Clogging Your Facebook Feed

I am especially glad she is calling out this practice, which seems particularly bad because it (imo) manages to fly under the radar:

The third category I’ve used included websites whose reporting is OK, but their Facebook distribution practices are unrepresentative of actual events because they’re relying on hyperbole for clicks.

This category has caused the most controversy and, well, been taken as offensive to some publications. Upworthy wasn’t happy about its inclusion on this list; neither was ThinkProgress, who I initially included because of their tendency to use clickbait in their Facebook descriptions. A number of websites—both liberal and conservative publications—have contacted me; one even threatened to file “criminal libel” against me, although I don’t think they know what that means.

These websites are especially troubling because people don’t actually read the actual stories — they often just share based on the headline. I had the Huffington Post on my list of 300 potential additions because they published an article on Monday with a headline that claimed Bernie Sanders could replace Donald Trump with a little-known loophole. The article itself was chastising people for sharing the story without actually clicking it, but so many people were sharing it like “oh, there’s a chance!” An effort to teach media literacy ended up circulating information that was extremely misleading.


I mean, holy shit, that Bernie article. I've seen that shared by people in my newsfeed but never clicked on it because I get a ton of media feeds and I kind of inwardly figured that if there was some secret loophole that was legit, I'd hear about it in other places eventually. I don't even remember which of my friends shared it, but I know that a lot of my friends on the right and the left are just mindlessly sharing things they haven't read based on the headline or pullquote alone.

On the one hand, there's trying to draw readers in with your headline, and then there's using a headline that says something wildly different from what the article actually says. Media sites that do this need to be classed as just as false as the straight-up partisan, garbage, misinformation sites.
posted by triggerfinger at 4:42 PM on November 15, 2016 [9 favorites]


It's the complacency with being wrong that allows people to even use Facebook as a news source. People don't actually want to learn; they want to be confirmed and reassured in what they already know.

Remember in the election threads here, how often various people said they couldn't handle most of the Internet during the election run up, emotionally, and so were only keeping tabs via MetaFilter?

It was reassuring, right? Safe. Not challenging.
posted by rokusan at 6:29 PM on November 15, 2016 [10 favorites]


This fake news problem seems to be a specific instance of the perennial urban legend phenom, except that the fake news is deliberately generated for political and for financial gain. (For the latter see this article on where many of the fake news sites are based. It's (surprise!) Macedonian teenagers generating clickbait for fun and profit.) Anyhow, one thing that shows the intractability of fake news is that I once had a friend who used to send me reams of urban legend email. Finally I sent her to snopes.com, but instead of using it, she continued to send me chain-letter email. I learned that some people really can't be bothered with the truth. I'm optimistic that she is in the minority, but in the meantime, it's still somewhat depressing to contemplate.
posted by storybored at 8:51 PM on November 15, 2016 [1 favorite]


Remember in the election threads here, how often various people said they couldn't handle most of the Internet during the election run up, emotionally, and so were only keeping tabs via MetaFilter?

I don't think you can draw a fair equivalence between the desire to avoid honest-to-god hate speech and unhinged conspiracy theories and the desire to live in a closed information bubble that never exposes you to contrary views.
posted by tobascodagama at 9:04 PM on November 15, 2016 [5 favorites]


There really is a sucker born every minute.

Yeah, but that can be a serious problem for a democracy, which is why traditionally we placed so much emphasis on the importance of building and publicly supporting reliable systems for disseminating information in the public interest. It never worked perfectly, but giving up completely isn't the solution to that problem.
posted by saulgoodman at 9:46 PM on November 15, 2016 [4 favorites]


I saw that smug letter from the NYT just after I had canceled my sub over the NYT's role in making Clinton seem as bad as Trump. I then subbed to the Post, because it did some great reporting during the election.
posted by persona au gratin at 2:35 AM on November 16, 2016 [2 favorites]


It isn't so much Facebook alone that is causing the issue - it is the convenience which Facebook offers when it comes to sharing that information.

And, in most cases, users aren't even making the decision to share. They are just hitting the "like" or "sad" button or whatever, and Facebook then shares the content to their friends' news feeds.

My news feed became basically unreadable because only a tiny fraction of what I was seeing was actively posted or shared by my friends. (Previously.) When I posted on Facebook to complain about this, many of my friends were confused or surprised--they hadn't realized that their likes and comments were getting published to their friends.

And this annoyance becomes a disaster when you're talking about stories like the ones Zimdars calls out, with misleading headlines and descriptions. Let's say triggerfinger comes across one of those stories and comments on it to say "Hey, heads up everyone, this is bullshit. The actual story is not what the blurb implies." I'm friends with triggerfinger so when I go to my news feed I see "triggerfinger commented on a post." But it doesn't show triggerfinger's comment. It shows the goddamn misleading headline and misleading blurb. So the misinformation propagates even though triggerfinger was trying to correct it.
posted by mama casserole at 6:02 AM on November 16, 2016 [26 favorites]


Lotta moving parts to this discussion, but one of them was identified by a right wing radio personality named Charlie Sykes, who admits that the right wing has created a monster.
posted by swheatie at 2:57 PM on November 16, 2016 [3 favorites]


They ain't gonna ban the NY Times for Judith Miller and the Iraqi WMDs.
They ain't gonna ban Rolling Stone for the "Rape on Campus" hoax.
They ain't gonna ban Newsweek for its "We found the Bitcoin creator!" harassment of a random old man.
They ain't gonna ban anyone exposed colluding with the Clinton campaign and helping elect Trump with the "Pied Piper Strategy."

Fuck 'em. Just salty there isn't a big pricey printing press or broadcast tower they control anymore.
posted by save alive nothing that breatheth at 4:41 PM on November 16, 2016 [1 favorite]




The folks at FBPurity have provided the list along with instructions for using it as a blacklist text filter with their extension.
posted by Johnny Wallflower at 6:09 PM on November 17, 2016 [2 favorites]


Here’s a Chrome Extension That Will Flag Fake-News Sites for You

A friend of mine made something similar. It doesn't censor, just highlights. And it covers "satirical" sites in addition to propaganda outlets, so you don't have to memorise all five billion Onion wannabes out there to avoid getting duped.

My only personal gripe is that I think he tried a bit too hard to shove some lefty sites in there to satisfy a dubious notion of "balance", but I didn't actually spot any sources on there that I'd personally consider unjustifiably listed. The good news is that if you do spot a site that's unfairly labeled, or can think of an unreliable site that isn't on the list (Crooks and Liars jumped out at me as a "really? them?" kind of item, but I don't actually read it, so I dunno), you can submit a GitHub Issue with your justification. He's a reasonable guy, in my experience, so posting a polite, detailed issue is probably worth your effort.
posted by tobascodagama at 8:30 PM on November 17, 2016 [2 favorites]


WaPo: Fake news on Facebook is a real problem. These college students came up with a fix in 36 hours.

Chrome extension here. The authors say their verification servers are overloaded so it may not do anything at the moment.
posted by Johnny Wallflower at 10:48 AM on November 18, 2016


We Tracked Down A Fake-News Creator In The Suburbs. Here's What We Learned

That his company is named "Disinfomedia" seemed a little too good to be true, so I checked, and there was indeed a domain dispute between him and the Washington Post over washingtonpost.com.co

And the quote that I'm seeing forwarded everywhere is:
We've tried to do similar things to liberals. It just has never worked, it never takes off. You'll get debunked within the first two comments and then the whole thing just kind of fizzles out.
posted by RobotHero at 4:16 PM on November 24, 2016


And the quote that I'm seeing forwarded everywhere is:


Considering that this is a comment made by someone whose sole skill is making garbage viral, I am disinclined to believe it. I see plenty of junk passed around by liberals.
posted by maxsparber at 7:09 AM on November 25, 2016 [3 favorites]


The kind of goofy stuff I see passed around on the left tends, in my opinion, to be lifestyle stuff: health and nutrition. So, lots of stuff about big agriculture, Monsanto, diet, and how it might relate to your health and your kids' health. Vaccine panic, of course. When you delve into some of these things, it can be a morass of "studies" mixed with studies, dubious expertise, and people trying to sell things. Taking in information from dubious sources can be harmful because I think it undermines your ability to separate fact from shill. And it does inform political opinion; you can look at some of the candidates who have managed to emerge on the national stage. It feels less corrosive to me, but that's a bias toward my friends and my own upbringing.
posted by amanda at 7:36 AM on November 25, 2016 [3 favorites]


Agreed, amanda. The equivalent unhinged stuff from the left would be "Trump Runs Child Sex Ring From Trump Tower" or "Illegal Immigrant Melania Painted Stars And Stripes In Her Toilet Bowl Because She Likes To Shit On Old Glory."
posted by Johnny Wallflower at 8:05 AM on November 25, 2016 [1 favorite]


There was a story about Trump actually being born in Pakistan.
posted by maxsparber at 8:09 AM on November 25, 2016


This Florida-welfare-drug-test meme flew around liberal circles last year. As soon as I saw it, the eyebrow went up: $178 million? That's a...hefty chunk of a state budget of about $9 billion - somewhere around two percent. Really? It didn't seem possible, and it should have raised everyone's eyebrows. And indeed, those numbers were not at all accurate: per the Tampa Bay Times, taxpayers spent $118,140 to reimburse people for drug test costs, at an average of $35 per screening. The state's net loss? $45,780. And the NYT corroborated that.

This demonstrated to me that the left is not a whole lot better about critical review and fact-checking than the right. Also, if we could all just take one pledge to improve matters, it would simply be: avoid information in the form of memes like the plague. Memes are sticky and compelling and grabby, but they are not good sources of information. As someone said on another thread here, "it's like some people believe you win an election on sick burns and dank memes." They're mostly garbage, even if attributed, because people consume them like pieces of candy and never investigate the fuller context. If we just posted news pieces in the context of their sources, we'd instantly be a lot better off.
posted by Miko at 8:20 AM on November 25, 2016 [2 favorites]


For the record, Donald Trump is older than Pakistan.
posted by Sys Rq at 8:26 AM on November 25, 2016 [2 favorites]


Just verified: even if the Florida program had applied to every single person who applied for benefits in 2012 (163,237), and the state had had to pay the maximum of $25 per person to reimburse the test - which it never actually did, because the program was never fully implemented - the cost would have maxed out at $4,080,925. I'm not great at math, but we do need to work on numeracy in political thinking as well.
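For anyone who wants to reproduce that back-of-the-envelope check, it's a two-line calculation. The applicant count and $25 reimbursement cap are the figures already quoted in this thread, not independently verified numbers:

```python
# Sanity-checking the Florida drug-testing meme using the figures
# quoted above: even if every 2012 benefit applicant had been tested
# and reimbursed at the $25 maximum, the worst-case cost comes
# nowhere near the meme's $178 million claim.
applicants = 163_237      # benefit applicants in 2012 (figure from thread)
max_reimbursement = 25    # maximum per-test reimbursement, in dollars

worst_case_cost = applicants * max_reimbursement
print(worst_case_cost)    # 4080925 -- roughly 2% of the claimed $178M
```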
posted by Miko at 8:30 AM on November 25, 2016 [2 favorites]


Is Pakistan a dog-whistle for the Left? I don't really think so. That story seems aimed more at gullible Trumpers than at anyone on the Left. I think the problem with fake political news aimed at the Left is that, in the face of nationalist stuff, the Left tends to say: who cares? Or take Limbaugh going on about how repulsive the Left's attitude toward consent is. Most of my cohort go, "Uh, yeah. If you consent to a certain sex act, that's fine by me."* The Left is far easier to bait with information on how weed can cure liver cancer and toe warts than with appeals to nationalism or sexual purity or "family values", which are keys to the Right's appeal.

I really enjoyed that piece about the fake news creator. However, the bit about it never working on liberals made me laugh. C'mon! That's a fake fact! It may well be that this guy doesn't know how to bait the Left, or it may be that the Left is more fractured. This I could believe; it's part of the reason the Left cannot seem to coalesce behind a single person or movement. But that statement works because we want to believe it is true – the Left is just smarter! Sure, okay. Let's pass that around because it conforms to our worldview, regardless of whether it is true or not. It may not even be empirically verifiable – and that's the best stuff for fake-fact peddlers: impossible to verify. I wish they had not included the statement in the piece, because any Right-leaning person who suffers to listen to it or read it will get to that bit and throw the whole thing out.

*I went right to the internet when this story started popping up to find the audio because it seemed too dog-whistley. I couldn't believe he really said this! Must be a twisted up story. Nope. The audio is even better than I imagined. What a goddamn weirdo they have in their midst.
posted by amanda at 10:03 AM on November 25, 2016 [3 favorites]


Here is an example of a right-wing Facebook group using the trendy new format of viral videos to spread misinformation about the Dakota pipeline protests. How are you supposed to fact-check things like this? This is brand new, too; it'll take time to debunk, and by then it'll have been shared much too widely.
posted by gucci mane at 8:49 AM on November 26, 2016 [1 favorite]




How are you supposed to fact-check things like this?

This isn't that hard. The first, central principle is that you don't take any 'information' at all from sources with titles like "Uncle Sam's Misguided Children" - by its own "about" page, "a community founded by United States Marine Veterans to bring awareness of the lame stream media." That is not a reliable, known, credible source with a journalistic process. Who knows what TF it is, but that, it ain't. You're safe to ignore this crap; please never share it, and tell anyone who is sharing it to stop reading crap.

Second, it does offer a litany of "facts" and any of those can be properly verified - or not:

1. There are paid protestors who are violent

I haven't found any credible source to verify this. But be very skeptical. Yes, people from all over the world are sending money and trying to fund the protest actions. They need food, warm blankets, medical supplies, winter gear - they need money. And they want more people to join them, and they don't want those people to worry about bringing their own bankroll. Fair enough. That doesn't equate to people "paying" protestors. The notion of paid protestors is a GOP meme from way back, and it's just about always BS. Just like this recent totally made-up attempt to discredit protestors as "paid."
2. The authorities are not attacking the protestors
"Attacking" is a tricky word because it suggests intent. The protestors are taking action to hold ground and dismantle barricades. The authorities are attempting to repel and control them using force. That's not in dispute. Any number of mainstream sources will get you a verifiable report - try search terms like "DAPL violence" or "DAPL authorities" or "DAPL injuries," and look for mainstream media sources.
3. Construction is not taking place on sacred land but on an area north of the tribe
There is basically no "non-sacred land" as far as the water protectors are concerned - but the "area north of the tribe" is only about half a mile from currently designated tribal lands. Not far. The tribe has only recently been able to survey the pipeline-impacted land, and (OF COURSE) found a number of artifact deposits that suggest a historical presence deserving of protection. But even though the tribe recognizes that it doesn't currently have, under the US system, political control of the land impacted by the pipeline, watersheds do not respect political surface boundaries, and the tribe is right to be concerned that threats to the water supply would be felt on federally recognized Sioux Nation land. It's really interesting to read about the Army Corps of Engineers and how it permits these projects; the permits basically grant the right to impact up to a half-acre of land surrounding the pipeline in any direction. That certainly could impact the Sioux tribal water supply in the event of a spill. And be very aware that a spill is what the concern is over. We know these pipelines are not fail-safe, and we know that not only does construction disrupt sacred sites and water supplies, but any leak, spill, break, or explosion would have a devastating impact. That's what the concern is. As this [admittedly polemical but fact-based] piece reminds us, "In 2010, a single pipeline spill poured 1,000,000 gallons of toxic bitumen crude oil into the Kalamazoo River in Michigan. The cleanup cost over one billion dollars and significant contamination remains. And in January of 2015, more than 50,000 gallons of Bakken crude oil spilled into the Yellowstone River in Montana. It was the second such spill in that area since 2011."
4. The proposed pipeline will go under the river and will not harm the local water supply.
Riiiiiiight! And it will never leak, break, be dislodged by seismic activity, or decay. It will be JUST AS AWESOME as all of our other infrastructure that never leaks, breaks or decays! Right? Right? So it would never, say, start leaking into the river right above it? The thing is, the NoDAPL protestors are saying that no matter how safe and secure you think your pipeline is going to be, we just shouldn't be trying to tap these sources at such great cost. We should instead be investing in green/renewable energy. Their stance is not solely a tribal-protection or water-protection one, but an effort to address US energy policy broadly and point out its weaknesses and inefficiencies and collateral damage - all on display right here, right now.
5. Six other lines already cross nearby and no one protested them.
Tough to fact-check this in the absence of any specifics about these "six other pipelines." But the DAPL protests certainly have a very long historical lineage and did not come out of nowhere, nor are they a hypocritical exception.

In short, don't throw up your hands and say "how do I fact-check this?" Just get started. Every purported fact can be fact-checked, every assertion examined, and every opinion given context. It is not impossible to bring a fact-based, evidence-based understanding to these events. It does take time, and it takes a degree of effort to begin to discern decent sources from shitty ones, and to actually read the articles and develop a feel for their reliability and for direct reporting vs. opinion-based interpretation. And note, for anyone inclined to believe this crap: the original claims cite no sources at all.

I get that this is intellectually demanding work. But it's essential work, and work we must do if we actually want our democracy to function. If you don't feel like you have the hang of it yet, keep at it. Start with Google, privilege mainstream news sources that retain a staff and run an accountable journalistic and editorial process, and always wear your skeptical hat.
posted by Miko at 8:27 PM on November 26, 2016 [4 favorites]



