How Facts Backfire
July 12, 2010 9:24 AM

 
Really interesting. Buried in this article was the most fascinating finding of all:

people who were given a self-affirmation exercise were more likely to consider new information than people who had not. In other words, if you feel good about yourself, you’ll listen — and if you feel insecure or threatened, you won’t.

Someday maybe we will wake up and start using positive reinforcement all the time, for everything we say is important to us (education included), and if that day comes, so will positive change.
posted by bearwife at 9:31 AM on July 12, 2010 [9 favorites]


Idle armchair thoughts: Being wrong makes people feel small and out of touch, isolated from the herd. Being right makes one feel connected to society and others, even if it's a small society. It's not so much about being factually right or wrong, but being connected to others.
posted by new brand day at 9:34 AM on July 12, 2010 [5 favorites]


As someone who tries and mostly fails to teach information literacy, this article matches the shape of the crushed remains of my soul almost exactly.

On the other hand, information is something we're still learning about and it's highly likely we're just doin' it wrong.
posted by shinybaum at 9:36 AM on July 12, 2010 [5 favorites]


Nothing in that article is going to change my mind and I don't give a damn.
posted by ChrisHartley at 9:38 AM on July 12, 2010 [13 favorites]


Funnily enough, I've been writing about conspiracy theorists and how they appear constitutionally incapable of recognising a fact, even when it bites them on the arse.
posted by quarsan at 9:49 AM on July 12, 2010


Pretty close to this. Or what was that study I saw on the Blue regarding how the more exploratory toddlers, later in life, were not as likely to be religious, while the anxious toddlers were the other way around?

Fear makes people cling tightly to what they have. Primates all the way.
posted by adipocere at 9:50 AM on July 12, 2010 [6 favorites]


I literally just clicked over from reading this article to find it posted here on mefi.

Reading the article comments is discouraging because people find ways to demonize the ideas in the article as liberal slander.
posted by ropeladder at 9:52 AM on July 12, 2010 [1 favorite]


Facts are lazy and facts are late
Facts all come with points of view
Facts don't do what I want them to
posted by .kobayashi. at 9:52 AM on July 12, 2010 [11 favorites]


I don't find this surprising at all. I've spoken with people who have extreme, conspiracy-ish views, and either they ignore an inconvenient fact or they twist it to make it fit into their preconceived worldview (e.g. "Oh, they must be in on it too!")

Say it with me, people: "Everyone knows reality has a well-known liberal bias."
posted by Bromius at 9:52 AM on July 12, 2010 [3 favorites]


...the more exploratory toddlers, later in life, were not as likely to be religious, while the anxious toddlers were the other way around?

So it isn't enough that kids always try to push their parents' buttons but ALSO 2-2.5 of my kids are psychologically primed to become religious? Great.
posted by DU at 9:53 AM on July 12, 2010


I've said it before and I'll say it again: Democracy simply doesn't work.
posted by Atom Eyes at 9:53 AM on July 12, 2010 [2 favorites]


I've said it before and I'll say it again: Democracy simply doesn't work.

I am (almost) literally on my way out the door to teach the kids some Plato, specifically his criticisms of democracy. Thanks for the link, I will mention it in class. And then put his analysis to a vote. :)
posted by joe lisboa at 9:59 AM on July 12, 2010


I'd like to think that exposing people not to more information/disinformation in the news or in politics or in these studies, but to solid practical training in simple critical thinking skills during elementary and secondary education, would go a long way. I don't think it would change human nature greatly - people do want to get together in belief groups - but it would make a person better at evaluating all forms of argument, and also more okay with discarding a hypothesis in the face of countering evidence. When I was in grade school I was lucky enough to take a yearlong class called "Critical TV Viewing," which I now recognize as a pretty clever way to get kids to work on critical thinking skills (we got to watch TV! In school!). We learned about the logical fallacies, reading critically, writing persuasively, constructing arguments, questioning appearances, etc. Very useful, and something that lots of people clearly do not learn. I would recommend this as part of the reformed civics education program America desperately needs but does not have, and doesn't even really seem to have any interest in having.
posted by Miko at 9:59 AM on July 12, 2010 [20 favorites]


Reading this...

In a series of studies in 2005 and 2006, researchers at the University of Michigan found that when misinformed people, particularly political partisans, were exposed to corrected facts in news stories, they rarely changed their minds. In fact, they often became even more strongly set in their beliefs.

is sort of like looking in the mirror at your teeth really carefully for the first time when you're a crystal meth addict who has gotten clean.

It's a really ugly truth that you always knew was there, and at first seeing it actually makes you feel better because it confirms why things have been slightly painful. But the more you think about it, all you can see is rot.
posted by MCMikeNamara at 10:04 AM on July 12, 2010 [1 favorite]


Fear makes people cling tightly to what they have. Primates all the way.

This. Forget positive reinforcement. The first political movement to proceed on the correct factual basis that humans are animals, and thus part of nature, firstly and mostly, will succeed.

Actually, all politics is predicated on this anyway. We just deny it.
posted by fourcheesemac at 10:07 AM on July 12, 2010 [7 favorites]


"facts are stubborn things" - former President Adams

And so it appears that it is not the facts but people's receptivity to those facts that is the stubborn thing.
posted by nofundy at 10:12 AM on July 12, 2010


It isn't just politics, pretty much any piece of information means exactly what people need it to mean. Even harmless stuff like the colour of a pair of shoes or the lyrics to a song. I'm not into the po-mo but I am pretty surprised we manage to function well enough as a group in spite of the fact we agree on so little when we're asked to think about it.

Possibly we only function because we're rarely asked to think about it.
posted by shinybaum at 10:16 AM on July 12, 2010 [1 favorite]


I was taught the same kind of things Miko was, in grade school in Georgia. I think it made me rather more cynical than I'd have been otherwise. Which may explain why I wonder thus:

This article, and others like it, proceed from an assumption that believing true things is the logical, normal default for the vast majority of intelligent, sane people. Given how much we've all seen about how media, facts, the Soviet propaganda machine, and our own minds can be manipulated, why is trusting information "normal"?

I choose to trust the vast majority of information I read, but that's mainly because 1) I figure it's too much trouble to lie credibly; 2) questioning everything thoroughly would take an infinite amount of time, and I'm busy doing other things; 3) I believe I'm educated enough to "smell" when something is wrong. But this isn't rational, it's convenient.

What engenders the amount of trust that *I* feel of the information coming to me? Why would I want to believe NPR more than FOX? I know they have processes, seem more open to questioning themselves, speak with less affect, etc. -- all things that agree with my education-given preconceptions of what is most likely true -- but, philosophically, that is, boiled down to salts and dust, isn't it a bit of a leap of faith to believe *anything* you hear or read? Where does that faith come from, and why should we trust it?
posted by amtho at 10:18 AM on July 12, 2010 [8 favorites]


The first political movement to proceed on the correct factual basis that humans are animals, and thus part of nature, firstly and mostly, will succeed.

"Politics is best discussed on all fours."
(Robert Anton Wilson)
posted by philip-random at 10:20 AM on July 12, 2010 [3 favorites]


Possibly we only function because we're rarely asked to think about it.

I think this is it. It's like how everyone agrees with a statement like "Congress is so stupid."

Person A understands that to mean they are too conservative. Person B understands it to mean they are too liberal. Person C understands it to mean they are too proactive while Person D understands it to mean they are too reactive. Person E understands "Congress" to mean the institutional rules while Person F understands it to mean most or all of the individual humans. Person G understands it to apply to government in general but Person H means Americans in general. And so forth.

A person of my acquaintance always says he never knows how people understand things that to him are very ambiguous. As a programmer, I'm already on the far end of the general bell curve of ambiguity-avoidance but he's on the far edge of my personal bell curve. It can be very frustrating talking to him about a technical matter because I have to specify every little "obvious" thing. OTOH, when we are done talking we usually know exactly what just happened or what needs to happen.
posted by DU at 10:24 AM on July 12, 2010 [5 favorites]


Stephen Colbert got surprisingly close to the heart of the matter when he coined the word truthiness. If it sounds true to me, then it must be a good solid fact.

"Facts are stupid things."

-- Ronald Reagan
posted by Devils Rancher at 10:27 AM on July 12, 2010 [1 favorite]


It's less that there are facts, and more that there are forces.
posted by uraniumwilly at 10:32 AM on July 12, 2010


Will the study do a 5 year followup? The fact that people immediately reject new information that conflicts with their assumptions isn't really news. It's just part of the survival instinct. New information is not prioritized over existing beliefs until it seems necessary.

Racists, like the late Senator Byrd, eventually realized that they were wrong. The recipient of two Congressional Medals of Honor, Smedley Butler, eventually became an outspoken pacifist. People can still surprise you, if you let them.

So let me caution anyone against letting this article, teetering on top of a single study, convince them that people are unable to progress under their own will. As evidence, just look at democracies versus anything else. Give people the choice, and eventually they arrive at better conclusions.

A more serious problem is that people are not given complete arguments, just sound bites, so they cannot be expected to make rational decisions.
posted by atypicalguy at 10:33 AM on July 12, 2010 [8 favorites]


This article, and others like it, proceed from an assumption that believing true things is the logical, normal default for the vast majority of intelligent, sane people.

Just to be clear, the actual research underlying the article doesn't proceed from that assumption. It's either an expansion or special case of the dominant Receive-Accept-Sample model of public opinion. The refusal of people to accept considerations that run against their predispositions/preconceptions has been well-known for a while.

(It also means that people are often very bad Bayesians)
posted by ROU_Xenophobe at 10:33 AM on July 12, 2010 [1 favorite]
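

To make that parenthetical concrete, here is a minimal sketch of what textbook Bayesian updating would look like, with made-up numbers rather than anything from the studies cited in the article: a correction that is more likely to appear when a claim is false should always lower belief in the claim, which is exactly the pattern the backfire results violate.

# Illustrative sketch only: hypothetical numbers, not data from Nyhan & Reifler.
# A textbook Bayesian updates belief in a claim C after seeing a correction E
# (evidence against C) using Bayes' rule:
#   P(C|E) = P(E|C) * P(C) / (P(E|C) * P(C) + P(E|not C) * P(not C))

def posterior(prior, p_e_given_c, p_e_given_not_c):
    """Belief in claim C after observing evidence E."""
    numerator = p_e_given_c * prior
    return numerator / (numerator + p_e_given_not_c * (1.0 - prior))

prior = 0.90                  # strong prior belief in the claim
p_correction_if_true = 0.20   # a correction is unlikely if the claim is true
p_correction_if_false = 0.80  # a correction is likely if the claim is false

updated = posterior(prior, p_correction_if_true, p_correction_if_false)
print(f"belief before correction: {prior:.2f}")    # 0.90
print(f"belief after correction:  {updated:.2f}")  # ~0.69, i.e. belief should weaken

# The backfire effect is the opposite movement (belief strengthening after the
# correction), which no likelihoods with P(E|C) < P(E|not C) can produce here.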


This part is a little strange:
“The general idea is that it’s absolutely threatening to admit you’re wrong,” says political scientist Brendan Nyhan, the lead researcher on the Michigan study. The phenomenon — known as “backfire” — is “a natural defense mechanism to avoid that cognitive dissonance.”
This isn't what Festinger meant by cognitive dissonance. In the original study, detailed in When Prophecy Fails, the notable effect was an increase in the intensity of belief when disconfirming facts became unavoidable. So this isn't a new effect, though the use of the term "backfire" is new and alters the way cognitive dissonance is understood (or at least the way Festinger understood it when he introduced the term).

So I'm not sure what's new here. It would be better to see the original studies if they are available.

That being said, you can't knock a sucker or smarten up a chump.

In a fraud case I worked on, several thousand victims refused to believe they had been duped and refused to take their money back from the court-appointed receiver because the con man had convinced them all it was a government plot to rob them of their 'profits.' Not one victim requested the return of their money. Ya couldn't knock 'em.

The self-inflicted paranoia shown by most conspiratorialists is part and parcel of their "don't confuse me with the facts" stance. Be afraid, be very afraid.
posted by warbaby at 10:33 AM on July 12, 2010 [5 favorites]


The article doesn't talk much about cases where the information presented as factual is actually false.
posted by XMLicious at 10:46 AM on July 12, 2010


Here's the original paper, When Corrections Fail, and Nyhan's blog post about the Boston article.
posted by warbaby at 10:49 AM on July 12, 2010


In other news, you can't educate pork.

You can't polish a turd, either. But you can roll it in glitter.
posted by MuffinMan at 10:49 AM on July 12, 2010 [2 favorites]


It's possible to read too much into gutchecks and kneejerks. There was a piece in Psychology Today a whiles back--"The Ideological Animal," December 2006, yoiks; according to my notes, it was in the wake of that silly season's "Crazy Conservatives" meme--anyway, lemme grab the two quotes that hit me. First is in keeping with what we've got here:
As a follow-up, Solomon primed one group of subjects to think about death, a state of mind called "mortality salience." A second group was primed to think about 9/11. And a third was induced to think about pain—something unpleasant but non-deadly. When people were in a benign state of mind, they tended to oppose Bush and his policies in Iraq. But after thinking about either death or 9/11, they tended to favor him. Such findings were further corroborated by Cornell sociologist Robert Willer, who found that whenever the color-coded terror alert level was raised, support for Bush increased significantly, not only on domestic security but also in unrelated domains, such as the economy.

University of Arizona psychologist Jeff Greenberg argues that some ideological shifts can be explained by terror management theory (TMT), which holds that heightened fear of death motivates people to defend their world views. TMT predicts that images like the destruction of the World Trade Center should make liberals more liberal and conservatives more conservative. "In the United States, political conservatism does seem to be the preferred ideology when people are feeling insecure," concedes Greenberg. "But in China or another communist country, reminding people of their own mortality would lead them to cling more tightly to communism."
So fear makes you stupid and democracy doesn't work because we cling ever more tightly to beliefs that are demonstrably wrong. But! The punchline comes a little later:
If we are so suggestible that thoughts of death make us uncomfortable defaming the American flag and cause us to sit farther away from foreigners, is there any way we can overcome our easily manipulated fears and become the informed and rational thinkers democracy demands?

To test this, Solomon and his colleagues prompted two groups to think about death and then give opinions about a pro-American author and an anti-American one. As expected, the group that thought about death was more pro-American than the other. But the second time, one group was asked to make gut-level decisions about the two authors, while the other group was asked to consider carefully and be as rational as possible. The results were astonishing. In the rational group, the effects of mortality salience were entirely eliminated. Asking people to be rational was enough to neutralize the effects of reminders of death. Preliminary research shows that reminding people that as human beings, the things we have in common eclipse our differences—what psychologists call a "common humanity prime"—has the same effect.
Ask us to consider carefully. Remind us of the things we have in common.

That's all we have to do.
posted by kipmanley at 10:52 AM on July 12, 2010 [23 favorites]


You can't polish a turd, either. But you can roll it in glitter.

Umm, eee-ewww! Also, I'm a permanent "no" on signups for future decorating projects. No additional factual information will change my mind.
posted by bearwife at 10:55 AM on July 12, 2010 [1 favorite]


A more serious problem is that people are not given complete arguments, just sound bites, so they cannot be expected to make rational decisions.

This can be very important, as well. Of late I've been despairing at listening to news sources that are generally thought of as the better ones as they cover current concerns, and realizing that what they are covering is not problems and issues, but politics. "Will the BP spill impact trust in Obama? Will the Republican Party restructure next year? What will happen in the midterms? Immigration reform - who's for, who's against?" That's all a means - not an end. I would be better served by news discussions that focus on the problem - immigration policy isn't working, for instance - and report on the facts surrounding the problem, the various proposed solutions, their merits and demerits, where similar solutions have worked or failed, etc. We are forgetting the problems we are trying to collectively solve, and focusing on politics, individuals and parties, and ideology instead.

I often feel there's a dearth of information within the news, that the focus is in the wrong place - on the means rather than the ends. I'd be better at tracking what's going on in Congress, and talking to my representatives and all that, if I actually had a better grasp of the problems themselves.
posted by Miko at 10:55 AM on July 12, 2010 [38 favorites]


I get into more than my fair share of political discussions - I like the back-and-forth of challenging other people's preconceptions, as well as having mine challenged. ( I also like stirring the shit-pot, but that's another story.)

There are rules to the game if you really want to change other people's views. First, last, and foremost: You have to be truly interested in where the other person is coming from and emphasize that any disagreement you have with them is NOT PERSONAL. I often say that I respect other people with opinions that differ from mine waaaaay more than I respect people with no opinion at all. Most people will have a real dialogue with you if you can credibly convince them that you are listening.

Second, most people don't take their political opinions past first base; they simply have not been inclined to envision any logical conclusions to what they believe. Here it's easier to get them to (maybe) convince themselves that they may be wrong than it is to force them to listen to your point of view. I find "I understand your point there, but what if ..." to be a very powerful tool with a lot of people.

Maybe it plays to their conceits, maybe it lowers their guard, or maybe it just makes them feel that their viewpoints matter. Plant the seeds, and let them grow 'em.
posted by Benny Andajetz at 11:07 AM on July 12, 2010 [5 favorites]


posted by Miko at 10:55 AM on July 12 [4 favorites]

I wish I could favorite this multiple times.
posted by Mental Wimp at 11:16 AM on July 12, 2010


A Mule: No sorry, Kevin Bacon wasn’t in Footloose.

Guy: What!?, of course he was.

A Mule: No he wasn’t, you lose.

Guy: Of course he was, he was the star.

A Mule: Nope, you’re wrong. Look it up.

Guy: I don’t have to look it up, it’s common knowledge…

A Mule: Nope..

Guy: he was on the cover…

A Mule: Nope…

Guy: of People Magazine

A Mule: Nope..

Guy: when the movie…

A Mule: No…

Guy: Everyone knows...

A Mule: No

Guy: that….

A Mule: No!..

Guy: Kevin Bacon..

A Mule: NO!

Guy: was the star…

A Mule: NO!

Guy: in Footloose..

A Mule: NO!

Guy: It was a huge movie,…

A Mule: NO!

Guy: he was the lead.

A Mule: NO! NO! NO! NO! NO! NO! HeeHaw! HeeHaw! HeeHaw!
posted by Bathtub Bobsled at 11:24 AM on July 12, 2010 [4 favorites]


A more serious problem is that people are not given complete arguments, just sound bites, so they cannot be expected to make rational decisions.

Another vote for this as a very important factor.

We're in the "Information Age", and in many respects there's too much information (and still, many truths are deliberately kept from us). So, sheer volume makes it hard to know if the information you get is complete or accurate.

Also it seems more common these days that arguments are fought, not by presenting and defending ideas, but by attacking and devaluing the opponent, and the source of their 'facts'. Net result is that there's less trust overall, and too high a threshold for new or contrary facts or ideas to push through mistrust to make an impression.

There's also leadership. If our top leaders are perceived as not telling the truth, withholding information, or outright lying, then truth is shown to be an unreachable, impractical ideal, and it's more accepted and expected that everyone is fudging the facts.
posted by Artful Codger at 11:35 AM on July 12, 2010 [1 favorite]


William James, "What Pragmatism Means" (1904):
The observable process which Schiller and Dewey particularly singled out for generalization is the familiar one by which any individual settles into new opinions. The process here is always the same. The individual has a stock of old opinions already, but he meets a new experience that puts them to a strain. Somebody contradicts them; or in a reflective moment he discovers that they contradict each other; or he hears of facts with which they are incompatible; or desires arise in him which they cease to satisfy. The result is an inward trouble to which his mind till then had been a stranger, and from which he seeks to escape by modifying his previous mass of opinions. He saves as much of it as he can, for in this matter of belief we are all extreme conservatives. So he tries to change first this opinion, and then that (for they resist change very variously), until at last some new idea comes up which he can graft upon the ancient stock with a minimum of disturbance of the latter, some idea that mediates between the stock and the new experience and runs them into one another most felicitously and expediently.

... Loyalty to [existing beliefs] is the first principle--in most cases it is the only principle; for by far the most usual way of handling phenomena so novel that they would make for a serious rearrangement of our preconceptions is to ignore them altogether, or to abuse those who bear witness for them.
Atom Eyes: Democracy simply doesn't work.

But this tendency to filter out information--an example of our limited rationality--is universal. There's no guarantee that an autocratic government wouldn't suffer from the same kind of blinders.

I prefer Tocqueville's analysis in Democracy in America, in which he says that democracy is really good at some things, and really bad at others.
We must first understand what is wanted of society and its government. Do you wish to give a certain elevation to the human mind and teach it to regard the things of this world with generous feelings, to inspire men with a scorn of mere temporal advantages, to form and nourish strong convictions and keep alive the spirit of honorable devotedness? Is it your object to refine the habits, embellish the manners, and cultivate the arts, to promote the love of poetry, beauty, and glory? Would you constitute a people fitted to act powerfully upon all other nations, and prepared for those high enterprises which, whatever be their results, will leave a name forever famous in history? If you believe such to be the principal object of society, avoid the government of the democracy, for it would not lead you with certainty to the goal.

But if you hold it expedient to divert the moral and intellectual activity of man to the production of comfort and the promotion of general well-being; if a clear understanding be more profitable to man than genius; if your object is not to stimulate the virtues of heroism, but the habits of peace; if you had rather witness vices than crimes, and are content to meet with fewer noble deeds, provided offenses be diminished in the same proportion; if, instead of living in the midst of a brilliant society, you are contented to have prosperity around you; if, in short, you are of the opinion that the principal object of a government is not to confer the greatest possible power and glory upon the body of the nation, but to ensure the greatest enjoyment and to avoid the most misery to each of the individuals who compose it--if such be your desire, then equalize the conditions of men and establish democratic institutions.
posted by russilwvong at 11:54 AM on July 12, 2010 [10 favorites]


A Mule: NO! NO! NO! NO! NO! NO! HeeHaw! HeeHaw! HeeHaw!

Hey, you've been talking to my friend Walter.
posted by philip-random at 12:00 PM on July 12, 2010


It was Citizen Kane, It was Citizen Kane!

Apart from the points made in the article (which is pretty much an analysis of Stephen Colbert's whole "Truthiness" schtick) what's really depressing is that U.S. media outlets are legally allowed to make stuff up. So, now there's the perfect feedback loop of severely lopsided or outright false news stories that pander to people who are inclined to think with their guts anyway... and when you try and talk facts with those people, the debate begins and ends with "But I heard on Fox News..."
posted by usonian at 12:01 PM on July 12, 2010 [4 favorites]


Between this and humans not having free will I think I'm just a few posts away from donning a labcoat and leading my army of killbots to crush all dissidence.
posted by codacorolla at 12:25 PM on July 12, 2010 [1 favorite]


I prefer Tocqueville's analysis in Democracy in America, in which he says that democracy is really good at some things, and really bad at others.

I doubt Tocqueville would be even that generous if he saw America today, since our nation pursues neither a love of poetry, beauty, and glory nor comfort and well-being for all. In fact, most Americans seem equally offended by both sets of ideals.
posted by vorfeed at 12:40 PM on July 12, 2010 [2 favorites]


Related post.
posted by homunculus at 12:46 PM on July 12, 2010


Nyhan inserted a clear, direct correction after each piece of misinformation, and then measured the study participants to see if the correction took.

In all fairness, that says more about the credibility of the Comic Sans font than it does about the readers.
posted by StickyCarpet at 12:48 PM on July 12, 2010


(that's right, it's a font, not a typeface, and nothing you say can change that)
posted by StickyCarpet at 12:49 PM on July 12, 2010


I was going to come in to tear apart the article for missing the point entirely, but Miko got there first, hats off to you, sir. The only reason the current media culture has not totally collapsed in favour of internet-based journalism is that the education system is failing so completely that a significant portion of citizens do not have basic information literacy skills, and therefore do not understand why political reporting is near-universally terrible.
posted by mek at 1:27 PM on July 12, 2010


From article:
It’s this: Facts don’t necessarily have the power to change our minds. In fact, quite the opposite. In a series of studies in 2005 and 2006, researchers at the University of Michigan found that when misinformed people, particularly political partisans, were exposed to corrected facts in news stories, they rarely changed their minds.

This is part of the human brain's error correction system. As people grow up, they form a worldview based on what they're taught, what they see and generally all the experiences they have. If someone encounters a fact that doesn't fit in with that worldview, they can either challenge their worldview or the fact. Often, for entrenched beliefs, it's the fact that gets challenged.

This is not broken in and of itself. Sometimes one doesn't observe correctly the first time, or hears something wrong. And there are lots of liars out there, spuriously creating false facts, and at times they look real. If you didn't challenge the facts sometimes you'd fall victim to the first faked poll you found. You'd believe everything you see on TV.

The problem comes when you're intellectually incurious enough that you figure that you've had enough experience to form an unassailable worldview, and challenge everything you see that doesn't jibe with it. And groups of like-minded people reinforce each others' beliefs, lending them false certainty. You'd be amazed at the number of idiotic anecdotes about people, places, and countries that are floating out there in the unenlightened sectors of the country; these stories are the quantum foam of our political discourse, carriers of political charge from one person to another. Unfortunately, the energy of political thought is not conserved; it is self-reinforcing.

The greatest danger Fox News presents is that it's a massive validating blanket, insulating its viewers from a world that increasingly doesn't resemble their picture of it. In the long run, this leads to irrelevance and stagnation, and that's the best case. Reality doesn't have much patience for those who deny it. If they're (we're) lucky, these people will eventually find that the world differs so much from their conception of it that their worldview will crack. If they're not, then they'll continue to make decisions based on their increasingly incorrect model, and eventually something truly horrible will happen because of it. Iraq and Afghanistan are only the beginning.

The solution is to show the hidebound people a wider view of the world, not just through directly contradicting facts but by showing them a variety of things that don't directly contradict their worldview. Show them that hey, it really is a bigger world than they think. Folk in the third world aren't really so different than those in the U.S., they just have a different environment. The United States is not a lone beacon in a dark world; you'd be surprised how many people think Europe must be a horrible place to live, and I'm not even talking about Eastern Europe. (After all, isn't public-supported health care everywhere over there? They must be savages.)

Belief in the existence of the Other is at the core of conservatism; recognition of the universality of human experience is the foundation of liberalism. Any real progress to be made at resolving our cultural impasse will have to be done with respect to these bases, and information is definitely on our side. You're not going to change their opinion by just telling them they're wrong; instead, show them the folly of their beliefs by showing them the amazing breadth of human experience. It is a potent weapon, and ultimately, it's the only way.
posted by JHarris at 1:29 PM on July 12, 2010 [8 favorites]


This is why, when presented with new data that conflicts with the old, I latch on to the new data as if it were gospel. I roll it around in my mouth like a fine brandy and I savor it. Then, when I'm ready, I readmit the old data bit by bit to see how it meshes with this fresh new data. I keep what makes sense and dismiss whatever conflicts, no matter how vital. In this way, I maintain a consistent worldview at all times.

Internally consistent, anyway.
posted by Eideteker at 1:53 PM on July 12, 2010 [1 favorite]


Speaking of ignoring facts and the consequences of doing so: Vaccination Rate Lags As an Epidemic Spreads
posted by homunculus at 1:53 PM on July 12, 2010


I don't know the exact moment that it happened, but I know the event that convinced me I was mostly grown up: I stopped needing to be right about everything, and embraced the idea that it's ok to be wrong.

Hell, I like being wrong. It shows that what I think I know doesn't count for shit, and that helps me to both keep my ego in check, and makes me check the facts before I make some grand assertion.

And since I deeply enjoy making grand assertions, I'm constantly fact checking to keep just slightly ahead of what's coming out of my mouth.
posted by quin at 2:14 PM on July 12, 2010 [2 favorites]


Dammit, usonian beat me to the Kids in the Hall sketch that immediately came to mind.
posted by joe lisboa at 2:24 PM on July 12, 2010


Belief in the existence of the Other is at the core of conservatism; recognition of the universality of human experience is the foundation of liberalism. Any real progress to be made at resolving our cultural impasse will have to be done with respect to these bases, and information is definitely on our side. You're not going to change their opinion by just telling them they're wrong; instead, show them the folly of their beliefs by showing them the amazing breadth of human experience. It is a potent weapon, and ultimately, it's the only way.

I think any reasonable attempt to recognize the universality of human experience must also recognize that some parts of it aren't universal. Our "cultural impasse" stems directly from the amazing breadth of human experience -- there are cultures and peoples who do not and will not do things our way, and there are cultures and peoples (us, that is) who do not and will not do things their way. Trying to get rid of the Other by teaching people about other cultures lasts about as long as it takes for them to run into something which is simply not compatible with the culture of Us, and that goes for the liberal Us as well as the conservative Us.

If "reality doesn't have much patience for those who deny it", then the first thing we need to accept is that liberalism itself is a product of a certain culture and a certain environment, and that its assumptions can be just as falsely-validating and self-insulating as conservative assumptions are.

In short: stating that "belief in the existence of the Other is at the core of conservatism" and then following it up with things like "information is definitely on our side" and "the folly of their beliefs" would be funny if it weren't so damn common.
posted by vorfeed at 2:36 PM on July 12, 2010 [3 favorites]


vorfeed: you are wrong, and I'll prove it. See? Now you're trapped. I've presented you with evidence and you denying it will only prove the original article's point!
Well, okay. It doesn't prove anything, but it's interesting in the context. I certainly agree that liberalism— or any self-identified cultural construct— can delusionally restrict itself with self-validation.

I think Miko's point is probably the most relevant; critical thinking is what separates us from computers. We can pack in data (or not), but being able to do something with it is what's important. I found this part interesting:
A 2006 study by Charles Taber and Milton Lodge at Stony Brook University showed that politically sophisticated thinkers were even less open to new information than less sophisticated types. These people may be factually right about 90 percent of things, but their confidence makes it nearly impossible to correct the 10 percent on which they’re totally wrong. Taber and Lodge found this alarming, because engaged, sophisticated thinkers are “the very folks on whom democratic theory relies most heavily.”

Perhaps that's a product of discrepancies between education and critical thinking— or maybe it's just human nature.
posted by Red Loop at 3:01 PM on July 12, 2010


I am used to being wrong. I am wrong every day of my life. It's not because I'm better than the people in the article; it's because I'm a programmer. Here's a link to another programmer who says the same thing: "I no longer equate thinking I'm right about something with actually being right about it."

Many people have said kids should learn how to program. They usually bring up how this helps them think logically. They don't point out that it also helps them fail over and over and over. The failure is unambiguous, and you can't argue with it. Your program doesn't compile. And that's clearly because you were wrong about something. No matter how sure you were that you were right, you were wrong. Day after day after day.

But the thing is, the goals are usually simple and achievable, so you get it right in the end. So it's not soul-crushing failure.

And when I fail, it's not like, "Ho-hum. I failed again. Whatever." Often, my whole world turns upside-down, at least briefly. "I KNOW I did that right! There MUST be something wrong with the compiler!" But, no, I'm wrong again, today. Just like I was yesterday. Just like I will be tomorrow, even though I will think I'm right.

And it never lets up, because as you master one programming problem, the next one is harder.

I do a lot of teaching too, and the same thing is true in that profession. But it's too easy to justify: the student is too stupid, he's not listening, how can I be expected to teach with these materials? Etc. You can't justify away a program that doesn't compile.

I don't think kids should necessarily learn to program, but I do think that it would be great if they experienced some discipline like this -- for several years.

When I discuss my job with non-programmers, it's clear that most of them don't experience this. They don't experience it at work, and they don't experience it in their personal lives. (Personal-life failures tend to be catastrophic, and that just leads to denial and justification.) What things do you fail at every day? What daily things are you sure you're right about -- but you're not? It's just not a common experience. Too bad.
posted by grumblebee at 5:25 PM on July 12, 2010 [17 favorites]


Several commenters have mentioned that emotions play a huge role in our perception of political facts. But I think something we (and the study) are overlooking is the fact that a large portion of our political ideas and conceptions are not "facts" in any verifiable sense. Partly because of things like concision, partly because we are social creatures with emotions, partly because politics and policy are inexact social sciences, many of the political influences on voters have nothing to do with "fact" and are instead based on generalizations and oversimplifications.

This comes to the forefront when you read the comments section of any mainstream media story, or any site that draws commenters from a broad range of ideological planes. Nobody is citing statistics or historical facts; they are arguing their points from worldviews based on stereotypes and shaky conceptions of hugely varied and complex groups or systems.

Importantly (and to the detriment of many a comment thread), the worldviews rarely overlap enough to permit coherent discussion, exchange of ideas, and understanding. Instead you get people saying liberals are conspiring to set up an all-powerful government that siphons off wealth from hard workers and gives it to lazy black people, and you get people saying conservatives are stupid.

Ok so maybe these claims are objectively verifiable. My point, though, is that when actual political discussion happens, facts and statistics, no matter how basic, factor in very little. They are overwhelmed by the frames people fit the discussion into. Which I guess is basically what the article and many comments here are getting at.
posted by ropeladder at 10:01 PM on July 12, 2010


ropeladder, you make a good point, but the people who comment on news stories ARE NOT a good sample set for the general population. They range from the genuine, to political 'hobbyists', to crackpots, to strategic shit-disturbers. Google astroturfing.

Also, most news stories are about something concrete or specific - an occurrence, an issue, a policy, a person. So the comments are potshots or reactions to the story subject, coming from different ideological angles.

Skimming the comments to a news story can give you a precis of the various positions on an issue, but it's dangerous to try to infer a consensus from them.
posted by Artful Codger at 5:47 AM on July 13, 2010


What really interested me in this article is not that the unwashed, mullet-wearing mob just simply couldn't accept the rule of clear, dispassionate and pure reason, but that there is no such thing as the rule of clear, dispassionate and pure reason. The implications are pretty big - even reason's most dedicated partisans can't escape their own cognitive filters. Facts sometimes are troublesome things for them, too.

It seems to me that, since 9/11, there's been a revival in some intellectual quarters of some old, pre-Auschwitz and Hiroshima ideas of the supremacy of Reason and the Enlightenment, and even a revival of a variant of Western civilization's mission to bring light to the darkness of other cultures, minus the Bible-thumping missionaries, of course. In a lot of ways, one god is being swapped out for another - Yahweh for the Great God Progress, who's just as jealous as Yahweh in His worst moods. Just as one example, I've been struck by the depth of animosity from some towards the very notion of a philosophy of science or "science studies" - almost like the response of some Christian fundamentalists to, for example, historical-critical methods of Biblical scholarship. But what if Reason isn't - can't be - pure?

Marcelo Gleiser, the Brazilian physicist, has lately been pushing the idea that there is no Theory of Everything, and that the very quest for it is a shadow of monotheism's belief in the One. I'm not willing to go that far, but it does seem interesting that the quest for the One True Way keeps popping up in completely secular contexts. It's fun to wonder what studies that hint that our brains may not be able to see a One True Way in an objective universe - studies like this one - may mean.
posted by jhandey at 7:59 AM on July 13, 2010 [1 favorite]


It's fun to wonder what studies that hint that our brains may not be able to see a One True Way in an objective universe - studies like this one - may mean.

"Reality itself is a thinking thing, and capable of thinking about itself." - Parmenides
posted by Mental Wimp at 9:54 AM on July 13, 2010


Er, "Reality itself is a thinking thing, and the object of its own thought." - Parmenides
posted by Mental Wimp at 9:55 AM on July 13, 2010


there is no such thing as the rule of clear, dispassionate and pure reason.

I am not arguing with this, except to say that I'm not sure exactly what it means. I can think of several possibilities:

1. Reason is a really useful tool, but unfortunately no one is able to use it consistently.

2. Reason is a really useful tool, but unfortunately MOST people are unable to use it consistently.

3. Reason isn't a useful tool for making predictions and testing/making truth claims about an apparent external (to our minds) universe.

4. Reason may be a useful tool for interfacing with a fictional universe, but it doesn't tell us that this universe actually exists or exists in the form we think it does.

5. Reason isn't a useful tool for anything.

6. That thing we call reason (roughly, a bunch of thought-systems invented by Aristotle and others) doesn't actually exist. E.g. you think that someone created a system called "logic," but he didn't. You imagined it.

Each of these has important but very different ramifications.
posted by grumblebee at 12:05 PM on July 13, 2010


The first political movement to proceed on the correct factual basis that humans are animals, and thus part of nature, firstly and mostly, will succeed.

But dear Niccolo never did succeed in getting the Medici to reunite the land of Italy.
posted by Diablevert at 7:43 PM on July 13, 2010



