Friends, Romans, countrymen, lend me your ears....
April 18, 2011 3:50 PM

The Science of Why We Don't Believe In Science. Or How To Win An Argument: Try Not To Rely on Facts. Mother Jones investigates what recent research can tell us about how we reason (or don't).
posted by Diablevert (45 comments total) 75 users marked this as a favorite
 
Or, why you're wasting your time with your [step-father|facebook friends|coworkers]
posted by DigDoug at 4:03 PM on April 18, 2011 [2 favorites]


Nice post!
posted by Leisure_Muffin at 4:05 PM on April 18, 2011


Now I question everything I believe in, but I'm still fairly certain you are wrong.
posted by cjorgensen at 4:10 PM on April 18, 2011 [3 favorites]


I thought this article was bullshit.

HAMBURGER
posted by rtha at 4:13 PM on April 18, 2011


Hitting idiots in the head with a piece of wood, dropping them from a fifth storey window or setting them on fire tends to convince them about the efficacy of physics, biology and chemistry, if only briefly.
posted by obiwanwasabi at 4:25 PM on April 18, 2011 [6 favorites]


Evolution required us to react very quickly to stimuli in our environment. It's a "basic human survival skill," explains political scientist Arthur Lupia of the University of Michigan. We push threatening information away; we pull friendly information close. We apply fight-or-flight reflexes not only to predators, but to data itself.

I have trouble with the idea that evolution played a big part in this. After all, the person who embraced the unappealing thought "I bet there is a lion behind that rock" survived while the person who pushed the thought away didn't.

Otherwise, it seemed pretty direct: believers gonna believe.
posted by GenjiandProust at 4:25 PM on April 18, 2011


After all, the person who embraced the unappealing thought "I bet there is a lion behind that rock" survived while the person who pushed the thought away didn't.

The person who embraced the thought "I bet there is a lion behind EVERY rock" survived too, despite most rocks not having lions behind them.
posted by Justinian at 4:33 PM on April 18, 2011 [9 favorites]


GenjiandProust, I agree that particular analogy is a flawed one. I suppose it's the "push threatening information away" line which bothers me. While fleeing a physical threat makes sense, the informational equivalent, completely ignoring the information, makes less sense and isn't common. Instead, you try to wedge it into your worldview somehow, if only by explaining it away as a lie/trick/mistake/misunderstanding.
posted by luftmensch at 4:42 PM on April 18, 2011 [1 favorite]


One of my history profs used to say that minds changed one grave at a time.
posted by benzenedream at 4:49 PM on April 18, 2011 [29 favorites]


Regarding their usage of the neologism "narrowcast", we could just use the word "cultivate". It comes from the same domain as the broadcast metaphor, and carries some nice added meanings as well.
posted by idiopath at 4:59 PM on April 18, 2011 [1 favorite]


The person who embraced the thought "I bet there is a lion behind EVERY rock" survived too, despite most rocks not having lions behind them.

Well, not if that kept him/her from eating/sleeping/the other business of life. The person who was able to evaluate which rocks were likely to have lions behind them ended up with the best options. Or so I would imagine. There is a lot I do not understand about evolutionary psychology....
posted by GenjiandProust at 5:01 PM on April 18, 2011
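The lion exchange above is, at bottom, an argument about asymmetric error costs: a missed lion is fatal, while a false alarm only wastes a little foraging time. Here is a minimal Python sketch of that intuition; the probabilities, costs, and the always-assume-a-lion strategy are illustrative assumptions of the editor's, not anything from the Mother Jones piece or from the commenters.

    # Toy "error management" sketch: when a miss costs far more than a false
    # alarm, an over-cautious rule can beat an accurate-on-average one.
    # All numbers here are illustrative assumptions.
    import random

    random.seed(0)

    P_LION = 0.05           # assumed chance a given rock hides a lion
    COST_FALSE_ALARM = 1    # assumed small cost of needless caution
    COST_MISS = 1000        # assumed huge cost of walking into a lion

    def expected_cost(p_assume_lion, trials=100_000):
        """Average cost per rock for an agent who assumes 'lion' with
        probability p_assume_lion, regardless of the evidence."""
        total = 0
        for _ in range(trials):
            lion = random.random() < P_LION
            assume_lion = random.random() < p_assume_lion
            if lion and not assume_lion:
                total += COST_MISS          # walked into a real lion
            elif assume_lion and not lion:
                total += COST_FALSE_ALARM   # needless detour around an empty rock
        return total / trials

    for p in (0.0, 0.5, 1.0):
        print(f"assume-a-lion rate {p:.1f}: expected cost {expected_cost(p):.2f}")

With these made-up numbers the always-paranoid strategy is cheapest, which is Justinian's point; make misses cheaper or make caution expensive enough (GenjiandProust's eating/sleeping objection) and the ranking flips.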


Read A Darwinian Left: Politics, Evolution and Cooperation by Peter Singer.
Yes, he's the animal rights philosopher at Princeton but this ain't pushing vegetarianism.
posted by jeffburdges at 5:06 PM on April 18, 2011 [1 favorite]


You lead with the values—so as to give the facts a fighting chance.

Wise advice.
posted by Go Banana at 5:12 PM on April 18, 2011 [3 favorites]


Polemics from people who want desperately to be right, which will validate their views of themselves and against their chosen enemies. I am tired of anyone who wants to describe people as though the entire world fits neatly into their chosen opposites. Or as the person said above... I think this article is bullshit.
posted by midnightscout at 5:13 PM on April 18, 2011


"Or, why you're wasting your time with your [step-father|facebook friends|coworkers]"

Too damn true. I got a Facebook friend with two unvaccinated little girls who just moved straight into Oregon's whooping cough zone. I'm trying to steer her round with little grains of doubt here and there.
posted by ocschwar at 5:36 PM on April 18, 2011


I am tired of anyone who wants to describe people as though the entire world fits neatly into their chosen opposites.

Hey, me too.

Polemics from people who want desperately to be right, which will validate their views of themselves and against their chosen enemies.

You are tiresome.
posted by LogicalDash at 5:51 PM on April 18, 2011 [2 favorites]


I was really feeling this article until I got to the sentence that began, "Evolution required us to react...."

And then I realized how, in more ways than one, the author was putting the cart before the horse. Evolution doesn't "require" anything of us; amusingly enough, one of the biggest reasons cited by evolution-deniers hinges on a misunderstanding of evolution that positions the process as a force which causes plant or animal life to adapt to environmental conditions. They say, "See, if evolution were real, I'd be able to run as fast as a car because evolution would know that's better." But evolution happens only because some random changes happen to work out a little better than other random changes - it doesn't mean those changes are "good" or "better", just that the result was that death happened after procreation.

Similarly, I suspect whether we push away or pull in information has more to do with the human propensity for pattern seeking in general, and less to do with actively seeking out that which we, in retrospect, classify as "positive" or "negative". After all, "reasoning" is as much pattern-finding as "rationalizing", it's just that, culturally speaking, we've decided reason is synonymous with truth.
posted by lesli212 at 5:56 PM on April 18, 2011 [2 favorites]


Well, the narrow version of "broadcast" would be something like "inseminate", which might be more specific than we'd like, and carry some other baggage. On the other hand, I haven't rtfa.
posted by sneebler at 5:56 PM on April 18, 2011 [2 favorites]


MetaFilter: We may think we're being scientists, but we're actually being lawyers.
posted by rdone at 5:57 PM on April 18, 2011 [9 favorites]


sneebler: "Well, the narrow version of "broadcast" would be something like "inseminate""

And that ties in nicely with a common derisive term for "narrowcast" media: circlejerk. It all works out very nicely, doesn't it: instead of seeds being spread widely from a central location, we get "seed" being spread mutually into a central location.
posted by idiopath at 6:09 PM on April 18, 2011 [1 favorite]


I don't believe this sci-oh am I too late? I'll come in again.
posted by rusty at 7:23 PM on April 18, 2011 [1 favorite]


As a sidebar to this story, it does strike me as darkly amusing that in some ways science is only just now coming round to certain implacable truths about human nature that art has been banging the drum about ever since Gilgamesh went on a bender. The funeral speeches of Brutus and Marc Antony in Shakespeare are the paradigmatic examples of the effectiveness of an appeal to reason versus an appeal to emotion. Though of course it's true, too, that the Roman mob was biased in Antony's favor given Caesar's glamour and popularity, and thus the effects discussed in the Mother Jones piece --- being critical about arguments we dislike and finding reasons to back up arguments which please us --- would kick in as well.
posted by Diablevert at 7:31 PM on April 18, 2011


Science is hardly ever "on time" for any parties. It's a set of experimental methods optimized for reliability, not speed; if you want speed, you go into engineering, where you'll learn lots of tricks that seem to work although we have only the vaguest guesses as to why.

In the case of Shakespeare, assume I am talking about social engineering.
posted by LogicalDash at 7:35 PM on April 18, 2011


In certain conservative communities, explains Yale's Kahan, "People who say, 'I think there's something to climate change,' that's going to mark them out as a certain kind of person, and their life is going to go less well."

Maybe I'm crazy, but I tend to pay a little more attention to the arguments of sheeplers for exactly this reason. People who demonstrate that their ideas matter more to them than social approval may be wrong frequently, but at least they're working in the absence of one powerful bias.
posted by nathan v at 8:02 PM on April 18, 2011 [1 favorite]


Sheeplers have plenty of social support. Universal paranoia is the building block of many conservative communities; people who use the word "sheeple" are just a bit more up-front than usual.
posted by LogicalDash at 8:20 PM on April 18, 2011


"All of you people that I am engaging with and risking social sanction from have your heads in the sand!"

beats

"Everyone except for us is willfully blind!"

I hear the latter with a lot more frequency than the former. Not a surprise, I guess.
posted by nathan v at 8:28 PM on April 18, 2011


So, people are error-prone and think emotionally. That's a new insight? "A key insight of modern neuroscience (PDF): Reasoning is actually suffused with emotion". Perhaps some scientists between 1770-1790 and 1930-1965 might be surprised by this, but even then, I doubt it. People are reluctant to change their mind, and come up with counter-arguments, when confronted with a fragment of new "information" presented by a purported expert. And that's a terrible thing?

I fail to see what is either new or especially worrisome about any of this. Obviously there are people (like cult members) who have incredibly wrong ideas and are very resistant to efforts to change them. We knew this. But those people are not representative, and it is silly to think that everyone works this way. We're all error-prone and emotional, but we knew that. Yet many people still manage to get righter when presented with the right info, enough time, and good people to talk it through with. As long as there are pathways to getting righter (eg, education, science, critical thinking, debate), why must it be presented as an insurmountable problem to overcome emotion and bias? And if it's not a super-big problem we all share, what is new in all of this? What teacher on earth needs a dozen experiments showing that people are often reluctant to learn and rationalize their existing beliefs? And what teacher doesn't also know that most people can overcome this if given the right environment?

There seems to be this implicit suggestion in all of this that conservatives and liberals, the educated and the uneducated, the critical thinker and the cult believer, are all just sides of the same emotional, irrational coin. But they aren't. As the article itself occasionally glimpses, there are better and worse ways of dealing with error-prone brains -- ways we've all been working on for centuries if not millennia.
posted by chortly at 9:04 PM on April 18, 2011 [3 favorites]


You can't reason someone out of a decision or opinion they didn't reason themselves into.
posted by SirOmega at 9:19 PM on April 18, 2011 [5 favorites]


Obviously there are people (like cult members) who have incredibly wrong ideas and are very resistant to efforts to change them. We knew this. But those people are not representative, and it is silly to think that everyone works this way.

Oh, I don't think that's true at all. Take cult members, for instance --- most of the research I'm aware of suggests that while there are common conditions that make people susceptible to being sucked into cults, there aren't necessarily common traits among people who join cults. You can't test for gullibility and mark out all the people who fall below a certain threshold as cult-susceptible. Instead what you can do is say there are certain moments in life when people are much more likely to be vulnerable to joining a cult --- basically, moments when their existing social support network has been taken away from them (just went to college, moved across the country, lost their job) and they're lonely and really happy to hook up with some people who care about them.

Besides, most of what the new research is suggesting is that we're not, as individuals, very good at overcoming our own biases at all --- we feel a lot quicker than we think, and the thinking we do tends to be shoved along in the direction our feelings lead us. One might argue that science itself is a work-around to confront this problem and confer a group immunity to bad ideas which we as individuals are not capable of resisting.

But one sees these tendencies even within science, I'd say. Half the academy spent decades chasing after proof of the existence of the aether, even after the results of Michelson-Morley suggested they were barking up the wrong tree. Einstein spent the late period of his career trying in part to refute quantum mechanics; Newton tried to turn lead into gold. There was a fellow referenced in the This American Life episode "81 Words" who was a revanchist on the psychiatric community's stance vis-a-vis homosexuality, who felt that homosexuality was a disease of the mind, often caused by bad parenting, and who up until his death in the 1990s offered people therapy to help them not be gay. He had a gay son. Once we are certain of something, if we care about it at all, it is very difficult to not be certain.
posted by Diablevert at 9:35 PM on April 18, 2011 [5 favorites]


"Believing" in science would be missing the whole point ... to make it so noone ever -has- to "believe" again.
posted by Twang at 9:40 PM on April 18, 2011


I believe we have the opportunity to get delightfully meta in this conversation :)

But those people are not representative, and it is silly to think that everyone works this way. We're all error-prone and emotional, but we knew that. Yet many people still manage to get righter when presented with the right info, enough time, and good people to talk it through with.

Let me suggest a different way of describing the same thing that you're describing:

Many (I'd say most) people's beliefs become closer to those of their peers, given enough time, and enough advocacy by their peers.

Because let's face it: people don't only get righter; sometimes they get wronger. (And our determination of whether they're getting righter or wronger is by no means shared by some kind of consensus that makes it clear which is which.)

There's some evidence for what you're talking about, for simple cases. Minority advocacy does (mostly) undo the effects of conformity on simple, clear tasks (show somebody a line clearly shorter than another line and put 'em around a bunch of actors saying, "It's not shorter!" and you'll see a conformity effect; but add another, single actor saying, "You guys are all crazy! Look!" and you can mostly undo that effect.)

But most of the things we're interested in looking at are not simple cases. Climate change? This is a perfect chance for exactly the kind of confirmation bias that the article describes; here, if you're curious, is a link to the study referenced:

Biased Assimilation and Attitude Polarization: yadda yadda yadda

This kind of confirmation bias protects people from the truth; it allows them to discount evidence that runs counter to what they already believe, and favor evidence that supports their belief. Now, in the absence of any evidence for a claim, we can expect most people to fall victim to conformity eventually, and abandon the "wrong" viewpoint for the "right" viewpoint. But if there's a little bit of evidence for the wrong viewpoint, confirmation bias works by letting the person reject all counter-evidence.

What does it feel like to be biased in this way? It's very easy to empathize with. For me, it might be like reading a homeopathy journal: "It doesn't matter how many bad studies you have when they're all bad!" Is that confirmation bias? I'm not sure-- how could I ever tell? Maybe by seeing if my ideas are crazy? "Hey mainstream medicine, what do you think about homeopathy?" Conformity bias anyone?

So is it hopeless? I'm not really sure. I'm still working under the assumption that I can actually figure stuff out, even if nobody else can :)
posted by nathan v at 10:12 PM on April 18, 2011 [2 favorites]
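The biased-assimilation mechanism nathan v describes above (and the "Biased Assimilation and Attitude Polarization" study linked in that comment) can be caricatured in a few lines of code: two agents see the same perfectly mixed evidence, but each discounts the studies that cut against their current leaning. The update rule and every number below are illustrative assumptions, a sketch of the mechanism rather than the study's actual model.

    # Toy sketch of biased assimilation: two agents see identical, perfectly
    # mixed evidence, but each treats studies that cut against their current
    # leaning as weaker ("that one looks flawed"). All numbers are made up.

    def update(belief, evidence_supports, strength=2.0, discount=0.25):
        """Odds-style update in which evidence against the agent's current
        leaning has its likelihood ratio shrunk toward 1 before updating."""
        leaning_yes = belief >= 0.5
        confirms = (evidence_supports == leaning_yes)
        lr = strength if evidence_supports else 1.0 / strength
        if not confirms:
            lr = lr ** discount                # scrutinize and discount it
        odds = belief / (1.0 - belief) * lr
        return odds / (1.0 + odds)

    evidence = [True, False] * 20              # equal pro and con studies
    believer, skeptic = 0.6, 0.4               # slightly different priors

    for supports in evidence:
        believer = update(believer, supports)
        skeptic = update(skeptic, supports)

    print(f"believer ends at {believer:.2f}, skeptic ends at {skeptic:.2f}")

With discount=1.0 (no biased scrutiny) the pro and con studies cancel and both agents end roughly where they started; with discounting, the same data drives them toward opposite poles, which is the attitude-polarization pattern the thread is arguing over.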


Interestingly, this article mostly confirmed in my mind a bunch of things which I already believed. So I agreed with it and liked it.

As for the evolution thing, it was brushed over and poorly described, but I can make a stab at better explaining what Mother Jones was probably trying to get at.

The process of evolution favors pattern recognition both for reproducing and for survival. Once we reached the hominid part of our journey, this trait had graduated into system-creation, which is extremely helpful to us but favors the rejection of anomalous data.

Moreover (and mind you that I am just guessing about all of this) it would seem likely that particularly the reproductive aspect of evolution would favor those who claim certainty in their opinions and beliefs over those who suffer potentially lower social ranking for presenting themselves as uncertain.

Anyway, what I'd really like to see studies on is not confirmation bias or debunking bias, but what I can only describe as "Actually, would you believe..." bias. This is that trait of people (and I am definitely guilty of it myself) to hear or read a counterintuitive "fact" about something they thought they knew but had never given much thought or study to, and then accept that contrary fact as truth, refusing to disbelieve it later even if presented with evidence which shows that their initial assumption was right all along.

Call it the "Dan Brown" effect.
posted by Navelgazer at 10:53 PM on April 18, 2011


Call it the "Dan Brown" effect.

Cursory research doesn't reveal much, although I know that I've read about it before.

The Von Restorff effect says that the counter-intuitive is more memorable.

Availability heuristic errors are maybe what I read about. Once something is more easily accessed, maybe because it stuck in memory for being counter-intuitive, you use it more frequently than less easily accessed information to assess things like probability. So you think homicide is more deadly than stomach cancer, for instance.

Once you have a belief like this, confirmation bias enters play.

I once heard somebody tell me that warm water freezes more quickly than cold water. This stuck in my head, but, being as I was a little smart-alec punk prick and thought I was smarter than everyone, I never really believed it. Finally, I ran into a description (in Kuhn's famous Scientific Revolutions whatever) of documentation by Bacon to this effect, with nobody ever able to replicate the experiment with his results. And that's why I'm still a little smart-alec punk prick :)

I think I'm going to try to email Charles Lord, author of the confirmation bias/death penalty study. I wonder if it's ever been critiqued on grounds of unfalsifiability.
posted by nathan v at 11:13 PM on April 18, 2011 [1 favorite]


Take cult members, for instance --- most of the research I'm aware of suggests that while there are common conditions that make people susceptible to being sucked into cults, there aren't necessarily common traits among people who join cults...

I don't see how that contradicts what I was saying. There's this tendency to think in terms of innate traits, and interpret others as thinking in such terms, which I wasn't doing. Cult members are non-representative not because they are innately gullible people, but because they are people who, just as you say, happen to have fallen into a world-view that makes them very resistant to changing their mind. That's almost what the definition of a cult is, and is what I'm talking about: not an innate tendency, but a largely learned set of beliefs and behaviors causing them to resist getting righter. Of course, there may also be innate tendencies influencing this, and of course all sort of events beforehand influence it too, but the point is, the situation of being stuck in a cult is not at all like being an average person.

Besides, most of what the new research is suggesting is that we're not, as individuals, very good at overcoming our own biases at all --- we feel a lot quicker than we think, and the thinking we do tends to be shoved along in the direction our feelings lead us...

I still fail to see how this is "new" research. Who thought people were unbiased? Again, ask any parent or teacher or anyone who has argued with anyone -- people's minds are hard to change. We knew that. And we knew people are emotional. The new thing here is the essentialism that suggests that science can't overcome it because, look, scientists are often wrong and resistant to (correct) new theories for decades. But everyone besides utopian SF writers of the 50s knows that science has such problems. Yet it still has improved. Similarly, students are biased and emotional and stubborn, but over 16 years of schooling they get a lot righter, not just in facts but in the meta-process. I don't think this is controversial either. So what is this "new" research saying? That people don't learn easily or quickly? We knew that. The excited implication seems rather that we are hopelessly biased. Which is silly. Everyone will always be mostly wrong, but people can, in small but systematic ways, get less so.

I believe we have the opportunity to get delightfully meta in this conversation

The implication that I am enacting the very biases I argue against is fine -- I don't mind -- but it's not new either. For instance, it is the foundational move of Freudianism, which this all reminds me so much of. You act out of unconscious desire, and if you argue otherwise, that's just your unconscious misleading you yet again...

Because let's face it: people don't only get righter; sometimes they get wronger.

I'm not sure how you read me as suggesting people get only righter. My point is just that getting righter is not just the upward half of a random walk: certain behaviors and practices make getting righter slightly more common than getting wronger.

And our determination of whether they're getting righter or wronger is by no means shared by some kind of consensus that makes it clear which is which.

I don't know which consensus you mean, but this is wrong for many things. For instance, we have a consensus about what counts as correct addition, and that children start out with many wrong ideas about addition, and get righter about addition as they learn more. Many domains are less clear, but not all of them.

As for the confirmation bias paper and the climate issue, two points: a) Sure, people are biased in their assimilation of new data. But again: 1) we knew that, and 2) that doesn't mean there aren't systematic ways to get righter. And b) I'm rather sympathetic to the students in these sorts of studies: for instance, I'm constantly being shown snippets of "information" by right-leaning economists who expect me to upend my entire belief system because they claim, say, to have shown that there's a .7 correlation between unions and unemployment in states. Existing opinions ought to be resistant to easy change, such as when a guy in a white jacket gives you a piece of paper saying that some dudes showed that murder rates were higher in 8 of 10 states with capital punishment. Those kids were right to resist that, though they were wrong to unquestioningly embrace the data that supported their existing opinions. On the other hand, in terms of critical thinking, wrestling with contradictory info is probably more important than questioning confirmatory info. But in any case, even if I agree these studies demonstrate bias, that doesn't contradict (a).

What does it feel like to be biased in this way?

What a strange question. I presume it feels exactly like I feel every day as I work to assimilate and understand new information and shape my opinion based on it. What else would it feel like?
posted by chortly at 11:59 PM on April 18, 2011


You lead with the values—so as to give the facts a fighting chance.

Go Banana: Wise advice.

Yep, Made To Stick: Why Some Ideas Survive and Others Die by Chip and Dan Heath discusses similar ideas, in a chapter on the role emotion plays in persuasion. That sentence also brings this comment to mind:

"From the political right, we’ve seen quite a few morally driven initiatives, like abstinence-only education, that thrive despite their failure and their advocates’ disregard for facts. I don’t propose that progressives who are concerned about social justice, equity, and human rights abandon their attention to metrics or ape the right’s tendencies, but I think we have something to learn from them" -- namely, incorporating moral rightness back into our arguments instead of ceding "morality" to religious zealots.

This appeals to me. But then I remember that arguing against torture for moral reasons hasn't done very well against xenophobic tribalism.
posted by cybercoitus interruptus at 12:05 AM on April 19, 2011


The implication that I am enacting the very biases I argue against is fine -- I don't mind -- but it's not new either.

Yeah, that's not actually what I'm getting at-- because what's important to me is not just the idea that you're enacting the very biases that you're arguing against, but that I'm enacting the very biases I'm arguing for. And you're absolutely right that it bears a lot of resemblance to Freudianism, which is why unfalsifiability is such an important part of this conversation, but it's not some kind of cut-and-dried situation with a clear answer-- it's sort of paradoxical. That's why it's interesting.

I'm not sure how you read me as suggesting people get only righter. My point is just that getting righter is not just the upward half of a random walk: certain behaviors and practices make getting righter slightly more common than getting wronger.

Well, with complicated stuff, where there isn't consensus, we kind of have to abandon our closely held ideas of correct or incorrect, at least, if we imagine ourselves vulnerable to cognitive biases. And those biases are well documented by scientific research, which kind of relies on us, you know, being able to be free of bias. (When there is consensus, conformity stuff comes into play pretty heavily.)

So I think there's a really big difference in how we treat contentious stuff and how we treat non-contentious stuff. Talking about addition tables is fine, but it's not the same thing as talking about climate change denial.

If we want to limit the conversation to addition tables, I would argue that there's more than one pressure pushing us to the right answer (and I'm not going to bother putting it in quotes, because I happen to think it's right too). In history, we've got selective pressure. We have the failure of people who are poor at math to survive and prosper. But in the frame of current kids, that's not what's driving people to be good at math. We can explain the success of math education very easily in terms of social sanction and a natural urge to conformity. 2+2=5? Fine, but you get an F. Your parents judge you. Your peers think you're stupid. You'll figure it out quickly based on that kind of feedback.

Conformity is applicable if we talk about more modern, less consensual issues like climate change, but selective pressure isn't very applicable. By the time selective pressure is exerted, it's really too late. (Naturally selective, not sexually selective, which is a lot more like social sanction.)

Existing opinions ought to be resistant to easy change, such as when a guy in a white jacket gives you a piece of paper saying that some dudes showed that murder rates were higher in 8 of 10 states with capital punishment. Those kids were right to resist that, though they were wrong to unquestioningly embrace the data that supported their existing opinions.

All studies are guys in white jackets. They rely on that same kind of authority. I don't know that I need to say that, but just in case....

Yeah, it was a mistake to interpret the exact same data differently depending on whether it agreed with one's pre-existing beliefs. But to treat these two situations (agrees; doesn't agree) independently is to ignore the fact that all of our important decisions exist only in the face of people telling us to lean both ways; ignores that we have to add conflicting info to reach a decision. Experience delusive, judgment difficult, right? Thinking of the data represented by the capital punishment study as being the addition of a big error with a little virtue obscures the fact that these virtues and errors are inextricably linked, related.


What a strange question. I presume it feels exactly like I feel every day as I work to assimilate and understand new information and shape my opinion based on it. What else would it feel like?

I'm sorry, that was a rhetorical question, to introduce the fact that there's not really any way for any of us to tell whether we're vulnerable to cognitive biases or not. I think you're right, it feels just like honest accurate judgment. It turns out, after all, that psychology professors are as vulnerable to cognitive biases as anybody else.

I want to add that, yes, I agree, nothing in this article is new, not by any measure. But it's still really interesting :)
posted by nathan v at 1:02 AM on April 19, 2011 [1 favorite]


I suspect this view of "bias" is as logically incoherent as conceptual relativism. (Though I may be misunderstanding you.) Either one believes there is (about some things) a truth out there and bias means a tendency to miss it, or you believe there is only social construct, social pressure, conformity, and so forth, in which case there is no "bias" because everything is as "biased" as everything else. Most of the scientists doing these studies clearly believe the former, though. They are showing how people have mistaken strategies, strategies that lead them away from the truth, or at least prevent them from moving towards it even when the opportunity presents itself. But if you do believe that, then clearly there must also be strategies for avoiding "bias" and moving towards the truth more efficiently; moreover, obviously you must believe in the existence of truth too, independent of selective pressure, social sanction, and conformity. Otherwise, these papers would just be saying: look, people are/aren't conforming! Which is even less interesting than claiming that people often have bad strategies for figuring out the truth. (Which in turn is even less interesting than saying that figuring out the truth for hard stuff like climate change is hard.)

We can explain the success of math education very easily in terms of social sanction and a natural urge to conformity. 2+2=5? Fine, but you get an F. Your parents judge you. Your peers think you're stupid. You'll figure it out quickly based on that kind of feedback.

Similarly, I suspect these sorts of accounts of learning run afoul of all the problems the behaviorist program did in the 1940s and 50s. Operant conditioning just doesn't explain how people learn stuff, like addition; they learn the algorithms by rote, perhaps, but those algorithms work and are learnable and extensible because they are logically coherent and mathematically true.
posted by chortly at 1:40 AM on April 19, 2011


The person who embraced the thought "I bet there is a lion behind EVERY rock" survived too, despite most rocks not having lions behind them.

And he went on to make great movies like Annie Hall.
posted by AndrewKemendo at 3:35 AM on April 19, 2011


"Festinger and his team were with the cult when the prophecy failed. First, the "boys upstairs" (as the aliens were sometimes called) did not show up and rescue the Seekers. Then December 21 arrived without incident. It was the moment Festinger had been waiting for: How would people so emotionally invested in a belief system react, now that it had been soundly refuted?

At first, the group struggled for an explanation. But then rationalization set in. A new message arrived..."


So, they were kinda like the followers of Jesus then, after he was killed stone dead, never to rise again?!
posted by markkraft at 5:24 AM on April 19, 2011


Relevant Tim Minchin animation.
posted by empath at 6:05 AM on April 19, 2011 [1 favorite]


This tendency toward so-called "motivated reasoning" helps explain why we find groups so polarized over matters where the evidence is so unequivocal: climate change, vaccines, "death panels," the birthplace and religion of the president (PDF), and much else. It would seem that expecting people to be convinced by the facts flies in the face of, you know, the facts....Consider a person who has heard about a scientific discovery that deeply challenges her belief in divine creation—a new hominid, say, that confirms our evolutionary origins. What happens next, explains political scientist Charles Taber of Stony Brook University, is a subconscious negative response to the new information

Wrong. The problem is buried in this passage: "the evidence is so unequivocal." It is? According to who? Most people have no ability to evaluate this evidence on their own. They don't understand the science, nor would they recognize some datum as evidence even if you pointed it out to them. So to say the evidence is "unequivocal" means that "scientists think the evidence is unequivocal," which means that the person has to believe the scientist who tells them the evidence is unequivocal.

That's why the second part of that passage is so terribly wrong. It isn't a negative response to the information, it's a negative response to the person or entity delivering the message.

In fact, this entire article avoids the fundamental problem, which is that people without STEM backgrounds (which is most people) don't believe science because they don't understand it, and therefore fall back on evaluating the information non-scientifically, based on their personal perception of the authority, credibility, and reliability of the entity delivering the information. But at least the article acknowledges that for these people, science functions as belief (self-link).
posted by Pastabagel at 6:25 AM on April 19, 2011 [1 favorite]


This has a lot to do with it, too. People choose their authorities (scientific, political, religious, otherwise) because those "authorities" pander to those people's biases. Purposefully or because of their own biases.

And then there is the willful sowing of mistrust, which fits in nicely with science's troublesome habit of refining, reviewing and testing. Yesterday's pioneer is today's disproved sap. For some, that's not evidence of progress, but of incorrectness. "If they were wrong then, who is to say they aren't wrong now? Let's do nothing."
posted by gjc at 6:59 AM on April 19, 2011


In fact, this entire article avoids the fundamental problem, which is that people without STEM backgrounds (which is most people) don't believe science because they don't understand it, and therefore fall back on evaluating the information non-scientifically, based on their personal perception of the authority, credibility, and reliability of the entity delivering the information. But at least the article acknowledges that for these people, science functions as belief (self-link).

I think "these people" are "all people," most of the time. Even someone trained in the scientific method, expert in a scientific field, is not expert in all fields. If you ask a nuclear physicist whether he thinks a new theory of ocean ecology is bunk, his opinion is going to depend in part on all those "is the messenger trustworthy" criteria you mention, including things like "was the theory published in a peer-reviewed journal."
posted by Diablevert at 7:53 AM on April 19, 2011


The article seems pretty balanced to me, but there was considerable Us vs. Them tightrope walking. Psychology uses binomial categories such as "communitarian vs. individualist" and "egalitarian vs. hierarchical" as a way of describing tendencies (as opposed to inherent traits). Outside of scientific discussions, however, such terms become problematic. People do like for things to fit into categories. Look at the Myers-Briggs believers. MBTI is a decent tool to describe tendencies in personalities, but "believers" think they can use it to predict anyone's behavior, and go so far as to say things like "personality type never changes" to rationalize their desire to stereotype everyone they meet.
posted by zennie at 8:10 AM on April 19, 2011


Timothy Burke has, as always, very clever things to say about this. In short, people have good reason to be skeptical of science (up to and including Mooney's neurological arguments, but that's not the main thrust of his argument) and scientists should be more willing to recognize that.
posted by col_pogo at 3:30 PM on April 27, 2011




This thread has been archived and is closed to new comments