Tribal Politics
January 12, 2019 8:44 PM

How Tribal Psychology Makes us Prefer being Wrong. "...and Geoffrey Cohen — this is my favorite experiment that was ever done — he gave people a position on welfare and experimentally altered it so that either the Republicans or Democrats were saying basically the same thing on welfare....And what he found was that he could get people to change their position on welfare, 100 percent, all the way to the other side of the spectrum of policy, just based on what party they were told supported that position.... "

"....And the crazy thing is that after they said they supported that position, he asked them why they supported that position, and they didn’t say, 'Because my party does.' They came up with other reasons. So, after being experimentally induced into holding a position that they actually didn’t agree with, they then came up with reasons that they thought they agreed with that."
--

"In one of his studies into this phenomenon, [Dan Kahan] had people who self identified as being Democrats or Republicans take a look at a climate expert’s credentials. They saw the image of a man named Robert Linden, a very professional, academic-looking older gentlemen, and they read that he was a professor of meteorology at MIT, that he earned his doctorate at Harvard University, and they then learned he was a member of the American Meteorological Society and the National Academy of Sciences.

When Kahan asked these subjects if this gentleman was indeed an expert on global warming, everyone agreed that he was. Then people on both sides received one of two statements supposedly made by this expert which, in truth, were manipulations created by the scientists. For some people, he said that the research had led him to conclude that global warming was real and human-caused. For the others, he said his research led him to conclude that global warming was not real and humans were not causing anything bad to happen to the environment.

What Kahan found was that when Republicans heard that Robert Linden, professor of meteorology at MIT, believed that climate change was real and human-caused, they no longer saw him as an expert. Now they said it was clear he was a quack. Likewise, if Democrats learned Robert Linden, professor of meteorology at MIT, did not believe global warming was real, they too no longer saw him as an expert. They too saw him as a quack. Only when Robert Linden’s position matched that of their affinity group, their tribe, did Robert Linden continue in their minds to be an expert."
posted by storybored (72 comments total) 36 users marked this as a favorite
 
Well this is sad but facts still matter :(
posted by sjswitzer at 8:49 PM on January 12, 2019 [8 favorites]


Welfare is a complex, multi-layered issue (I fall on the side that making the restrictions so tight that you prevent most cheating but also deny legitimate people in need is unacceptable), and there are few unambiguous truths in its implementation.

There is a lot of room for worldview and opinion on how to best deal with poverty. But that climate change question is ridiculous. It's like asking the participants of the study to decide if an economist is a quack for denying the existence of poor people. Any economist who claims there is no poverty is as much of a quack as a scientist in 2019 who is a climate skeptic.
posted by tclark at 8:58 PM on January 12, 2019 [31 favorites]


Trump has been conducting this experiment on my dad for a while now.
posted by vorpal bunny at 8:58 PM on January 12, 2019 [54 favorites]


I am extremely suspicious of studies based on changing the labeling of identical statements between "a Republican said this" and "a Democrat said this" and seeing how people's reactions change.

No duh?

Though both parties lie, they tell different lies, and to different audiences. The same words don't mean the same thing, and that's just 101-level decoding of what Trump's statements mean for my pre-existing condition's future (nothing good, as predicted).

Also, I'm going to push back on the idea that both sides are equally guilty of going against the science on fact-based issues. Can't imagine many actual scientists being on board with that.
posted by traveler_ at 9:11 PM on January 12, 2019 [31 favorites]


Some of these studies have fairly apparent flaws.

When banks put forth regulatory proposals, I am automatically suspicious of them and inclined not to support them. Why? Because my experience tells me that banks have certain goals with which I violently disagree and that they are fairly adept at putting forth policy proposals that disguise their ends with other, apparently more palatable, ones. So if you tell me of such a policy proposal, my initial take on it is going to be different than if you told me Elizabeth Warren put it forward. Is this because I'm tribally-minded, or because I didn't just fall off the turnip truck?
posted by praemunire at 9:11 PM on January 12, 2019 [92 favorites]


When Kahan asked these subjects if this gentleman was indeed an expert on global warming, everyone agreed that he was.

This seems like it's hiding a lot. I can't agree someone who I've never met is an expert based just on what somebody tells me (particularly in a psych experiment). Does he look like an expert? Yeah, sure, I guess. It's like if you showed me a guy in a spacesuit and then asked if he was an astronaut. I'd agree right up until the moment you told me he was a flat Earther, then I'd reexamine the facts in evidence and assume that either he's not an astronaut, or that he's a quack astronaut.

It's 2019 and I'm tired of this false equivalency thing. Human beings have to believe in some things they haven't experienced personally in order to function in the modern world. I believe Madagascar exists even if I've never been there. I believe climate change is real because an overwhelming majority of climate scientists agree that it is. You can dress an actor up like anything, but that doesn't change whether the words coming out of his mouth are coherent with objective reality or not.
posted by axiom at 9:18 PM on January 12, 2019 [75 favorites]


But that climate change question is ridiculous.
Maybe I just feel like ranting, so grain of salt, but:

It seems that this kind of "science" (not the idea that humans are tribal, but the assumption that showing a picture and listing off some credentials makes someone an unabashed expert) is dirtying the waters of real credentials as well, and social science should perhaps start considering that? Gun control stats = face cream/climate science with made-up experts/whatever: it's all just a weird game to these people. No wonder people are getting blase about higher education and about 'expert' knowledge.

It also seems that they are seriously overvaluing a single person's opinion - they don't say that their opinions about climate change change, because that info is verified by many scientists, not just one - but just that their opinion about that one scientist changes. I don't really see that much difference between that and holding the opinion that Tom Cruise is both a great action actor and a religious weirdo. Info builds on info, news at 11.

Also, "quack" seems like a loaded term, and people's opinions about individual people can change, both with time and with experience.

I don't doubt the idea that people can be ideological, and that many policies are ingroup/outgroup based, but hey, let's just be careful with the test cases here, ok?
posted by The_Vegetables at 9:20 PM on January 12, 2019 [5 favorites]


So Robert Lindzen is an interesting case. He is a real climate expert with impressive credentials. He also disagrees with mainstream climate science, which could qualify him as a quack, or a maverick. I don't have the knowledge to engage with his ideas, yet I disagree with them because the scientific consensus is likelier to be correct.
posted by Mr.Know-it-some at 9:20 PM on January 12, 2019 [5 favorites]


I thought this was going to be about how a lot of the time at work, having a different opinion than everyone else makes me feel nervous, and I'd rather find out I was wrong and thus be back into the comforting folds of the group think, rather than be right and remain an Other. Because at work having a different view absolutely makes you an Other, which is way more dangerous & important than the facts.
posted by bleep at 9:20 PM on January 12, 2019 [10 favorites]


For the others, he said his research led him to conclude that global warming was not real and humans were not causing anything bad to happen to the environment.

The problem with this is that global warming is real. The earth's atmosphere has historically gotten hotter. I don't even think it's disputed. It would be sort of like revealing to me this expert had found out the world was flat, at which point... yeah, I would probably be quick to say he was not a real expert.
posted by xammerboy at 9:22 PM on January 12, 2019 [29 favorites]


I never understand the point of these things. If I believe X instead of not-X, and then some person A says not-X, then isn't it reasonable to trust A less after that? And if I don't know about X, but I trust A and not B, then isn't it reasonable to believe X if A says X and B says not-X?

Trust is not tribalism. It may be erroneous to trust a specific group, or believe a specific fact, but it's not a mistake to judge facts based on who utters them, nor to judge groups based on which facts they utter. It's a mistake to trust Republicans or to believe that climate change is a hoax more strongly than you believe in an MIT professor, but that's not in itself an illustration of a failure of reasoning -- it's just an illustration that some people trust and/or believe the wrong thing.

Suggesting otherwise -- that it's a fundamental mistake and example of dumb tribalism to trust political parties or your own beliefs more than some random scientist or psych experimenter -- is itself anti-intellectual. What do they want, for us to instantly abandon everything we have learned and everyone we have learned to trust the instant some research assistant tells us to believe something else because an "academic-looking older gentleman" said so?
posted by chortly at 9:24 PM on January 12, 2019 [18 favorites]


It's really an interesting subject that goes beyond the pull quotes. I mean there's this section on how readily people form group affiliation:

Going even more simple, Tajfel told people ahead of time they would be sorted randomly into groups with names like Group 40 and Group 15, and still knowing they were randomly assorted, people exhibited favoritism toward their imaginary ingroup and bias toward their imaginary outgroup.

I find it all quite believable from my own history, but that's surely also due to how I've always loathed group identification of any sort I suppose.
posted by gusottertrout at 9:31 PM on January 12, 2019 [13 favorites]


I mean, we're kind of trained to do this from school age, especially in an academic setting. If you split the class up into groups, we identify with our own group, especially when the stakes are low. Hell, lots of times the next step is to pit the groups against one another, and even in a made-up competition why wouldn't I root for my own group? Do I think there's a kernel of tribalism there? Yes, but it's not surprising or particularly illuminating.

If you randomly sort us into groups and then try to order us to fight to the death, I'm suddenly feeling like opting out of your psych experiment is the best option, you know?
posted by axiom at 9:37 PM on January 12, 2019 [9 favorites]


and even in a made-up competition why wouldn't I root for my own group? Do I think there's a kernel of tribalism there? Yes, but it's not surprising or particularly illuminating.

If you randomly sort us into groups and then try to order us to fight to the death, I'm suddenly feeling like opting out of your psych experiment is the best option, you know?


Why should you root for either group though in a made up competition? What do the others have to do with you? And as nice as it is to think opting out is an option, we don't always get to choose whether we're in a group or not, that role is as often assigned to us as it is a choice.

Any time a group is formed, there is automatically another made as well, those who are identified as not fitting the chosen parameters. Every group, in a sense, creates its own "enemies" in that way. That, at least, is the feeling I've had when included in a group, whether by choice or otherwise. The emotional demands shift towards support for the group rather than just my own belief in what is best, for good or ill.
posted by gusottertrout at 10:09 PM on January 12, 2019 [9 favorites]


I'm inclined to think this so-called experiment is a sham and a delusion cloaking an attempt to rehabilitate the reputation of Robert Lindzen, the Global Warming denier, who is in fact a fraud and a crank, by trying to imply that the people who disagree with and now despise him are merely members of a different tribe.

But that's not true.
posted by jamjam at 11:11 PM on January 12, 2019 [25 favorites]


Focusing on what was actually said in the interview, I think it's interesting that Kahan says there are issues where the usual science communication method works well, but issues (like global warming, vaccines, etc.) where it fails. But having skimmed the article, I didn't catch which examples he gives of it working well. It was a long article, the interview format is kind of hard to read carefully, and it would be nice to have those examples in mind.
posted by polymodus at 12:07 AM on January 13, 2019


God dammit. No.

Tribalism, is it real? Sure. But check this out - I'm American, but I've lived in China for a good chunk of years and speak the language, and I own a Huawei phone or five. I work for multiple clients one or the other government would find less than savory, and they pay me equally well, and I own property in neither. I agree with the statements "I support China's peaceful rise" and "I think American hegemony has overall been a positive force in the world".

Based on the above, do I support the extradition of Huawei's founder's daughter to the US from Canada?

Beyond just the outright stupidity or fallibility of people who do or claim to believe untrue things, your "tribal affiliation" doesn't answer your stand on detailed, important questions. At most, it predicts who has lied to you lately, because truth is more complicated than tribe.
posted by saysthis at 12:11 AM on January 13, 2019 [6 favorites]


Beyond just the outright stupidity or fallibility of people who do or claim to believe untrue things, your "tribal affiliation" doesn't answer your stand on detailed, important questions. At most, it predicts who has lied to you lately, because truth is more complicated than tribe.

I’ll try to dig up the sources I read on this, but when I took a political science class we spent a long time talking about how group affiliation actually plays a huge role in forming opinions on complex topics. The basic gist is that when people are asked about something, they’re often forming an opinion in response to the question, rather than because it was arrived at ahead of time.

What I mean is, there are a lot of things we don’t really have an opinion on until we’re asked to choose. It takes a huge amount of thought and effort to form an opinion ahead of time, and it’s impossible to do that with all things. So we turn to useful mental shortcuts, like the opinions of people we’re likely to agree with. If we hear that our political party supports something, it’s not unreasonable to have a more favorable view of it.

It’s not necessarily a bad thing, it’s just how we organize and synthesize information. We rarely even notice it, because we do it all the time. It just becomes a problem when those cues are potentially misleading (personally, I wondered why so many Democrats supported drone strikes under Obama). It’s not about the truth itself, it’s about how we decide for ourselves what’s true — because it’s not possible to be enough of an expert in all things that we can truly evaluate all the evidence required.
posted by shapes that haunt the dusk at 1:14 AM on January 13, 2019 [18 favorites]


Given all the research demonstrating that this is a universal part of the human condition in all societies, perhaps it’s time to retire the word ‘tribal’ for this phenomenon.
posted by Bloxworth Snout at 2:00 AM on January 13, 2019 [14 favorites]


I think the fundamental flaw in this research (as it is described in the interview) is to not realize that we elect leaders to lead. Yes, we organize ourselves in groups, and that is a good thing, because groups are more able to make decisions together than a random blur, and back in the origin days, that was necessary for survival. And then in those groups, we often elect a leadership, to whom we delegate some responsibilities, so not everyone has to think about everything all the time. This goes way back - animals form groups and elect leaders too. For this purpose, it isn't relevant whether we elect them democratically, through a lottery or by having fights. If you will not acknowledge that there are groups and leaders and that this form of organisation predates modern humanity, then doing research into how those groups function is going to be flawed.
Of course I trust the leaders we have chosen in my group. I'm hardwired to do so. The problem is not my trust, it's their lies. When the people we have chosen as leaders betray us by lying all the time, the whole system breaks down. These experiments are not experiments that show how we naturally organize ourselves in groups, they are studies in how to corrupt groups with bad leadership, where the fault in leadership is lying.
In practice, in a modern society, sometimes you do have to twist your brain in order to defend your choice of leadership, it sucks, but there you are. We are not hunting on a savannah anymore. As a liberal person, I have struggled with Obama's drones, though mostly I have pretended I didn't know about them. The perfect is indeed the enemy of the good.
The thing is, however, that for more than a generation, conservative leaders have weaponized this reality/practice, forcing the group of people who have a conservative mindset to believe impossible things every day. To deny science, to believe in voodoo economics, to believe in all sorts of hate-mongering and false threats. It has become a mark of pride for conservatives to believe in rubbish and even fight for it. (And this is in itself a fascist practice).
In theory and in the experiments here, the same can happen on the left. And indeed it did once - I remember it well from the 70's, but it went on from the 1930's onward. People would twist themselves into the strangest shapes in order to defend communist dictators. I never figured out why they couldn't just admit that totalitarianism is bad and social democracy is good. But they couldn't and they didn't.
In our current situation that isn't a thing. The conservative lies are a thing. Making silly "experiments" to both-side it is just stupid, and maybe a bit evil too.
posted by mumimor at 2:43 AM on January 13, 2019 [15 favorites]


Then people on both sides received one of two statements supposedly made by this expert which, in truth, were manipulations created by the scientists. For some people, he said that the research had led him to conclude that global warming was real and human-caused. For the others, he said his research led him to conclude that global warming was not real and humans were not causing anything bad to happen to the environment.
These days I'm automatically suspicious of social psych research, especially where it has any kind of political tinge, because I've seen so much of it that is absolutely terrible. Even where the basic theory - that "tribal identification" makes people do silly or nasty things to outsiders - seems reasonably sound. And the above quote made me particularly suspicious, because those two statements are not at all symmetrical - there is no equivalent of "and humans were not causing anything bad to happen to the environment" in the first, and it obviously puts the second statement well into crank territory.

So I tried to track down the paper they're talking about here; it seems to be (PDF warning) "Cultural Cognition of Scientific Consensus" by Dan M. Kahan, Hank Jenkins-Smith and Donald Braman.

Here's the full text of the first statement:
“It is now beyond reasonable scientific dispute that human activity is causing ‘global warming’ and other dangerous forms of climate change. Over the past century, atmospheric concentration of carbon dioxide (CO2)—called a “greenhouse gas” because of its contribution to trapping heat—has increased to historically unprecedented levels. Scientific authorities at all major universities agree that the source of this increase is human industrial activity. They agree too that higher CO2 levels are responsible for steady rises in air and ocean temperatures over that period, particularly in the last decade. This change is resulting in a host of negative consequences: the melting of polar ice caps and resulting increases in sea levels and risks of catastrophic flooding; intense and long-term droughts in many parts of the world; and a rising incidence of destructive cyclones and hurricanes in others.”
That's pretty reasonable; you might wonder about what things like "scientific authorities at all major universities" might mean, but there's nothing seriously wrong with it. What about the second?
“Judged by conventional scientific standards, it is premature to conclude that human CO2 emissions—so-called ‘greenhouse gasses’—cause global warming. For example, global temperatures have not risen since 1998, despite significant increases in CO2 during that period. In addition, rather than shrinking everywhere, glaciers are actually growing in some parts of the world, and the amount of ice surrounding Antarctica is at the highest level since measurements began 30 years ago. . . . Scientists who predict global warming despite these facts are relying entirely on computer models. Those models extrapolate from observed atmospheric conditions existing in the past. The idea that those same models will accurately predict temperature in a world with very different conditions—including one with substantially increased CO2 in the atmosphere—is based on unproven assumptions, not scientific evidence. . . "
It should be immediately obvious to anyone reading this who has followed climate change at all that the author of the second extract is a crank. Even if some of these facts were technically true whenever the research was done (around 2010? Not sure) they are the kinds of "facts" that dishonest or delusional climate change denialists trundle out all the time. Why is 1998 being used as a base year? How many of the glaciers are "actually growing"? Why just talk about the ice "surrounding Antarctica"? (land and sea ice are very different). Then there's "Scientists who predict global warming despite these facts are relying entirely on computer models" which is just stupid.

The second set of statements, about nuclear waste, is about as bad (the third, about gun control, is a little different in that both statements are terrible, but I think the pro-concealed-carry one is still worse).

This experiment fits the bad social psych research pattern: a superficially plausible theory but little or no effort put into identifying alternative hypotheses that might also explain the results, and an experimental method that seems almost (almost) designed to introduce confounding factors that push the results towards the theory.

That said, this paper wins a few points by actually including the text used in the experiments; a lot of them don't, so you can only speculate about the methodological sins being concealed.

Reading the linked transcript was annoying, because I kept waiting for the interviewer to experience a moment of mild insight and ask something like "what about if one side of the 'tribal identification' is just consistently delusional?" Like, if I was interviewing someone about witch trials, and they were talking about how it was the tribal identifications of the pro-witch trial and anti-witch trial people that kept them from getting together to reach a sensible compromise, I'd like to think that at some point I'd try to gently introduce the possibility that there might be other factors involved. Not this guy, though.
posted by A Thousand Baited Hooks at 3:31 AM on January 13, 2019 [30 favorites]


It's a shame the experiments profiled were so incredibly duff, as in-group out-group behaviours have actually been intensely studied in many disciplines, and the findings are largely indisputable, and can be somewhat horrifying. Robert Sapolsky, in his excellent book, Behave, profiles some of this.

It is certainly true we unfairly favour those we identify with. The trick, as Sapolsky notes, is expanding our definition of in-group to encompass all humanity, or even Earth wherever possible.
posted by smoke at 3:50 AM on January 13, 2019 [6 favorites]


This experiment fits the bad social psych research pattern

I have started to mentally categorize social psych, along with evo psych, as “scientism.” It helps.

The trick, as Sapolsky notes, is expanding our definition of in-group to encompass all humanity, or even Earth wherever possible.

How hard would it be to fake aliens, then? Because we got shit to do, and limited time to do it in.
posted by schadenfrau at 4:15 AM on January 13, 2019 [6 favorites]


I'm just comparing two essays written by Kahan and Sapolsky and it looks like their contributions basically differ; Sapolsky is saying that identity behaviors happen at a hormonal level and Kahan is talking about the cultural level. Which is why Kahan suggests the intervention is to encourage this thing he identifies as science curiosity, as opposed to science literacy. I think this shows how their approaches differ even though their empirical work has some overlap.
posted by polymodus at 4:20 AM on January 13, 2019 [1 favorite]


But... calling someone a quack for not “believing” in anthropogenic climate change regardless of his scientific background or employment is the sensible thing to do? The scientific consensus is pretty clear on that one.

I would say the exact same thing if presented with a physician who, despite otherwise respectable credentials, turned out to actually be an anti-vaxxer.

This is not people changing their position based on party affiliation, this is people changing their preconceived assumptions regarding an expert once that expert’s views on a scientifically non-controversial topic come to light.
posted by lydhre at 5:46 AM on January 13, 2019 [15 favorites]


Thanks for this post. I'm amused by the defenses of tribalism-over-truth I'm seeing here. It was this comment by Miko which made me aware of group dynamics theory, and it explains a lot of what I see in political discussions everywhere from Metafilter to Slate to National Review to Free Republic. We really don't want to agree with our political "enemies" on anything, or admit that they agree with us on anything.

It's like any bridge-building is a bridge to a malaria swamp that will infect us all.

I'm trying to find a couple of pieces of research I heard about in the past couple of years. Not having luck so far. In one of them, researchers got people to rate Clinton and Trump on various scales after one of their debates - how truthful are they? how intelligent are they? how trustworthy are they? that sort of thing. The responses were mostly extreme, i.e. 1 or 10 on a scale of 1-10, and directly tied to party support. Then the researchers modified and moderated some of the responses - they were magicians, IIRC - and asked each respondent questions like, "You're a strong Trump supporter, but I see here [on the faked response sheet] that you rated Clinton as a moderately hard worker. Why?"

People would immediately come up with perfectly reasonable explanations for "their" moderate responses.

I think most of us are able to have multiple conflicting ideas in our minds at a time, but we're loath to admit it in a group setting. We're able to see that some of the things we want will harm our ability to get other things we want, but we don't want to admit that in a group setting, either.

Welfare and climate change are a great example, when you put them together. Cutting down on our use of fossil fuels is required to fight climate change. It will also, on average, hurt poor people the most, since massive use of fossil fuels is the cheapest way to provide food, clean water, and shelter. I doubt we'd be able to have an honest, thoughtful discussion about that.
posted by clawsoon at 6:11 AM on January 13, 2019 [12 favorites]


Likewise, if Democrats learned Robert Linden, professor of meteorology at MIT, did not believe global warming was real, they too no longer saw him as an expert. They too saw him as a quack. Only when Robert Linden’s position matched that of their affinity group, their tribe, did Robert Linden continue in their minds to be an expert."

I'm not an expert on climate science. I don't know and haven't verified the experimental data on climate change. But I do know that somewhere between 97 and 99% of all scientists who have studied climate change believe that it is real and anthropogenic. My experience tells me that this means that they are probably right and the remaining 1-3% are the quacks.
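If you want to put rough numbers on that heuristic (every number below is made up purely for illustration), a quick Bayes-style back-of-the-envelope looks something like this:

```python
# Back-of-the-envelope Bayes sketch; every number here is invented for illustration.
# H = "anthropogenic climate change is real"
# E = "roughly 98% of the scientists who study it say it is real"
prior_h = 0.5                 # start out agnostic
p_e_given_h = 0.90            # a near-consensus is plausible if H is true
p_e_given_not_h = 0.02        # a near-consensus is very unlikely if H is false

posterior_h = (p_e_given_h * prior_h) / (
    p_e_given_h * prior_h + p_e_given_not_h * (1 - prior_h)
)
print(round(posterior_h, 3))  # ~0.978: deferring to the consensus is a good bet
```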

So the group I'm happily aligning with here is not Democrats, but scientists.

Now, what would be interesting is repeating this experiment with scientists who self-identify as Republican. (Or are there not enough of those around to make this empirically meaningful?) Will they align with the Republican group or with the scientist group?
posted by sour cream at 6:47 AM on January 13, 2019 [7 favorites]


Also, I'm going to push back on the idea that both sides are equally guilty of going against the science on fact-based issues. Can't imagine many actual scientists being on board with that.

Yet they were. Why did BOTH sides change their views in this simple experiment? Where does the fault lie with the methodology? Each side equally believes only the other side is biased toward their party line. Of course our own conclusions are not susceptible to bias.
posted by waving at 6:51 AM on January 13, 2019 [4 favorites]


I doubt we'd be able to have an honest, thoughtful discussion about that.

You put a chip that big on your shoulder, you're not looking for thoughtful discussion.
posted by PMdixon at 7:07 AM on January 13, 2019 [5 favorites]


So if you tell me of such a policy proposal, my initial take on it is going to be different than if you told me Elizabeth Warren put it forward. Is this because I'm tribally-minded, or because I didn't just fall off the turnip truck?

What happens when someone tells you that Elizabeth Warren was a Republican before she became a Democrat?
posted by srboisvert at 7:11 AM on January 13, 2019 [3 favorites]


PMdixon: You put a chip that big on your shoulder, you're not looking for thoughtful discussion.

Not sure what you mean by that. Perhaps I've spent too much time reading the megathreads, but I made what seems to me like a reasonable prediction about a hypothetical Metafilter discussion. There are some topics that Metafilter "doesn't do well", as we say, and most of those happen when two issues that we care deeply about intersect in challenging ways.
posted by clawsoon at 7:30 AM on January 13, 2019 [4 favorites]


Enter Shikari - Tribalism
posted by glonous keming at 7:46 AM on January 13, 2019


I'll give an example that's less politically charged:

Imagine I tell you that Seymour Butz is a professor of linguistics at a respected institution that you recognize - let's say Harvard. He earned his PhD at Yale. I show you a picture, and he looks just like you expect a respectable senior linguist to look: Neatly kept greying hair, blazer, glasses. I tell you that he is a member of the Linguistic Society of America, and that he is the author of many scientific articles. Does this man sound like an expert? I expect that you would say yes.

Then I give you this piece of writing by Seymour Butz:
The Indo-European language family is not a valid language family. It is a remnant of European colonialist thought that contemporary linguists support by mistake. We do not have any recordings of the Proto-Indo-European, its supposed ancestor language, which means it is impossible to know what it sounded like. It is a fiction. All European languages are in fact descended from Phoenician, which I will show through comparison of important vocabulary and letter forms. Indo-Aryan languages have a separate ancestry, and their origins can be traced to ancient holy mantras of Hinduism.
I just gave you important information about Seymour Butz's qualifications and trustworthiness. In this paragraph you have learned that: (a) he disagrees with the overwhelming consensus in his field, (b) he does not understand the data or the methods used to reach that consensus, and (c) he seems to have political - or spiritual - reasons to oppose the consensus.

I would expect that anyone with cursory knowledge of historical linguistics would say, "Actually, that guy sounds like a quack." Your opinion of him should change; it should be updated given this new information.

It's fine to discuss how our affiliations influence our opinions - sometimes to the point of denying things that would be obvious to us otherwise. It's probably pretty important to. But this part of the experiment was just poorly designed. You can't identify why the Democrats' opinions of the fake climate scientist shifted: It could be because Anthropogenic Climate Change is Real (TM) is a belief shared by their affinity group. It could be that they know anthropogenic climate change is the scientific consensus, and they were not as exposed to (or as vulnerable to) the Republican misinformation campaign. It's probably a combination of these things. There's a huge confounding factor.

In order to show that Democrats' opinions are as informed by "tribalism" as Republicans' are, using this setup you would also need to test an equivalent piece of widespread, politically-charged denialism that Democrats engage in. If that's hard to come up with, that also tells you something important.

And no, this comment is not a defense of "tribalism over truth." It's the exact opposite: It's a defense of the idea that there is a truth and that the truth matters. Conflating the overwhelming scientific consensus on climate change with climate change denialism into "political opinions" obscures the truth. It hands ammunition to the deniers, who already say "you just believe that because you're a libtard." Considering that climate change is shaping up to be the worst humanitarian disaster we have ever faced, that's not just a small problem with the methodology.
posted by Kutsuwamushi at 7:56 AM on January 13, 2019 [31 favorites]


When banks put forth regulatory proposals, I am automatically suspicious of them and inclined not to support them.

This is a big chunk of Skip Lupia's _Democratic Dilemma_ except he looked at insurance referenda/initiatives. The basic idea was as you describe it -- citizens mostly don't need to know the technical ins and outs of policy because they can make reasonable shortcut inferences from who they see support and oppose something.

but when I took a political science class we spent a long time talking about how group affiliation actually plays a huge role in forming opinions on complex topics. The basic gist is that when people are asked about something, they’re often forming an opinion in response to the question, rather than because it was arrived at ahead of time

That's the Zaller receive-accept-sample model.

The gist is that most people don't have pre-existing attitudes on most political matters, so they have to create their opinion statements on the fly based on whatever's whirling around in their head. That's the "sample" part at the end.

What that stew of stuff in your head looks like is determined by the random bits of information you're bombarded with (receive) and which of those you retain in memory (accept). The place where partisanship etc comes into play is that you're more likely to accept information that (you think) conforms to your partisanship.

It ends up mattering because what we care about are people's attitudes, because those attitudes are what drive other observable political behavior. But we never get to see attitudes because they're internal and mental; all we ever get to see are opinion statements that are loosely related to attitudes.
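A toy sketch of the receive-accept-sample idea, purely for illustration (the acceptance rule and the numbers are invented, not Zaller's actual formalization):

```python
import random

def stated_opinion(partisanship, messages):
    """Toy receive-accept-sample sketch (illustrative only; the rules and
    numbers are invented, not Zaller's actual formalization).

    partisanship: -1.0 (strong left) to +1.0 (strong right)
    messages: numeric positions, in the same -1..+1 space, that the person
              is bombarded with by media, elites, friends, etc.
    """
    considerations = []                                  # the "stew of stuff" in memory
    for position in messages:
        received = random.random() < 0.5                 # RECEIVE: only some messages get through
        accepted = abs(position - partisanship) < 1.0    # ACCEPT: keep what seems to fit your side
        if received and accepted:
            considerations.append(position)
    if not considerations:
        return None                                      # nothing stored -> "don't know"
    return random.choice(considerations)                 # SAMPLE: whatever is top of mind when asked

# Same message stream, different partisans, different (and unstable) opinion statements.
stream = [random.uniform(-1, 1) for _ in range(50)]
print(stated_opinion(-0.8, stream), stated_opinion(+0.8, stream))
```

Two respondents exposed to the same stream store different considerations and so report different opinions, and the same respondent can answer differently on different askings.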
posted by GCU Sweet and Full of Grace at 8:03 AM on January 13, 2019 [10 favorites]


I just gave you important information about Seymour Butz's qualifications and trustworthiness. In this paragraph you have learned that: (a) he disagrees with the overwhelming consensus in his field, (b) he does not understand the data or the methods used to reach that consensus, and (c) he seems to have political - or spiritual - reasons to oppose the consensus.

I would expect that anyone with cursory knowledge of historical linguistics would say, "Actually, that guy sounds like a quack." Your opinion of him should change; it should be updated given this new information.


I think your example works differently for me, Kutsuwamushi, but also in a useful way. I know very little about linguistics. Really, all I got out of your paragraph was (a) and a desire to go learn more so I could figure out if for instance something like (b) and (c) were true.

This is how I tend to respond to things I haven't heard about before: I'd like to get at the truth. I don't know enough to do so. I'd like to find out more. When it comes to climate change, or really any issue on earth, there are an awful lot of people who have never had that response to anything in their lives. They are not interested in the truth. That is the real blight on our society, not the 'shocking' claim in these articles that people change their minds about things when given more information.
posted by hydropsyche at 8:08 AM on January 13, 2019 [1 favorite]


Really, all I got out of your paragraph was (a)

I'm not surprised. Most people don't know much about linguistics - just like most people don't know much about climate science. I think that (a) should be enough to shift most reasonable people's opinions, or at least raise their suspicions. The overwhelming consensus among experts can be wrong, but it's much more likely that the lone iconoclast proposing radical new ideas is wrong. When you don't know much about a field it's a useful heuristic; it's completely reasonable to use it.

With climate science denialism, you actually have additional information: it's politically motivated. I'd expect a reasonable person to use that information too.

I tried to work ideological motivations in with the references to "colonialism" and "mantras", but engaged in a bit of a false equivalence myself; colonialism is a real issue that has actually led to experts saying wrong stuff. The mantras bit is more obviously quacky.

(Actually, I took some of this out of the comment to shorten it....)
posted by Kutsuwamushi at 8:25 AM on January 13, 2019 [8 favorites]




Where does the fault lie with the methodology?

Imagine I want to find the average temperature of Earth. So I measure a volcano, and a glacier, and conclude that Earth's average temperature is 640°C. Whoops! I wanted to "fairly" cover the range of possibilities but turns out my samples weren't representative of the problem space.

They picked something they thought Republicans would be wrong on, and something they thought Democrats would be wrong on. Nothing in the article suggests anything other than that they designed that "balance" into their methodology because they already believed in it.
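A minimal numeric version of that, with made-up temperatures chosen only to reproduce the 640°C above:

```python
# Toy illustration of a non-representative sample (temperatures made up,
# chosen only to land on the 640 degrees C mentioned above).
volcano_c = 1300    # hypothetical lava-field reading
glacier_c = -20     # hypothetical glacier reading

estimate = (volcano_c + glacier_c) / 2
print(estimate)     # 640.0 -- "covering both extremes" is not the same as sampling fairly
```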
posted by traveler_ at 8:46 AM on January 13, 2019 [6 favorites]


Kutsuwamushi: The overwhelming consensus among experts can be wrong, but it's much more likely that the lone iconoclast proposing radical new ideas is wrong.

We live in an interesting time from this point of view. In most periods of history, there has been an advantage for the powers-that-be to support a large class of experts who prove things that aren't true, like, "the emperor is a god" or "society would collapse if we take from the rich to give to the poor" or "every word of this book is literally true" or - to borrow your example - "colonialism brings all the advantages of civilization to backwards peoples".

A quiet skepticism toward well-funded experts - asking yourself "who benefits from this idea?" - has been a useful heuristic for poor and marginalized people. The heuristic breaks down when the experts are, in fact, telling the truth, as in the cases of evolution and climate change.
posted by clawsoon at 8:51 AM on January 13, 2019 [9 favorites]


I'm surprised by all the skepticism toward the idea that people's opinions change to match the perceived identities.

You see it all the time in opinion polling: for example, in 2011 67% of Democrats and only 43% of Republicans wanted an immediate withdrawal of all troops from Afghanistan. But now that Trump is advocating for that position, 76% of Republicans but only 41% of Democrats support such a withdrawal.

That's a big reversal of positions between the two parties, and it seems like it's best explained by people mirroring the opinion of the leaders of their parties.
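Just to spell out the arithmetic on those numbers:

```python
# The poll numbers cited above (% supporting an immediate, complete withdrawal).
dem_2011, rep_2011 = 67, 43
dem_now, rep_now = 41, 76

print(dem_now - dem_2011)    # -26 points among Democrats
print(rep_now - rep_2011)    # +33 points among Republicans
# The partisan gap flips from D+24 to R+35, a swing of nearly 60 points.
```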
posted by crazy with stars at 10:21 AM on January 13, 2019 [13 favorites]


On matters of foreign policy most people don't have a lot of expertise. Here's another explanation for this reversal: it's been 7-8 years since that first poll and Trump is a huge lodestone when it comes to support from Democrats. Trump gives Dems such a case of the flaming heebie-jeebies that only 41% of Democrats would agree that water is wet if Trump said it was. There's also nuance that gets lost in these types of poll questions (how many of that 41%, or GOP 76%, are taking issue with the idea of a complete and immediate withdrawal vs some other more measured approach?). I'm not saying you're wrong to conclude that people mirror the opinion of their party leaders (they do!) I'm just suggesting it's not enough to leave it at that, and I'm wary of drawing any kind of conclusions about human nature from a couple of examples.
posted by axiom at 10:44 AM on January 13, 2019 [4 favorites]


You can see similar flips over a few months surrounding an election about other questions; there's not really much doubt that (many) people do follow the leads of partisan elites, especially on less salient issues. I don't know if there have even been serious attempts to disambiguate the social-psychological / "tribal" reasons for this from vaguely-rational informational economizing, though.
posted by GCU Sweet and Full of Grace at 10:58 AM on January 13, 2019 [7 favorites]


I'm surprised by all the skepticism toward the idea that people's opinions change to match the perceived identities.

I'm not skeptical of the idea, and I didn't get the impression that most of the commenters in this thread are skeptical of the idea, either. I see a lot of criticism of treating climate change denial and reliable climate science as though they are the same. My own comment is a criticism of this study, not a refutation of the idea that the study is trying to support.

There is also some discussion about why people's opinions change. My opinion on a recent redistricting proposal was influenced by the fact that my union endorsed it, for example. It was difficult to evaluate on my own, because the details can be tricky. But the fact that my union endorsed it meant it was more likely it would advance the goals of my union, many of which I share. That is useful information.

Republicans use this type of information too, and they're not always wrong to - as much as I disagree with most of their beliefs, knowing that their party has position X on policy Y means that position X is more likely to accord with their beliefs.

This is nuance that goes beyond "tribalism bad." I would love a discussion about how we can hold to account parties that abuse this trust, how we can create a more informed electorate so that such proxies are actually less useful - and yes, how we can lessen political tribalism. (I vote, in part, for media coverage treating elections less like playoffs and more like decisions on policy.) I don't particularly love the conflation between reality-based positions and non-reality-based ones.

I also think that you probably can't do this in the context of American politics without making it political, because we know that there are emotional and ideological differences between the parties. Climate science is an example; there has been a decades-long misinformation campaign by Republicans to undermine trust in experts on this issue, and a wider misinformation campaign against a "leftist" academia is related to that. That is going to affect how Republicans evaluate information. False equivalencies are going to obscure a lot of important information about how opinions change to match perceived identities.
posted by Kutsuwamushi at 11:01 AM on January 13, 2019 [9 favorites]


“The point is that we are all capable of believing things which we know to be untrue, and then, when we are finally proved wrong, impudently twisting the facts so as to show that we were right. Intellectually, it is possible to carry on this process for an indefinite time: the only check on it is that sooner or later a false belief bumps up against solid reality, usually on a battlefield."

— George Orwell
posted by darkstar at 11:56 AM on January 13, 2019 [8 favorites]


They saw the image of a man named Robert Linden, a very professional, academic-looking older gentleman, and they read that he was a professor of meteorology at MIT, that he earned his doctorate at Harvard University, and they then learned he was a member of the American Meteorological Society and the National Academy of Sciences. When Kahan asked these subjects if this gentleman was indeed an expert on global warming, everyone agreed that he was.

See, I'd be immediately suspicious because meteorology is not climate and meteorologists are notorious for being climate change skeptics because it's a related enough field that they feel like climate experts even though they aren't.
posted by vibratory manner of working at 1:04 PM on January 13, 2019 [14 favorites]


vibratory manner of working: See, I'd be immediately suspicious because meteorology is not climate and meteorologists are notorious for being climate change skeptics because it's a related enough field that they feel like climate experts even though they aren't.

Is that like how engineers are the people most likely to come up with quack physics theories?
posted by clawsoon at 1:09 PM on January 13, 2019 [9 favorites]


To expand on my previous comment, I think the issue with these experiments is not their conclusions that people have fluid opinions, trust various groups, are easily manipulated, and often rationalize their personal beliefs to match (perhaps lab-created) group beliefs. Rather, the issue is with the implied normative or political implications of these results.

To begin with, the use of the term "tribalism" with its inherently pejorative connotations is problematic. Is believing in the scientific community tribalism? If I manipulate what you understand the scientific community to be saying (eg, by lying to you and claiming some scientist said some false thing) is that "tribalism"? Probably not by these experimenters' lights, but they don't really have a coherent definition of "tribalism" at the ready. And that's evident from the fact that all of them would agree that trusting a political party -- eg, to the point of revising your beliefs if you are told the party believes otherwise -- is an example of tribalism. This entire body of research derives from a long tradition in political science that assumes both US parties are equivalent and largely identity- rather than truth- or even ideology-based. So by those lights, unlike believing scientists, it is taken as self-evident that party affiliation is "tribal" -- something that similarly derives from 50s-era prejudices about nonrational "tribal" peoples.

Of course the whole thing starts breaking down when one party is significantly less factual than the other, and the overlap between a party and a scientific community approaches unity. And we all know the result -- false claims of symmetry that don't match the current political parties' relationship with the truth. But more fundamentally, this reveals a basic incoherence in the entire project starting in the 50s or earlier. People form bonds of trust based on some mix of rational belief and psychological heuristics (if those can even be fully distinguished) and, like all human reasoning, it's easy to show flaws in that process. But what that means -- what its normative or policy implications are -- really depends on whether you call it all "tribalism" or just fallible heuristics. Calling it "tribalism" conjures (pseudo-scientific) images of savage humans with primitive passions, whose beliefs are epiphenomenal to their loyalties, and whose organizations (such as parties) are similarly just gussied-up bigotries. This framing has two effects: first, it makes these papers (even after decades of them) seem more important because they are undermining the entire political system, showing it just to be a ruse of our biases. And second, it of course undermines all of politics, implying that parties are symmetrical, arguments are irrelevant, and either it's all about nothing, or the only way to really solve things is via hyper-rational apolitical technocratic centrism.

By contrast, if you interpret these now-commonplace lab results as demonstrating merely that trust is complicated and humans are fallible, well ... First, your paper is much more minor (anathema!), and second, it leaves politics pretty much as its participants already believe: a mixture of facts, reasons, mistakes, biases, and all the rest that goes into any complicated decision or institution. But by the latter lights, there's no reason not to think one organization might be closer to the truth than the other, and basically the end response to these experiments becomes: So what? Tell us something useful, like heuristics non-experts can use to choose which expert to listen to, or which ideas and mental frameworks to adopt that will let us better sort through the non-empirical world of oughts and norms. As it stands, though, these papers are at best repetitions of stuff we already know, and at worst, quasi-racist reinforcements of a political outlook that has long been repudiated.
posted by chortly at 1:18 PM on January 13, 2019 [10 favorites]


What happens when someone tells you that Elizabeth Warren was a Republican before she became a Democrat?

I'm going to assume this is a joke and not some expression of your belief in my extreme stupidity.
posted by praemunire at 1:20 PM on January 13, 2019


I'm not sure if everyone who commented on the climate change bit quoted in the original post has read the entire transcript, but I would recommend doing so. Ironically, in that transcript, the lead researcher behind the study that mentioned climate change says that climate change is not the topic that he would lead with as an example of this tribal effect. Instead, he mentions HPV versus HBV to show that the controversy over the former was pretty much entirely due to the way that the manufacturer sought fast-track approval, which required the involvement of Congress and thus triggered politicization and tribal/partisan thinking.

Also, one might do well to read this excerpt:
Lilliana Mason: I remember there was a moment during the debates, the presidential debates, when Hillary Clinton said something about ingroup bias. And I think it was Mike Pence who said something like, “How dare you accuse us of being biased,” and just blew my mind. Obviously she wasn’t saying that Republicans are biased, what she was saying is every single human being has this in them. It’s not offensive to say it, it shouldn’t be offensive to say; it’s just natural psychology. What psychologists have found actually, is with something like racial prejudice, for instance, we can fight that if we make ourselves aware of it, and if we don’t pretend to be insulted when we hear it. If we think about how everybody has racial prejudice, it’s in my mind, I know it is, it’s in everyone’s mind, but I’m going to actively fight against it, I’m going to keep the knowledge that it exists in the front of my mind, and try very hard in every interaction to make sure that I am not acting on behalf of that bias, you can fight it.

And so, one of the things that I think could be done here is if partisans admit that they have an implicit bias against their political opponents, it’s possible every single time you see one of your political opponents to you say to yourself, “Hold on a second, I know that I am automatically reacting against this person because of their party. I’m going to remind myself that this is a human being who has a family and a favorite recipe and likes to go roller skating," or whatever it is — give some humanizing detail to that person and actively fight it in your own mind.

[lightly edited to correct some transcription errors]
Everyone has these cognitive biases, including Democrats. And everyone can work on countering these biases by staying aware of them in themselves, not just in others. We're all in this together.
posted by skoosh at 1:22 PM on January 13, 2019 [5 favorites]


chortly: Is believing in the scientific community tribalism?

Sure, why not? For most of us, it is tribalism. We don't do the experiments ourselves. We trust. When scientists bring us new facts, we change our minds. If someone who is explicitly anti-science makes a claim, we immediately discount it.

The replication crisis is instructive in this regard. How many of us said, "Oh, so interesting!" when we first heard about some social psychology effect or another, and then later said, "I never really believed that result" when it failed to replicate?
posted by clawsoon at 1:48 PM on January 13, 2019 [5 favorites]


Hmmm ... could this post and the 'study' it purports to portray ...

... actually be *the real* study ... of how MeFi will react to the idea that tribalism will over-rule reality-based_community?

But wait, there's more: do the results differ among people who've seen "Inception" more than a dozen times?
posted by Twang at 2:21 PM on January 13, 2019 [2 favorites]


Metafilter: people exhibited favoritism toward their imaginary ingroup and bias toward their imaginary outgroup
posted by Enemy of Joy at 2:27 PM on January 13, 2019 [3 favorites]


This entire body of research derives from a long tradition in political science that assumes both US parties are equivalent and largely identity- rather than truth- or even ideology-based.

That’s true on a foundational level, but most of the poli sci stuff I’ve read on this recently squarely blamed the Republicans for driving recent polarization. This isn’t an argument that both sides are the same. It’s just that no one is immune to in-group bias, and it’s always useful to remember that.

Beyond that, there’s ample evidence showing that modern US political parties are very strongly identity-based, and that in fact our ideologies can be heavily influenced by our party affiliation — rather than the other way around. We associate our parties with certain values, and a specific policy associated with one party or another becomes colored by those value associations.

The parties themselves are very different. One party does base its policies on facts far more than the other does. That much is demonstrably true. But that doesn’t mean party affiliation isn’t still a key factor in individual opinions on those issues.
posted by shapes that haunt the dusk at 3:13 PM on January 13, 2019 [2 favorites]


I'm confused because buried in the article is a line specifically saying that the climate change experiment suggested that anti-climate identity groups' responses demonstrate politically motivated cognition. That's a pretty firm, asymmetric, non-equivalence assertion, so what's up with the concern about false equivalency?
posted by polymodus at 3:42 PM on January 13, 2019


I think it is difficult to deal with this article for two reasons:
1. the number of issues where Republicans are completely unmoored from facts. Abortion. Climate change. HPV, for god's sake. It is very difficult in the current environment to find areas where there is an equivalent example going the other way. Why are we so lopsided? Is there some whole set of issues Republicans are right about that I'm not aware of because I identify with a different group?
2. the solution is always the same. Learn to see the other side as individuals. Treat people you disagree with respectfully. Find common ground. President Obama famously said there are no red states and blue states, just the United States. And he treated the other side respectfully and tried to work with them. Look where we are now. Do they have an explanation for why his approach failed? I would really have appreciated the piece addressing this, and without addressing it, I don't know what I can take away from this that is useful to me.

I believe in group loyalty [I'm not sure tribal is an appropriate term to keep using because it has specific connotations]; I'm not trying to be a skeptic. I just don't know what to do with this info.

This line infuriated me:
and then that term — fake news — it mutated from a rebranded way of talking about propaganda to just anything that people wished wasn’t true.

It didn't mutate. It wasn't an accident of nature or an act of god. Donald Trump purposely made it useless as a term because he finds fake news (as in, items that appear to be regular news but are completely fake) very useful.
posted by Emmy Rae at 5:10 PM on January 13, 2019 [2 favorites]


The concern about false equivalency comes from the rest of the article, often in the subtext but sometimes breaking out into overt declarations of superficial balance:
If your group has it right on what scientific consensus is then just count yourself lucky, because you don’t understand what the scientist does in his or her own terms. It just happens to be that your intermediary groups managed to get you the right answer despite the assault that they’re under, and all groups have embarrassing instances where the message they’ve got from their intermediaries is false.
posted by traveler_ at 5:12 PM on January 13, 2019 [2 favorites]


I don’t think that’s a declaration of equivalency; it’s saying that non-experts defer to people they tend to agree with. Which makes sense, but also highlights why it’s important that political leaders not lie to us — because we’re not always qualified to recognize when people we trust are misleading us. And it is not realistic to expect that your party is always right, because you can’t expect that kind of perfection or purity from anyone or anything.
posted by shapes that haunt the dusk at 6:56 PM on January 13, 2019


Er, what I mean is, it’s not always lies, either. Sometimes a politician is motivated by a particular ideology, or they have vested interests in something, or they’re just plain misinformed. Yeah, the Democrats get a lot of things right, but don’t expect perfection from politicians. Policies are often hashed out by competing interests within the group, and what we end up seeing isn’t necessarily the right thing, but what all these different people could agree on. If there’s reason to believe that our opinions can be shaped by our party affiliation (which there appears to be), then it’s important to be aware of that in-group bias in ourselves, even if our party usually gets it right.
posted by shapes that haunt the dusk at 7:06 PM on January 13, 2019 [1 favorite]


Now, what would be interesting is repeating this experiment with scientists who self-identify as Republican. (Or are there not enough of those around to make this empirically meaningful?) Will they align with the Republican group or with the scientist group?

IAAscientist. I think that when I started in the 90s there were a number of Republicans in the physical sciences especially, but as the Republican party became more and more extreme and anti-science the percentage has shrunk to almost nothing.
posted by overhauser at 7:44 PM on January 13, 2019 [8 favorites]


This entire body of research derives from a long tradition in political science that assumes both US parties are equivalent and largely identity- rather than truth- or even ideology-based.

Beyond that, there’s ample evidence showing that modern US political parties are very strongly identity-based, and that in fact our ideologies can be heavily influenced by our party affiliation


These are both true, but incomplete.

There is fer shure research showing that members of the mass public, like you and me, seem to be Democrats and Republicans for the same reason that most people are Methodist or whatever -- because that's how our families brought us up, full stop. Or newer research arguing that members of the mass public, like you and me, pick our party ID by thinking of what Ds and Rs are like and going with the one that most closely resembles whatever self-identification we think is most important.

And that's true, but it's important to remember that the targets of inquiry here are members of the mass public.

If you studied why members of the mass public believe that the Earth goes around the Sun and not t'other way 'round, the answer you would get for 99.whatever percent of people is "Because people I trust as truth-bearers told me so" and not "The following observation disambiguates a heliocentric system from a geocentric system:"

In much the same way that there are very good reasons to believe heliocentrism, but most of us just don't happen to use those reasons, there are good, solid, reasonable reasons why African-American voters are almost monolithically Democrats and why anglo evangelicals are overwhelmingly Republican. Those reasons are founded in the offers and claims made by the parties, in who the parties are willing to treat as decent people, in the issue-positions the parties take, and so on, just like we'd like them to be. It's just that those good reasons don't apply to most Americans, who just get the same party ID they would have gotten from those good reasons from their family and other social upbringing instead, or arguably by re-evaluating their party ID in light of changes in their social identity. But the people studying *THAT* are (mostly) different from the social-psychologists studying individual formation of party ID. You find those people in the coalitions and realignments literature and studying the more institutional side of parties.

So, yeah, parties are identity-based and tribal, but that absolutely doesn't mean they can't also be issue-based at the same time depending on what and who you're looking at.
posted by GCU Sweet and Full of Grace at 8:03 PM on January 13, 2019 [6 favorites]


I'm not sure if everyone who commented on the climate change bit quoted in the original post has read the entire transcript, but I would recommend doing so. Ironically, in that transcript, the lead researcher behind the study that mentioned climate change says that climate change is not the topic that he would lead with as an example of this tribal effect. Instead, he mentions HPV versus HBV to show that the controversy over the former was pretty much entirely due to the way that the manufacturer sought fast-track approval, which required the involvement of Congress and thus triggered politicization and tribal/partisan thinking.
I don't know about anyone else, but I read it and wasn't particularly impressed. Actually the HPV vs HBV thing is another good example of why. From the transcript:
The reason to doubt that though, is that at the same time that we were fighting about the HPV vaccine, the acceptance, the vaccination rate, for the HBV vaccine, Hepatitis B, which is also a sexually transmitted disease, was at 95 percent.
Hep B is a "sexually transmitted disease" in the sense that it can be transmitted sexually, but it can also be transmitted in a whole lot of other ways. Suggesting that it's a "sexually transmitted disease" in a way that's comparable to the genital HPV infections that the HPV vaccine is mainly aimed at and then using that comparison to support a facile "both sides do it!" equivalence is really stretching things.

Another thing about the climate change experiment is that it's effectively, but subtly, using two different meanings of the word "expert". Show someone a name and a list of credentials, and "is this person an expert?" means "does this person have the right qualifications to be an expert?". Show them a few paragraphs of argument and it becomes "does this person actually know what they're talking about?", and that's a very different question.

I do think that the idea that people are influenced by "tribal identification" is a potentially useful one, but it's also an abstraction of a whole pile of underlying drives, heuristics and interests that are very difficult to analyse using this kind of quantitative experimental approach, and following the abstraction too far is going to lead you in the wrong direction. Take this, from the end of the transcript:
If you read an article that says, “Boy the views of conservatives are really bad for us. How do we change their minds?” you’ll never change their minds, because you’re treating them as the problem. It’s also wrong. Because I’m the problem. You’re the problem. We all do this. So the question shouldn’t be how can we change conservatives’ minds. It should be how can we restore the state of science communication environment so that it works on these issues in the same way it works on these other issues. That said, that’s a stake we have in common with the people who disagree with us about the facts.
I would love to believe this, I really would. But I can't. It's fantasy.
posted by A Thousand Baited Hooks at 2:32 AM on January 14, 2019 [3 favorites]


Hep B can be transmitted in more or less the same way that HIV can be transmitted. How many Americans would object to the characterization of HIV/AIDS as a sexually transmitted disease, without air quotes? The other major transmission route in the United States is through IV drug users sharing needles; do you believe that IV drug use is immune to being politicized or stigmatized?

If you read that section again, and ask yourself, what are the researchers' beliefs about the best course of action regarding introduction of the HPV vaccine, what do you come up with? Why do you think that Kahan is making a "facile 'both sides do it!'" argument in this case? Could a reasonable person come up with other interpretations?

Regarding changing those people's minds versus changing science communication: Do you think that you can change a conservative's mind by presenting reasoned arguments from the point of view of a progressive? Maybe put another way: has listening to a conservative present reasoned arguments ever led you to change your mind about some political issue that you already had an opinion about? How common is that?
posted by skoosh at 3:59 AM on January 14, 2019 [1 favorite]


Hep B can be transmitted in more or less the same way that HIV can be transmitted. How many Americans would object to the characterization of HIV/AIDS as a sexually transmitted disease, without air quotes?

Can be, but it's much easier to get Hep B from blood. And of course HIV is a sexually transmitted disease, but even then it's not an *exclusively* sexually transmitted disease in the same way as the kind of HPV that causes cervical cancer, and if there's ever a vaccine for HIV it probably won't be aimed primarily at 11 and 12 year old girls like the HPV vaccine was.

If you read that section again, and ask yourself, what are the researchers' beliefs about the best course of action regarding introduction of the HPV vaccine, what do you come up with? Why do you think that Kahan is making a "facile 'both sides do it!'" argument in this case?

I don't know what they think the best course of action is. They do say this:
David McRaney: The makers of HPV vaccine sought early approval, and they also sought to make it mandatory. Now, early approval means debate in Congress. Mandatory means debate in state legislatures. Both means that people with zero scientific knowledge raised questions about why this was a mandatory vaccine for girls instead of boys. The public then first learned about the HPV vaccine by watching reports on MSNBC and Fox News where the message was framed as a moral issue, which made that an us-versus-them issue, which made it a tribal issue.

Dan Kahan: And anything that’s before the legislature is just kind of raw meat for the conflict entrepreneur groups on both sides of these issues, the right and left. And it turned into a question of: Whose side are you on and who are you? And it just it blew up in everybody’s face. So that was a decision to take an issue that normally travels down this path where people are able to recognize what science knows regardless of their identities and put it right on the track to become one of the sad issues where we have this tension between being who you are and knowing what’s known by science.
Now, I'm not American and I've never watched MSNBC, but I find it very difficult to believe:

(a) that there was anything MSNBC could have done, or not done, to prevent Fox News and the rest of the rightwing media from making it into an us-versus-them issue;

(b) that it would have become an us-versus-them issue without Fox News and the rest of the rightwing media making it one, with some help from their friends in politics.

Given that asymmetry, I don't know what their solution is supposed to be other than "Fox News should stop being nasty", which is not very helpful. If their solution is "keep important things away from democratic decision-making processes", or maybe "MSNBC should stop being rude to people who are okay with women getting cancer as long as it's a punishment for having sex when they were teenagers" that's something, I guess, but they don't actually say this and I don't see any reason to think that it would address the problem, anyway.

Regarding changing those people's minds versus changing science communication: Do you think that you can change a conservative's mind by presenting reasoned arguments from the point of view of a progressive? Maybe put another way: has listening to a conservative present reasoned arguments ever led you to change your mind about some political issue that you already had an opinion about? How common is that?

Depends on what the argument is about. It's not possible to make a reasoned argument for something like climate change denialism, so not in that area, no. Same for things like Trump's wall, no-deal Brexit, the Crimean annexation, the continued destruction of the Amazon, the Adani coal mine, etc etc. The problem is that when rightwingers are actually able to make a reasoned argument for something there's usually someone on the centre or left who is already making it, and doing a better job.
posted by A Thousand Baited Hooks at 5:12 AM on January 14, 2019 [1 favorite]


You see it all the time in opinion polling: for example, in 2011 67% of Democrats and only 43% of Republicans wanted an immediate withdrawal of all troops from Afghanistan. But now that Trump is advocating for that position, 76% of Republicans but only 41% of Democrats support such a withdrawal.

I would argue that these are two different positions. This issue has huge pros and cons and very high stakes in terms of lives affected, so people might support a competently handled withdrawal but not an incompetently (or maliciously) handled one.

Then there's the question of people's motives in answering. Some of the 2011 supporters might actually have been motivated by wanting to put pressure on Obama to find a solution, while they have no illusions about Trump's willingness and capability to do so and would prefer he leave things alone.
posted by patrick54 at 6:57 AM on January 14, 2019 [2 favorites]


This might say more about black and white thinking than it does about tribalism, because it shows the reaction towards the opposite end of a spectrum that was designed around the polarization of two parties, not five parties.
posted by Brian B. at 7:01 AM on January 14, 2019


According to the CDC, "Among adults in the United States, Hepatitis B is most commonly spread through sexual contact and accounts for nearly two-thirds of acute Hepatitis B cases." But the real issue here is not whether Hep B is "really" a sexually transmitted disease, but whether it can be plausibly characterized as one for the purposes of fomenting a moral panic over teen sex. I believe that it can. However, it wasn't, and the researchers lay out why they think it wasn't: "The difference is that people learned about the HBV vaccine from their doctors. It wasn’t politicized. The HPV vaccine, however, they learned about probably by watching MSNBC and Fox News, where that message was it’s us versus them again." That happened, in their view, because of the way the vaccine manufacturer chose to pursue FDA approval, which brought it before the attention of Congress and the cable news networks, instead of information about the vaccine's existence and benefits being disseminated through sources (like one's doctor) that don't frame it in in-group/out-group terms.

The ultimate solution here is, don't get your science information (or really, any of your news) from MSNBC or Fox News. The fact that these two channels are seen as (and see themselves as!) Our Team's and Their Team's 24-hour cable news networks is the main problem. I am American, and I avoid watching either.

Regarding reasoned arguments: I think you're mistaking a reasoned argument for a convincing one, which of course would render my question trivially true by definition. Rather, I would say that a reasoned argument is one that deploys logic and statements of fact to dispassionately advance a thesis. It's entirely plausible to argue, for instance, that Crimea has historically been part of Russia, that it was only ceded to Ukraine about sixty years ago by the dictate of a Soviet autocrat, that it is populated mostly by Russian-speaking people who think of themselves as ethnically Russian, and that the vast majority of the Crimean population would prefer to be part of Russia than of Ukraine, so therefore it's fine for Russia to annex Crimea. Of course, it's also entirely plausible to argue that Russia's opportunistic military occupation of Crimea is a blatant violation of international law, that the referendum conducted to legitimize the annexation cannot be considered free or fair since it was conducted in haste under that aforementioned occupation, that Crimea is integral to Ukraine's security and territorial integrity, and that Crimea should therefore be returned to Ukraine immediately. One may find one or the other of these arguments convincing (or neither), but I think that they can both be characterized as reasoned arguments nevertheless.

Let me just say that this is only an example, and I don't want to get bogged down in the details of Eastern European politics. But I do believe that the distinction between a convincing argument and a reasoned one is valuable, and should not be elided.
posted by skoosh at 10:16 AM on January 14, 2019 [2 favorites]


has listening to a conservative present reasoned arguments ever led you to change your mind about some political issue that you already had an opinion about? How common is that?

Personally, it happens quite a bit, because there are some pretty smart conservatives out there. That said, I'm Canadian so perhaps the consequences aren't as dire from my immediate "tribe" should I dare to not hold with the party line.
posted by philip-random at 11:17 AM on January 14, 2019 [2 favorites]


I don't know, that question is hard to answer. I'm surrounded by conservatives, so that's the social milieu I'm in; I find that I try in various instances to accommodate their more centrist or status-quo views, which is a different thing than me changing my opinion into theirs. It's a key distinction.
posted by polymodus at 12:47 PM on January 14, 2019 [2 favorites]


I think the Canadian experience may be different, because there has been considerable flux and multiple parties at both the provincial and federal levels. Five different party leaders debate each other (in two languages) before every federal election, and my sense is that ordinary people who aren't party activists don't necessarily identify as members of one particular party or feel that they need to vote accordingly.* Thus, as you know, we saw incredibly dramatic fluctuations in party vote share and seats won in the last two elections – in 2011 for the NDP (going from 12.0% to 33.44% of seats in the House of Commons), and in 2015 for the Liberals (going from 11.04% to 54% of all seats). Those results came from double-digit increases in the percentage of the electorate voting for each of those two parties.

So it seems apparent that many, many people in Canada felt comfortable switching their votes to different parties (especially if it meant getting the Tories out of power). This is much rarer in the United States, because there are only two major parties and they've been the top two parties for over 150 years, so entire families have been either Republicans or Democrats literally for generations. Even people who aren't registered as members of either party tend to vote for one of them over the other most of the time. The Democratic Party's 41-seat gain in the House in 2018 resulted from a gain in vote percentage of less than 5.5%.

*Is this an accurate assessment of how the Canadian electorate views political parties and elections? Also, how many Canadians self-identify as members of a particular party?
posted by skoosh at 7:11 PM on January 14, 2019 [1 favorite]


I think that's a fair assessment, skoosh. You could extend it even further by looking at the regular rise of new parties in Canada who are able to win significant numbers of seats federally: CCF+NDP, Social Credit, Reform, Bloc, maybe Bernier's brand new People's Party or the Greens in the upcoming election. The new Conservative Party isn't even the old Progressive Conservatives.

The Liberals have stayed together (and some wag pointed out that the Liberals ruled Canada for as much of the 20th century as the Communists ruled Russia), but conservatives especially haven't been able to keep together their fragile alliances. That's partially because the usual recipe for conservative victory in Canada involves getting Quebec and Alberta to vote together. That's as difficult an electoral trick as you'll find anywhere. They are not natural allies. Quebec conservatives and Alberta conservatives are not members, you might say, of the same tribes.

It would be an interesting historical experiment to re-do Canada with an elected, equal-by-province, and effective+powerful Senate. (Triple-E, in the old Reform Party phrase.) Would that give rural conservatives enough of a consistent taste of power to keep them together, as it (maybe?) does in the U.S.?
posted by clawsoon at 7:55 AM on January 15, 2019 [2 favorites]


I think it is very unfortunate that scientific issues become partisan issues.
That right-wing people are becoming less likely to be involved in scientific study is a self-fulfilling prophecy. It becomes possible for the right wing to rhetorically disregard all of scientific endeavour as a left-wing pursuit.

Coping with anthropogenic climate change should not be a political football; it's simply practical.

I found the podcast about 'tribal' thinking very interesting, but it would be helpful if they had thought more about the use of the word tribal for this phenomenon. In fact, it is the presenter, David McRaney, who uses that phrase over 25 times in the presentation, whereas it is used only once by any of the people he is talking to. The phenomenon is also referred to as 'cultural cognition'. So I think we can lay the blame on David McRaney.
posted by asok at 3:18 PM on January 15, 2019 [1 favorite]


Here's another podcast which is fantastic, although the sound level is low as it is a chat in a kitchen.

'To put it simply, Misha is an expert on communication, and people pay him to help them communicate better. In our long, wide-ranging conversation, you’ll pick up a zillion nuggets of wisdom that will help you the next time you set out to negotiate, facilitate, or solve shared problems with people through conversation.'

Worth it for Misha Glouberman doing the introduction!
posted by asok at 4:33 PM on January 18, 2019

