When in doubt, shout!
February 14, 2011 6:21 AM

A list of warning signs that your opinions function more to signal loyalty and ability than to estimate truth. (Previously)
posted by anotherpanacea (98 comments total) 44 users marked this as a favorite
 
You would post something like this.
posted by Potomac Avenue at 6:22 AM on February 14, 2011 [16 favorites]


And this is such a typical whiny liberal copout: question the "fairness" of the debate rather than coming up with answers to hard questions or defending yourself with LOGIC.

Hello Ladies.
posted by Potomac Avenue at 6:30 AM on February 14, 2011 [1 favorite]


You care far more about current nearby events than similar distant or past/future events.

Why cross this out vs. just deleting it?

The events of the past ARE just that, and are less important for one's personal life than what is going on now....

(yet past events do matter - they'd matter more if the REST of mankind acted when past events are shown to be a problem.)
posted by rough ashlar at 6:39 AM on February 14, 2011


Doesn't everyone do these things sometimes? And does anyone ever recognize that they're doing them, when they're doing them?
posted by Diablevert at 6:40 AM on February 14, 2011 [1 favorite]


Being sort of a jerk, I enjoy the occasional recreational argument.
posted by electroboy at 6:41 AM on February 14, 2011 [14 favorites]


Interesting as food for thought in terms of when one is arguing against a brick wall. The focus is on the moment of conflict, and truth might take a backseat to truthiness when the heat is on. I know for myself that I can get as narrowly opinionated as possible, but I'll entertain opposing points that seem solid - well after the heat of argument has cooled. I can do this in the moment too; it depends on the level of heat the exchange brings.

Later I'll tear into the next guy who disagrees with my newly tempered opinion.
posted by drowsy at 6:42 AM on February 14, 2011


I care more about your second comment than your first one, Diablevert!

Actually, the point is that we all do this all the time. It's *very* hard to catch yourself at it, even or especially if you know to look for it. This is why it helps to belong to a community that is trying to overcome their cognitive biases, a community to which I am FANATICALLY LOYAL!
posted by anotherpanacea at 6:43 AM on February 14, 2011 [11 favorites]


I'm wishy washy, act before I think, and have a number of other flaws.

On the other hand, I want to know the truth more than anything.

I'm relieved not to recognize myself in the list.
posted by MikeWarot at 6:43 AM on February 14, 2011 [2 favorites]


You find it hard to be enthusiastic for something until you know that others oppose it.

With a slight modification, this one makes perfect sense: You find it hard to be enthusiastic for something until you know that others support it.

I hear some new proposal. It sounds good, but I haven't thought through all the ramifications. But some others I know, who do their homework well, have good things to say. I start getting enthusiastic.

Now imagine those homework-doing others I know are evil. When they oppose something that sounds good, that's evidence it's probably as good as it sounds.
posted by DU at 6:44 AM on February 14, 2011 [1 favorite]


You find it easy to conclude that those who disagree with you are insincere or stupid.

How could you not be stupid for disagreeing with me?
posted by Joe Beese at 6:49 AM on February 14, 2011 [9 favorites]


With a slight modification, this one makes perfect sense: You find it hard to be enthusiastic for something until you know that others support it.

I read the mefi comments before the link
posted by a robot made out of meat at 6:50 AM on February 14, 2011 [19 favorites]


DU: I think these are thorny issues, since they come pretty close to the Bayesian ideal, which is that your opinions are or ought to be information for me in helping me formulate my own opinions.

some others I know, who do their homework well, have good things to say.

Right. If you didn't do this, then, "Your opinion doesn’t much change after talking with smart folks who know more."

imagine those homework-doing others I know are evil

The problem here is in knowing that someone is evil: "You find it easy to conclude that those who disagree with you are insincere or stupid."

In general, I think you're coming pretty close to this: "You are reluctant to agree with a rival's claim, even if you had no prior opinion on the topic." Again, if learning that Hitler believed the earth is round makes you less likely to believe it, you're not really truth-tracking, are you?
posted by anotherpanacea at 6:50 AM on February 14, 2011


How could you not be stupid for disagreeing with me?

You win this round, Beese.
posted by electroboy at 6:50 AM on February 14, 2011


It's *very* hard to catch yourself at it, even or especially if you know to look for it. This is why it helps to belong to a community that is trying to overcome their cognitive biases, a community to which I am FANATICALLY LOYAL!

Eh, I don't think they can be overcome, really. Flaws in the wiring are flaws in the wiring. Although I suppose it's worthwhile to try and deliberately make yourself more open-minded.
posted by Diablevert at 6:53 AM on February 14, 2011


I don't think they can be overcome, really. Flaws in the wiring are flaws in the wiring.

I think of it analogously to optical illusions. On the one hand, I'll never convince my eyes that the lines are the same length or whatever, but I can use other methods to prove that my eyes have a habit of lying to me in this regard. Then I can try to be on the lookout for that kind of mistake.

On the other hand, errors will always creep in somewhere. So we design institutions and systems to be error-resistant. Not fool-proof, but with wide tolerances. You notice that democracies tend to punish unpopular minorities, you give them constitutional protections. You notice that the stock market tends to go down in June, you sell in May and go away. You notice that white people have implicit racial associations and men have implicit sexist associations, you make sure that important decisions are made by a multiracial and gender-proportioned group of people.
posted by anotherpanacea at 7:02 AM on February 14, 2011


1) Only conservatives do this, right?

2) MetaFilter: Being sort of a jerk, I enjoy the occasional recreational argument.
posted by ZenMasterThis at 7:04 AM on February 14, 2011 [2 favorites]


I'll figure out how I feel about this after I see what all of you think.
posted by Shepherd at 7:10 AM on February 14, 2011 [2 favorites]


I don't see these as huge problems in the context of an internet discussion, since I think these sorts of things are assumed, and may just be part of being human.

But for me, I almost immediately attributed these sorts of motivators to people in government. "You find it hard to be enthusiastic for something until you know that others oppose it." This is a defining characteristic of Congress, yes?

I think this list would be better adapted to that: A place where people are actually able to flush the country down the toilet in real ways, simply to take a meaningless stand against "the other side of the aisle".
posted by y6y6y6 at 7:10 AM on February 14, 2011


#1 throws me for a loop. I always thought of myself as an argumentative, abrasive sort of person, but are there really people out there like that? Because if so then, dang, I'm not nearly as bad as I thought I was.

I do, however, agree with DU on the inversion of #1. Take climate change. I'm not a climatologist; my opinion on the topic is completely irrelevant because I don't know anything beyond the most basic, layman-level facts about climatology. I'm passionate about the topic precisely because people who do know about the issue have formed a pretty strong consensus that climate change is happening.

#6 I'm not at all sure is valid, or at least not universally so. Opinions near the middle aren't necessarily right, and on some issues they are pretty much cop-outs. Example: either the Holocaust happened, or it didn't. Someone who wants to take a middle ground position on that issue is, I think, being foolish. I disagree, strongly, with those who would argue that the Holocaust did not happen, but I'd hold them in higher esteem than someone who argues that the truth is probably somewhere in the middle and maybe it happened but it probably wasn't as bad as everyone claims.
posted by sotonohito at 7:11 AM on February 14, 2011


I see what you're saying but I prefer to take a couple of hours and read up on things. I think you'll find that the reasonable position is to agree with the consensus of climate scientists. Same with the Holocaust. Evidence exists, get familiar with it.
posted by Potomac Avenue at 7:15 AM on February 14, 2011 [1 favorite]


I think, though, that the Overcoming Bias crowd mostly want to overcome errors in individual thinking, whereas you're approaching it from a systemic point of view. That's the better way IMO: building robustness against error into systems rather than expecting individuals to accomplish it alone.

Given some of the recent evidence from neuro and cognitive science (too lazy to look up pertinent threads) I guess I just don't agree with Hanson et al. that there's much hope of untying opinions from deeply held moral intuitions and emotions.

...

I thought this was the most interesting comment from the OB thread:

Signaling ability with some of these methods is dangerous, sure, because there is usually some connection between it and correspondence truth values, but loyalty seems of a different piece. It is a value on its own, one that is, especially in unimportant situations, genuinely more valuable than the correspondence between an opinion and reality. It is even possible that in some well-placed instances, loyalty is more important than the accuracy of any possible opinion. Suppose we only take the thinnest possible claim, there still doesn’t seem to be any reason to group “signaling ability” and “signaling loyalty” as a related pair. Any insight into this choice?
posted by r_nebblesworthII at 7:17 AM on February 14, 2011 [2 favorites]


Again, if learning that Hitler believed the earth is round makes you less likely to believe it, you're not really truth-tracking, are you?

But I have no information on any consistent bias wrt Hitler vis-a-vis the shape of the Earth.[1] That's why rejecting as evidence his opinion on the roundness of the Earth makes sense. Whereas I do have information on a consistent bias wrt Hitler vis-a-vis other topics. By subtracting this bias from his opinion-signal, I can track the truth better than nothing.

[1] For the purposes of this example, I mean. In reality, though.
posted by DU at 7:21 AM on February 14, 2011


Or, put another way, I have noticed that certain people's opinions correlate with the truth. Some of these correlations are positive, some are negative. Now I have a statement S of unknown truth value, but I do know what each of these people thinks it is. I maintain that I have a better than average probability of guessing the truth value of S.
posted by DU at 7:24 AM on February 14, 2011
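
DU's rule here is just Bayesian updating with informants whose reliability is known, and it works even when the correlation is negative. A minimal sketch in Python (the accuracy numbers are invented purely for illustration):

    from math import prod

    def posterior_true(prior, reports):
        # Naive-Bayes combination of independent reports on a statement S.
        # Each report is (asserts_S, accuracy), where accuracy is the
        # probability that this person is right about claims like S.
        like_true = prod(acc if says else 1 - acc for says, acc in reports)
        like_false = prod(1 - acc if says else acc for says, acc in reports)
        evidence = prior * like_true + (1 - prior) * like_false
        return prior * like_true / evidence

    # Flat 50/50 prior on S. A 70%-accurate person asserts S, and a person
    # who is right only 20% of the time (a negative correlation) also asserts S.
    print(posterior_true(0.5, [(True, 0.7), (True, 0.2)]))  # ~0.37

The negatively-correlated informant drags the posterior below the 50% prior: their assertion is still evidence, just with the sign flipped, which is exactly the "better than average probability" DU is claiming.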


You are uncomfortable taking a position near the middle of the opinion distribution.

This is a weird entry on the list. It seems like an indirect endorsement of the argument from moderation (the "false middle"). Compromise is an enormous part of being a functioning adult, but you have to know when and why to compromise.

But I have no information on any consistent bias wrt Hitler vis-a-vis the shape of the Earth.[1] That's why rejecting as evidence his opinion on the roundness of the Earth makes sense.

But the question isn't about rejecting Hitler's mere opinion as to the Earth's form. The question is about whether you would take Hitler's opinion as to the Earth's form as negative evidence - as evidence that the Earth is some other form.
posted by Sticherbeast at 7:27 AM on February 14, 2011 [1 favorite]


I see what you're saying but I prefer to take a couple of hours and read up on things. I think you'll find that the reasonable position is to agree with the consensus of climate scientists. Same with the Holocaust. Evidence exists, get familiar with it.

I would disagree with that pretty strenuously. The types of evidence are vastly different. There are reams of documentary evidence and first-person accounts of the Holocaust, whereas climate change is a collection of data from a wide variety of sources which can only be reliably interpreted by someone with significant expertise in the field.

A layman can make a convincing argument that the Holocaust actually did occur, which you can't really say for something like climate change.
posted by electroboy at 7:28 AM on February 14, 2011 [4 favorites]


More?
posted by infini at 7:28 AM on February 14, 2011


More.
posted by Sticherbeast at 7:30 AM on February 14, 2011


I just don't agree with Hanson et al. that there's much hope of untying opinions from deeply held moral intuitions and emotions.

I've heard Hanson on a podcast totally dissing the individualistic response to bias. He basically says this is impossible to do on your own or even in a small community like an academic department or university. He has a couple of nice zingers criticizing "critical thinking," for instance, which is something I myself teach. You can't boil all this down to a list of fallacies and biases to watch out for, which is unfortunate.

His solution (and I've corresponded with him about this) is to institute prediction markets or use services like Intrade. People who do this will still make mistakes, but they're more likely to update on new information more quickly, and there will be less loyalty-signalling and more "honest" profit-seeking. I'm not sure prediction markets are sufficiently well-tested for the tasks he wants to use them for, but the idea is sound and early evidence is good.

Now I have a statement S of unknown truth value, but I do know what each of these people thinks it is.

As I said, on positive terms, this is exactly the right thing to do. Smart knowledgeable people ought to change our minds, unless we're horribly biased. The problem is that you've got to be able to consistently recognize smart knowledgeable people, and most of us only tend to recognize such smart knowledgeable people when they already agree with us!
posted by anotherpanacea at 7:31 AM on February 14, 2011 [5 favorites]
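
For the curious: the standard mechanism behind the prediction markets mentioned above is Hanson's logarithmic market scoring rule (LMSR). A minimal sketch (the liquidity parameter b and the trade below are illustrative, not from the thread):

    import math

    def lmsr_cost(q, b=100.0):
        # LMSR cost function: C(q) = b * ln(sum_i exp(q_i / b)),
        # where q[i] is the number of shares outstanding on outcome i.
        return b * math.log(sum(math.exp(qi / b) for qi in q))

    def lmsr_price(q, i, b=100.0):
        # Instantaneous price of outcome i; prices sum to 1, so they
        # read directly as the market's current probability estimate.
        z = sum(math.exp(qi / b) for qi in q)
        return math.exp(q[i] / b) / z

    def buy(q, i, shares, b=100.0):
        # Cost a trader pays to buy `shares` of outcome i.
        new_q = list(q)
        new_q[i] += shares
        return lmsr_cost(new_q, b) - lmsr_cost(q, b), new_q

    q = [0.0, 0.0]             # two outcomes, no trades yet: market says 50/50
    cost, q = buy(q, 0, 50.0)  # a trader backs outcome 0 with real money
    print(round(lmsr_price(q, 0), 3))  # ~0.622: the price has moved

The incentive structure is the point: the trade only pays off if outcome 0 actually happens, so moving the price is costly unless you believe your own information, which is the "honest" profit-seeking described above.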


Who's going to show this to Glenn Beck?
posted by londonmark at 7:31 AM on February 14, 2011


"You find it hard to be enthusiastic for something until you know that others oppose it."

This one's kind of strange. Why would you be enthusiastic about something if you assumed it was uncontested? Everyone loves hugs, right? I grew up hugging people, it's an everyday sort of thing. I certainly never had a vigorous discussion about the virtues of hugs and what they mean to me until the MetaFilter side-hug thread, where I learned that people actually do oppose hugging.

I'm also not going to city hall to support my right to chew gum unless they're considering banning it, etc.
posted by explosion at 7:32 AM on February 14, 2011


Thanks, I'll give that a listen. I only intermittently follow the OB blog and it struck me as mostly focusing on problems with individual reasoning. I suppose there's some kind of bias at work there...
posted by r_nebblesworthII at 7:33 AM on February 14, 2011


I disagree, strongly, with those who would argue that the Holocaust did not happen, but I'd hold them in higher esteem than someone who argues that the truth is probably somewhere in the middle and maybe it happened but it probably wasn't as bad as everyone claims.

As with almost every issue, there are people who take extreme positions on both sides of the Holocaust in terms of its severity. It probably is not the case that the most extreme estimations of the number of deaths are accurate; the answer probably does lie somewhere in a very broadly-defined middle (though almost definitely way more on one side than the other). It may actually be the case that the person who just unthinkingly splits the difference is more accurate than the people who are taking the maximalist position on it -- if the actual number (just making these up) is say 4 million deaths, and someone on one side claims 10 million and someone on the other side says zero.
posted by empath at 7:34 AM on February 14, 2011


You are uncomfortable taking a position near the middle of the opinion distribution.

In politics and religion right now, there seems to be a fashion for claiming the middle ground for the purpose of scolding everyone else. I've not found the scolds to be especially more open-minded or interested in actual dialog.

You are uncomfortable taking a position of high uncertainty about who is right.

Uncomfortable for which reasons? I'm uncomfortable with the problems of moral philosophy and uncertainty about addressing environmental crises because those are issues that impact what we do next. I can't not act in these matters, and that uncertainty should be worrisome.
posted by KirkJobSluder at 7:40 AM on February 14, 2011 [1 favorite]


My opinion on this will be the opposite of whatever Shepherd eventually decides.
posted by kyrademon at 7:43 AM on February 14, 2011 [1 favorite]


In a forum like this it's kinda hard to ascertain some of these things. I know in my case, hyperbole and "truthiness" are employed by necessity 'cause there's no way I can type something in detail and with the back-and-forth that a face-to-face conversation would entail.

Facts are facts - but conclusions are much more malleable. I personally try to live by the metric that what people think is less important than that they think. If I get the impression that someone is just parroting a position, then all bets are off.
posted by Benny Andajetz at 7:44 AM on February 14, 2011 [1 favorite]


But the question isn't about rejecting Hitler's mere opinion as to the Earth's form. The question is about whether you would take Hitler's opinion as to the Earth's form as negative evidence - as evidence that the Earth is some other form.

No, I wouldn't. Because, as I said, I (for the purposes of this example) don't know that Hitler was worse than average at knowing the shape of the Earth. Whereas I do know that Hitler was worse than average at evaluating other statements, i.e. he is more likely to be wrong than right. Therefore disbelieving a statement of his on those topics merely because he says it is true would make me right more than half the time.
posted by DU at 7:49 AM on February 14, 2011


He went to Delphi and boldly asked the oracle to tell him whether... there was anyone wiser than I was, and the Pythian prophetess answered that there was no man wiser...

When I heard the answer, I said to myself, What can the god mean? and what is the interpretation of this riddle? for I know that I have no wisdom, small or great. What can he mean when he says that I am the wisest of men? And yet he is a god and cannot lie; that would be against his nature. After a long consideration, I at last thought of a method of trying the question. I reflected that if I could only find a man wiser than myself, then I might go to the god with a refutation in my hand. I should say to him, "Here is a man who is wiser than I am; but you said that I was the wisest." Accordingly I went to one who had the reputation of wisdom, and observed to him - his name I need not mention; he was a politician whom I selected for examination - and the result was as follows: When I began to talk with him, I could not help thinking that he was not really wise, although he was thought wise by many, and wiser still by himself; and I went and tried to explain to him that he thought himself wise, but was not really wise; and the consequence was that he hated me, and his enmity was shared by several who were present and heard me. So I left him, saying to myself, as I went away: Well, although I do not suppose that either of us knows anything really beautiful and good, I am better off than he is - for he knows nothing, and thinks that he knows. I neither know nor think that I know. In this latter particular, then, I seem to have slightly the advantage of him. Then I went to another, who had still higher philosophical pretensions, and my conclusion was exactly the same. I made another enemy of him, and of many others besides him.

After this I went to one man after another, being not unconscious of the enmity which I provoked, and I lamented and feared this: but necessity was laid upon me - the word of God, I thought, ought to be considered first. And I said to myself, Go I must to all who appear to know, and find out the meaning of the oracle. And I swear to you, Athenians, by the dog I swear! - for I must tell you the truth - the result of my mission was just this: I found that the men most in repute were all but the most foolish; and that some inferior men were really wiser and better. ...

This investigation has led to my having many enemies of the worst and most dangerous kind, and has given occasion also to many calumnies, and I am called wise, for my hearers always imagine that I myself possess the wisdom which I find wanting in others: but the truth is, O men of Athens, that God only is wise; and in this oracle he means to say that the wisdom of men is little or nothing; he is not speaking of Socrates, he is only using my name as an illustration, as if he said, He, O men, is the wisest, who, like Socrates, knows that his wisdom is in truth worth nothing. And so I go my way, obedient to the god, and make inquisition into the wisdom of anyone, whether citizen or stranger, who appears to be wise; and if he is not wise, then in vindication of the oracle I show him that he is not wise; and this occupation quite absorbs me.
posted by empath at 7:52 AM on February 14, 2011 [4 favorites]


Therefore disbelieving a statement of his on those topics merely because he says it is true would make me right more than half the time.

Hitler is dead and all of his positions have already been established. Let's try a different example: Sarah Palin. Should Sarah Palin's position on something you otherwise have no opinion on make you change your mind?

("Otherwise no opinion" is difficult, but imagine it's something non-political: whether Agassi or Federer was a better tennis player in his prime, or choose something even more trivial that you don't care about if you do care about that.)
posted by anotherpanacea at 7:56 AM on February 14, 2011 [1 favorite]


No, I wouldn't. Because, as I said, I (for the purposes of this example) don't know that Hitler was worse than average at knowing the shape of the Earth. Whereas I do know that Hitler was worse than average at evaluating other statements, i.e. he is more likely to be wrong than right. Therefore disbelieving a statement of his on those topics merely because he says it is true would make me right more than half the time.


A couple of things -- I'd suggest that the vast majority of things that Hitler (and everyone else who has ever lived) believed, you also believe, just by virtue of being human. People just never talk about them, because they are unspoken assumptions about the nature of reality -- things like cause and effect, that we exist, etc.

Secondly, I read Lubos Motl's physics blog all the time. As far as I know, he has a very good reputation as a physicist and does a good job of explaining new physics papers without dumbing down the terminology, and when he says a new paper is no big deal, or wrong, it usually turns out to be the case.

However, he also believes that climate change is a scam and that Obama was probably born in Kenya. Would I know more about physics if, on account of that, I just believed the opposite of whatever he said about string theory?
posted by empath at 7:58 AM on February 14, 2011 [2 favorites]


Should Sarah Palin's position on something you otherwise have no opinion on make you change your mind?

Yes, it should when and only when I have some information on how right Sarah Palin normally is on that topic.

I don't understand why this is so difficult or controversial. Imagine I just flipped a coin. I won't let you see it, but I show it to Sarah Palin. She tells you it is heads. Should you believe her or disbelieve her? You have no information so you can't choose. Now I tell you she lies 75% of the time. What guess, heads or tails, should you make to maximize your chances of getting it right?
posted by DU at 8:03 AM on February 14, 2011 [7 favorites]
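
DU's arithmetic checks out, granting the (cartoonish) premise of a fixed 75% lie rate. A quick simulation, purely to illustrate the logic:

    import random

    def simulate(trials=100_000, lie_rate=0.75):
        correct = 0
        for _ in range(trials):
            flip = random.choice(["heads", "tails"])
            other = "tails" if flip == "heads" else "heads"
            report = other if random.random() < lie_rate else flip
            # DU's rule: always bet against the unreliable reporter.
            guess = "tails" if report == "heads" else "heads"
            correct += guess == flip
        return correct / trials

    print(simulate())  # ~0.75: anti-correlation is as informative as correlation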


Hitler is dead and all of his positions have already been established.

Ohh, that's a problematic issue here, because we don't really have access to Hitler's positions. We have historical and literary interpretations, which is a different beast altogether.

Let's try a different example: Sarah Palin. Should Sarah Palin's position on something you otherwise have no opinion on make you change your mind?

It depends on the issue and the opinion in question.
posted by KirkJobSluder at 8:04 AM on February 14, 2011


I know that question's not directed at me, but I'd say you may well know more about physics if you disbelieved him (or at least downgraded the probability of string theory being true) based on his climate change and Obama conspiracy theories, since those opinions are suggestive of his abilities as a scientist, i.e. to draw the best conclusions based on facts that are easily discovered.
posted by r_nebblesworthII at 8:05 AM on February 14, 2011


don't know that Hitler was worse than average at knowing the shape of the Earth

Looking at this again, I think I see the problem: you're already avoiding the error, in that you recognize that a person's bad judgments don't necessarily correlate with other biases. So a person who is bad on politics may be untrustworthy on other political issues, but not on tennis or the shape of the earth. So you agree with Hanson.

The original line you were commenting on is tied to the purposes of communication: if you only oppose or support things for the fun of arguing them, you'll tend to take positions in opposition to others just so we can have something to talk about and assemble into pro- and anti- teams. But there are a lot of things that are true but don't have a pro- and anti- team ready-assembled. If you're interested in truth, you ought to try to believe true propositions about those banal things, too. If you're interested in signalling loyalty, you'll likely ignore those issues. (For instance, I was only able to become interested in string theory when I read an anti-string theory paper. That makes me a bad researcher, at least in this regard.)
posted by anotherpanacea at 8:07 AM on February 14, 2011


Warning sign 1: you are a human being, not an economist's model of one.
posted by escabeche at 8:08 AM on February 14, 2011 [4 favorites]


I don't understand why this is so difficult or controversial. Imagine I just flipped a coin. I won't let you see it, but I show it to Sarah Palin. She tells you it is heads. Should you believe her or disbelieve her? You have no information so you can't choose. Now I tell you she lies 75% of the time. What guess, heads or tails, should you make to maximize your chances of getting it right?

All jokes and snark aside, Sarah Palin does not lie 75% of the time, and she almost certainly doesn't lie about coin flips where she has nothing to gain or lose either way. Palin doesn't live in one of those Smullyan-esque logic problems where one tribe always lies and one tribe always tells the truth.

I know that question's not directed at me, but I'd say you may well know more about physics if you disbelieved him (or at least downgraded the probability of string theory being true) based on his climate change and Obama conspiracy theories, since those opinions are suggestive of his abilities as a scientist, i.e. to draw the best conclusions based on facts that are easily discovered.

I can't comment on the viability of string theory, but many otherwise brilliant scientists/engineers/etc. do indeed hold crackpot beliefs outside the realm of their expertise. William Shockley, for one. It would be nice that good intellectual habits would carry over into all the parts of one's life, but people are frighteningly good at compartmentalizing their better and worse parts.

It may actually be the case that the person who just unthinkingly splits the difference is more accurate than the people who are taking the maximalist position on it -- if the actual number (just making these up) is say 4 million deaths, and someone on one side claims 10 million and someone on the other side says zero.

It may actually be the case that "consulting the bones" could lead to an accurate result, but it's not a good method for determining facts like this. Six million murdered Jews, give or take, is an accepted number among the vast majority of historians, and they have arrived at this number through accepted historical methods. Someone saying ten million would be obliged to show his or her homework on the matter. Someone saying zero would be obliged to show an absolutely astonishing amount of literally world-changing homework on the matter. Pitching the number between the two figures does nothing to evaluate either claim's veracity.
posted by Sticherbeast at 8:20 AM on February 14, 2011 [2 favorites]


...a person's bad judgments don't necessarily correlate with other biases. So a person who is bad on politics may be untrustworthy on other political issues, but not on tennis or the shape of the earth.

Probably not. Although I may have some theory about their bias, which could let me make predictions for topics they nominally haven't covered before. Like, someone's political affiliation/rhetoric might let you make some guesses on how they'd view charity or libraries or religion questions, etc. Obviously you aren't guaranteed to be right, but it could be an evidence point (depending on what your theory about their bias is).
posted by DU at 8:21 AM on February 14, 2011


Federer > Agassi and it's not even close.
posted by Potomac Avenue at 8:37 AM on February 14, 2011 [2 favorites]


I can't comment on the viability of string theory, but many otherwise brilliant scientists/engineers/etc. do indeed hold crackpot beliefs outside the realm of their expertise. William Shockley, for one. It would be nice that good intellectual habits would carry over into all the parts of one's life, but people are frighteningly good at compartmentalizing their better and worse parts.


Okay, sure. But I don't have access to the truth about whether Shockley is compartmentalizing, or actually is just bad at science. From an outsider's perspective, his crackpot beliefs suggest he is untrustworthy on scientific matters (even though he was apparently brilliant and invented the transistor or something). Since I can't get inside his head, that heuristic is all I have to rely on; it's an evidence point as DU calls it.
posted by r_nebblesworthII at 8:39 AM on February 14, 2011


Like, someone's political affiliation/rhetoric might let you make some guesses on how they'd view charity or libraries or religion questions, etc. Obviously you aren't guaranteed to be right, but it could be an evidence point (depending on what your theory about their bias is).

Again, this is all consistent with basic Bayesianism. We're not disagreeing, except in the following way: you have to have a good reason to believe that we can pick smart knowledgeable people accurately and make accurate judgments as to the scope of their expertise or competence. We have to be good at catching other people's biases. And perhaps we are: most people on Metafilter are probably better at this than people off of it, for education and predilection reasons.

But we've got blind spots. We overuse heuristics. And most people, even Mefites, probably overuse the "enemy/rival" heuristic: once we've identified a rival, we're more likely to attribute bad motives or incompetence to them, and to discount their opinions in a broader scope than we would a friend or stranger. Hell, we're doing it right now even by using Sarah Palin as an example, because we both kind of believe that she'd be more likely to get the coin toss wrong even though neither of us would admit it. Hell, I basically compared her to Hitler by substituting her here!

Given this tendency, the only good protection is some systematic method for catching ourselves when we make this mistake and forcing a correction. Markets, personal data dumps, stickk.com, free speech, fair elections, juries, you name it: we're all trying to find the right method to catch and prevent errors, with varying degrees of success. But the first step is recognizing that we have a problem.

Federer > Agassi and it's not even close.

Well, that's what David Foster Wallace said, so you're probably right.
posted by anotherpanacea at 8:43 AM on February 14, 2011 [2 favorites]


Another problem with "splitting the difference" is that both extremes are often defined by qualitatively different ideological stances, with the middle being closer to one stance. So it's not just a matter of 10 million vs. 0, but a historical perspective that views antisemitism as a key part of Nazism vs. the Holocaust as a hoax that was manipulated by Jewish and/or Communist interests. People adopting the 6 million figure tend to support the view that it did happen, it was violently antisemitic, and it was a horrible thing.

The same is true of anthropogenic climate change. The middling and high estimates come from the perspective that humans are changing the climate. The low estimates come from the perspective that we are not.
posted by KirkJobSluder at 8:49 AM on February 14, 2011


1. You find it hard to be enthusiastic for something until you know that others oppose it.

...

#1 throws me for a loop. I always thought of myself as an argumentative, abrasive, sort of person, but are there really people out there like that? Because if so then, dang, I'm not nearly as bad as I thought I was.

I'm not pro-abortion, but when I learned about everything people have done to take away women's access to abortion, I got more enthusiastic about supporting it. Is it so wrong to be motivated by need, i.e. supporting at-risk institutions and practices that are under attack?

I realize that's not exactly what the point of #1 is. It's claiming you should make your decisions independently (based on evidence and rational analysis) without worrying who or how many people are for/against something.

But there is a counterpoint, I think. Unpopular issues need their supporters to speak louder and do more, because there are fewer people to do the work and more people to inform.

It's like, let's say I really like old pinball machines. I don't need to be a collector or anything, because there are already plenty of people in my area who do that sort of thing and make their machines available to people to try, etc.

However, if there was an anti-retro-pinball-machine coalition out there vandalizing machines and trying to pass laws making them illegal, etc., I'd get a lot more involved.

Then again, you don't need to have censors to support free speech, etc.

In long, I found the list to be a pretty good checklist, but a bit simplistic. (Nice supporting links, too.)

You find it hard to list weak points and counter-arguments on your positions.

This is a great one. Like the interviewee's answer of "Sometimes I can be a perfectionist" to the (bad) question of "What are your weaknesses."

I admit it: If Sarah Palin told me that the 1984 Detroit Tigers were the greatest baseball team of all time, I admit my support for that argument would falter a bit. I have no problem with that.
posted by mrgrimm at 8:50 AM on February 14, 2011 [1 favorite]


You are uncomfortable taking a position near the middle of the opinion distribution.

> This is a weird entry on the list. It seems like an indirect endorsement of the argument from moderation (the "false middle"). Compromise is an enormous part of being a functioning adult, but you have to know when and why to compromise.

> #6 I'm not at all sure is valid, or at least not universally so. Opinions near the middle aren't necessarily right, and on some issues they are pretty much cop-outs.

I'm not sure why this one is controversial. I understand this as a way of saying that people who are 'signaling loyalty' can be prone to all-or-nothing or black-and-white thinking and reluctant to consider "moderate" positions (even if there are valid arguments or evidence for them) because they would be seen as making concessions to the other side. This is not the same as saying that 'positions near the middle of the opinion distribution' are always right.

> In politics and religion right now, there seems to be a fashion for claiming the middle ground for the purpose of scolding everyone else. I've not found the scolds to be especially more open-minded or interested in actual dialog.

"Moderate" positions aren't necessarily wrong either, just because they're moderate. People taking them may in fact be convinced that they're right, and not necessarily just posturing.
posted by nangar at 8:57 AM on February 14, 2011


Pitching the number between the two figures does nothing to evaluate either claim's veracity.

That wasn't my point. My point was that it would be no worse than taking an extreme position blindly.
posted by empath at 8:58 AM on February 14, 2011


The real long and short of it is this:

Are you willing to admit you are wrong: to an individual face-to-face, to many people in public, and most importantly, to yourself?

If there's one litmus test I've used to gauge conversations, friends, etc., it's that one.

I know some people who, even when faced with direct contrary information, will not admit that they were wrong. They will dissemble, make excuses or extenuating explanations, distract, or simply disengage. These are not gray-area issues either; they are often of the "Dustin Hoffman was in Star Wars" variety. I find those people completely unbearable.

I want to tell them to go home, look in the mirror and say, "Sorry, I was wrong," 20 times a day. (It's not that hard!) Instead I just avoid and/or end the acquaintanceship.
posted by mrgrimm at 8:58 AM on February 14, 2011 [6 favorites]


I know that question's not directed at me, but I'd say you may well know more about physics if you disbelieved him (or at least downgraded the probability of string theory being true) based on his climate change and Obama conspiracy theories, since those opinions are suggestive of his abilities as a scientist, i.e. to draw the best conclusions based on facts that are easily discovered.

Well, then you'd have to throw out almost all of mainstream theoretical physics because of a stupid political opinion that one person has. That seems illogical.
posted by empath at 9:01 AM on February 14, 2011 [1 favorite]


Also, I personally love Andre Agassi (my friend and I had a long-standing battle in high school over who would be better, Agassi or Michael Chang. HA!), but he's not in the same league as Federer.

It's Federer v. Laver v. Sampras, imo. Honestly, if I have to take one of them (in their prime), I would take Pete Sampras. I would love to see Sampras-Federer in their primes. ... but what would Laver be able to do with today's rackets?
posted by mrgrimm at 9:01 AM on February 14, 2011


Well, then you'd have to throw out almost all of mainstream theoretical physics because of a stupid political opinion that one person has. That seems illogical.

It's not an either-or thing - I could downgrade my opinion of how likely string theory is to be true without throwing it out completely (assuming all I know about string theory is based on that one guy).
posted by r_nebblesworthII at 9:03 AM on February 14, 2011


Another problem with "splitting the difference" is that both extremes are often defined by qualitatively different ideological stances, with the middle being closer to one stance. So it's not just a matter of 10 million vs. 0, but a historical perspective that views antisemitism as a key part of Nazism vs. the Holocaust as a hoax that was manipulated by Jewish and/or Communist interests. People adopting the 6 million figure tend to support the view that it did happen, it was violently antisemitic, and it was a horrible thing.

The only way to establish the truth of a statement is through evidence, not the ideology or motivations of the person proffering it.
posted by empath at 9:03 AM on February 14, 2011 [1 favorite]


I answered "no" to all twenty. But then I expected that because I'm just the best at holding opinions and everyone else is an uninformed fool.
posted by Decani at 9:04 AM on February 14, 2011 [2 favorites]


The When in Doubt, Shout study is fascinating.
posted by mrgrimm at 9:07 AM on February 14, 2011


(I keep responding to Holocaust examples for some reason. I don't want people to think I'm a Holocaust denier or anything; I'm just playing devil's advocate with a provocative example.)
posted by empath at 9:07 AM on February 14, 2011


nangar: I'm not sure why this one is controversial. I understand this as a way of saying that people who are 'signaling loyalty' can be prone to all-or-nothing or black-and-white thinking and reluctant to consider "moderate" positions (even if there are valid arguments or evidence for them) because they would be seen as making concessions to the other side. This is not the same as saying that 'positions near the middle of the opinion distribution' are always right.

The problem is that "moderates" often take moderation into the realm of black-and-white thinking. Take as examples Rosenbaum's agnostic manifesto, in which Rosenbaum's agnosticism is built on accusations of bad behavior on the part of atheists, or Evan Bayh's sniping at left-wing Democrats.

"Moderate" positions aren't necessarily wrong either, just because they're moderate. People taking them may in fact be convinced that they're right, and not necessarily just posturing.

No one suggested that, and being convinced that you're right isn't the problem addressed by the article.
posted by KirkJobSluder at 9:24 AM on February 14, 2011


But I don't have access to the truth about whether Shockley is compartmentalizing, or actually is just bad at science. From an outsider's perspective, his crackpot beliefs suggest he is untrustworthy on scientific matters (even though he was apparently brilliant and invented the transistor or something). Since I can't get inside his head, that heuristic is all I have to rely on; it's an evidence point as DU calls it.

You do have access to at least some of that truth, however. You know that Shockley was a professor at Stanford and is a recipient of the Nobel Prize, Comstock Prize, and the wonderfully-named Oliver E. Buckley Condensed Matter Prize. You know that he was one of the inventors of the transistor and you know that he was the godfather of Silicon Valley. That is as solid evidence as you will ever see in your lifetime that someone is "good at science," just as it is solid evidence that he was able to be so good at science while also being a cantankerous, alienating, thoroughly racist crackpot.
posted by Sticherbeast at 9:25 AM on February 14, 2011 [4 favorites]


Looking at Shockley's argument from the Wikipedia article, I see an interesting response to Shockley's conclusions from Edgar G. Epps, who argued that "William Shockley's position lends itself to racist interpretations".

It's a bit of a logical fallacy, but not exactly. It's an unfair trap, since he can now be blamed for the actions of someone else who may or may not have heard anything Shockley said. One could say the same about Darwin, as frequently has been done, or about a plan to add birth control to Medicare or some other social service, or even about something like Dungeons and Dragons or World of Warcraft.

I've seen this strategy before in arguments with people who use these loyalty-signalling tactics. It's an odd sort of cover for discouraging discussion: shame by association with topics that may be misused to justify other actions. It severely degrades the opposing argument while not actually criticizing the data and conclusions he presented, so almost anything said after that point gets associated with the worst-case scenario. Although I don't agree with Shockley's conclusions about what to do with the gene pool of humanity, I find that stifling discussion of scientific research because of its possible interpretations is a slippery slope. Should those interpretations be considered upon releasing the findings? Absolutely, but the chilling effect on the scientific community is worrisome.

Is Epps' statement about what could happen with Shockley's conclusions in the wrong hands? Is it about Shockley's intent? Is it a disagreement with the conclusions? Is it a reaction to a taboo subject? The reader will probably pick the one that offends them the most. It could be any of these, AND a dessert topping, too.

The fact that the quote is out of context is important as well, and is part of why I dislike quoted statements like these. Epps could have meant something totally different, or have given more specific clarification of his real meaning. A quote taken without context is as open to the reader's reactions as the statement it responds to. As above, the reader will probably pick the reading that offends them the most and go with it, as I did when I read the quote and wrote this post, focusing on the stifling of scientific discussion rather than on whether Shockley was wrong, was coming from a racist perspective to begin with, or any of a dozen other things.

I feel like I'm chasing my tail now, since by the end of my thought I found I was doing the same thing I was complaining about. I was eager for a real-world example of a statement/argument/fact/belief to work with, and Shockley's research has a lot of potential, as it involves political, religious, scientific, and historical hot buttons that can set off a lot of loyalty-signalling arguments.
posted by chambers at 9:27 AM on February 14, 2011


You do have access to at least some of that truth, however. You know that Shockley was a professor at Stanford and is a recipient of the Nobel Prize, Comstock Prize, and the wonderfully-named Oliver E. Buckley Condensed Matter Prize. You know that he was one of the inventors of the transistor and you know that he was the godfather of Silicon Valley. That is as solid evidence as you will ever see in your lifetime that someone is "good at science," just as it is solid evidence that he was able to be so good at science while also being a cantankerous, alienating, thoroughly racist crackpot.

It sounds like you're right about this... but I believe what I said even more now!
posted by r_nebblesworthII at 9:29 AM on February 14, 2011 [2 favorites]


I just want to clarify that I meant the research has a lot of potential as a model to examine arguments, not that I agree with the conclusions he came to from his research.
posted by chambers at 9:33 AM on February 14, 2011


empath: Perhaps ideology is the wrong word there. My point is that you can't even begin to get an estimate of Holocaust deaths involving millions of Jews unless you adopt the historical theory that Nazi Germany was violently anti-Semitic and created institutions for the mass murder of Jews and other ethnic groups. The difference between 6 and 10 million is one of degree not kind.

This is generally true of historical genocides. Beancounting the exact number of deaths is difficult, especially when you consider early mortality due to brutal economic policies leading to famine and disease in targeted populations. But people who disagree on the estimates usually agree that genocidal practices were a historical fact.

Another example is estimates of the age of the Universe. Once you're measuring the age of things in billions of years rather than human generations, you're in an entirely different theoretical framework. 6 billion and 13 billion share a deep-time perspective that can't be reconciled with young-earth creationism.

The so-called middle ground between these positions isn't numeric, it's an attempt to reconcile irreconcilable worldviews. The middle ground between Holocaust denial and Holocaust history isn't 6 million, it's something along the lines of: The Holocaust was a horrible thing, the importance of which has been greatly exaggerated. The middle ground between YEC and deep-time science isn't 6 billion years. It's old-earth creationism and intelligent-design perspectives.

And of course, as synthesis positions this "middle ground" often inspires a fair degree of closed-minded loyalty and contrarianism. A willingness to take the middle ground doesn't necessarily signal reasonableness.
posted by KirkJobSluder at 9:50 AM on February 14, 2011 [1 favorite]


All jokes and snark aside, Sarah Palin does not lie 75% of the time, and she almost certainly doesn't lie about coin flips where she has nothing to gain or lose either way. Palin doesn't live in one of those Smullyan-esque logic problems where one tribe always lies and one tribe always tells the truth.

Are you sure about that? There's actually some data to support this notion.

But more seriously, of course it's a reasonable assumption that Sarah Palin doesn't behave like an automaton, tending toward dishonesty across all categories of subjects due to some internal bias toward dishonesty. But it's not a necessary assumption, as there are individuals who suffer from particular psychological conditions that compel them to dishonesty (pathological liars, that is). It's not a priori knowable whether Palin is or isn't a pathological liar, versus someone who's ideologically blinkered, or working a particular political angle for personal gain while otherwise acting more or less in good faith. I've always wondered if true pathological liars might not also suffer from an increased tendency toward self-deception.
posted by saulgoodman at 9:55 AM on February 14, 2011


Assuming that Sarah Palin is literally a pathological liar is an enormous leap. Not every dishonest person, or even habitually dishonest person, is a pathological liar, at least according to the definitions I've heard.

My armchair layman's diagnosis, for the very little that it is worth, is that she is a malignant narcissist.
posted by Sticherbeast at 10:05 AM on February 14, 2011 [1 favorite]


Yeah, well, you know, that's just, like, your opinion, man.
posted by pyrex at 10:08 AM on February 14, 2011 [2 favorites]


A willingness to take the middle ground doesn't necessarily signal reasonableness.

You aren't disagreeing with anyone. What the article said is that an unwillingness to take the middle ground *does* signal unreasonableness. The contrary does not necessarily follow at all.
posted by Potomac Avenue at 10:10 AM on February 14, 2011


Palin is an interesting example for me. If she said "My name is Sarah", I would tend to disbelieve her until I could check it out.

The fact that she is ideologically repugnant to me doesn't help her cause, but that's absolutely not the reason I have a kneejerk distrust of what she says. Like I said above, it's not because of what she thinks, but that she has demonstrated to my satisfaction that she doesn't think. She's a parrot, and, in my mind, deserves to be treated like one.
posted by Benny Andajetz at 10:12 AM on February 14, 2011 [2 favorites]


The so-called middle ground between these positions isn't numeric, it's an attempt to reconcile irreconcilable worldviews

What confounds the problem even more is that in some cases both sides can look at the data and believe that they have proved their argument.

For example, the sun vs. the earth as the center of the universe, 16th-17th century style.

Without the necessary equipment to observe in proper detail, and using only the evidence available at the time, one could ask back then, "What would the universe look like if the earth were the center of the universe?" Well, exactly the same as it looks with the Sun at the center. We now understand why that model fails, but when the naked eye is your only tool, the movements could be calculated, the known aberrations of Mercury and Mars accounted for, and the model Europe had, though complicated, worked for the most part. Only when the telescope came in as a tool, and the moons of Jupiter could be seen by anyone who had one, did the observable facts pile up against Aristotle's crystal spheres and an Earth-centric universe. Of course the implications went further than that: if Aristotle was wrong about this, what else was he wrong about? Some of the strongest resistance came from fear for all the other traditional doctrines the Catholic Church had based on Aristotle's ideas. Could they be wrong too? Then you have anarchy and the world you knew doesn't make sense! And we are back to the fear of one's worldview being wrong, a fear greater than whatever the argument is about, and the elephant in the room.

In the same way, only when we have a mountain of technological advancements and the ability to actually go further out from this planet, and actually show people what the universe is really like, will those with absolutely entrenched beliefs about the nature of the universe slowly adapt to what the universe actually is.

In regards to the Holocaust, as much data and evidence of its occurrence must be preserved as possible, which of course has been ongoing since the event. Future generations will take care of the rest, for good or ill. Those who deny it today are unlikely to change, and those who document the event are recording the numbers and stories as accurately as possible so that future generations will have an easier time making the argument than we ever will.
posted by chambers at 10:22 AM on February 14, 2011


His solution...is to institute prediction markets or use services like Intrade. People who do this will still make mistakes, but they're more likely to update on new information more quickly, and there will be less loyalty-signalling and more "honest" profit-seeking. I'm not sure prediction markets are sufficiently well-tested for the tasks he wants to use them for, but the idea is sound

That's not at all clear to me. In fact, I would venture to suggest that a prediction market is, at least in some circumstances, far likelier to amplify bias than reduce it, because of Keynes's beauty-contest effect -- in other words, it's a situation in which you guess correctly when you guess the result that the majority also guesses.

All of these attempts to bludgeon your way through to absolute unbiased truth on an individual level are a case of opening the box with the crowbar you will find inside it....
posted by Diablevert at 10:30 AM on February 14, 2011 [2 favorites]
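
Diablevert's worry is easy to make concrete. In Keynes's classic "guess 2/3 of the average" game, each player's best move depends only on what the others will guess, so iterated play converges on a consensus rather than on any external fact. A toy sketch:

    import random

    def beauty_contest(rounds=20, players=100, factor=2/3):
        # Everyone starts with an arbitrary opinion in [0, 100].
        guesses = [random.uniform(0, 100) for _ in range(players)]
        for _ in range(rounds):
            avg = sum(guesses) / players
            # Each player best-responds to the expected crowd,
            # not to any outside evidence.
            guesses = [factor * avg for _ in guesses]
        return sum(guesses) / players

    print(beauty_contest())  # approaches 0: pure consensus-chasing

A prediction market escapes this only insofar as it eventually settles against reality; for questions that never get a clean resolution, the beauty-contest dynamic is a real risk.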


Potomac Avenue: You aren't disagreeing with anyone. What the article said is that an unwillingness to take the middle ground *does* signal unreasonableness. The contrary does not necessarily follow at all.

Actually, now that I re-read it, it's "You are uncomfortable taking a position near the middle of the opinion distribution." And that strikes me as just a weak argumentum ad populum. I'm uncomfortable with the middle of the population distribution on gay rights, which can be summarized as: no marriage, limited domestic partnerships, and protection from discrimination. Comfort with the middle ground should depend a lot on exactly what kind of middle ground is being advocated.
posted by KirkJobSluder at 10:46 AM on February 14, 2011


The premise here rubs me the wrong way a little, in that it feels like trying to satisfy our relatively modern need to quantify things that aren't necessarily quantifiable.

Sure, you can improve the odds of determining whether something is factual or not if you just plug the available information into the correct equation. But consensus can lead you in the wrong direction, too. If you were to poll everybody about, say, why the Battle of the Alamo was fought, the median (moderate) responses and beliefs would be wrong.

Don't forget, either, that many of the world-changing ideas that we accept today were devised by heretics and outliers. There is no replacement for doing your own homework if something is important to you. Anything else is posing, really.
posted by Benny Andajetz at 10:47 AM on February 14, 2011 [2 favorites]


But, KirkJobSluder, that’s the whole point. The argument isn’t that it’s unreasonable to refuse to take the middle ground on any one specific position, it’s that it’s unreasonable to refuse to take the middle ground on any position at all -- no matter what the position is.

What’s being discussed is not the reasonableness or unreasonableness of any given “middle ground” position; obviously, some of those are unreasonable. What’s being discussed is a near-pathological dislike of not being on one extreme side of any argument, a state that comes into being when someone refuses to acknowledge that any opponent they might have might have any point or justification for any of their positions.
posted by kyrademon at 11:07 AM on February 14, 2011


What’s being discussed is a near-pathological dislike of not being on one extreme side of any argument, a state that comes into being when someone refuses to acknowledge that any opponent they might have might have any point or justification for any of their positions.

My understanding of the point the list is making is that it's not so much that one dislikes the middle position because one does not like to be closer to one's opponent, but that the extreme positions more easily demonstrate adherence to one's own faction. Vociferous support of the edge position therefore signals integrity and fidelity to the most inside of the in-group to which one is pledging, and power to the overall group set -- i.e. I have enough resource (energy, time, passion, money) to expend some portion of it on this demonstration of my overall value to the set (I care a lot about this, and so do you) and of my power within it (I care a lot more than some of you).

This behavior exists without relation to the particular validity of any given position, which means it also happens in relation to beliefs that are demonstrably mainstream or uncontroversial. The actual belief doesn't matter; it's the hoisting of the tribal flag that is important, and in some ways the proving of one's worthiness to bear the standard. They're best understood not as statements of opinion but as plays of power and status to the coded culture. Any opponent to which one is reacting is merely a foil or mirror against which to demonstrate one's virtue more easily.
posted by Errant at 11:36 AM on February 14, 2011 [3 favorites]


You can't boil all this down to a list of fallacies and biases to watch out for, which is unfortunate.

From what I recall reading, it's actually even worse than that.

One bias people have is that, when in a debate, they subject the arguments of their "opponent" to more scrutiny than their own arguments. For example, when we see a flaw in our own arguments, we want to fix it so that we can reach the same (or at least a similar) conclusion via a similar (or at least another) argument which lacks the identified flaw. When we see a flaw in our opponents' arguments, we want to declare them wrong, declare ourselves the winner, and quit.

So what happens when smart but still imperfect people learn about all these common biases in the human mind and fallacies in human arguments? We get to see lots more flaws in everybody else's arguments! Any reduction in the rest of our biases is then overwhelmed by the increase in the "argument from fallacy" fallacy.

...

Postscript: First I wrote the above two paragraphs using third person plural pronouns rather than first person plural. Then I thought to myself about how sad it was that there are many classic types of bias right on this very page, yet none I wanted to point out for fear of causing offense. Only finally did I realize that I'd just written the perfect meta-example myself: the first draft of my comment about biases about biases was itself grossly biased.
posted by roystgnr at 11:50 AM on February 14, 2011 [4 favorites]


Any reduction in the rest of our biases is then overwhelmed by the increase in the "argument from fallacy" fallacy.

It's possible to overstate these effects, but there's definitely data that people selectively engage in "motivated skepticism" (pdf) when it comes to counter-arguments and evidence.
posted by anotherpanacea at 12:04 PM on February 14, 2011 [1 favorite]


> [Hanson's] solution (and I've corresponded with him about this)

17. You are especially eager to drop names when explaining positions and arguments.
posted by Marla Singer at 12:48 PM on February 14, 2011 [1 favorite]


Nice.
posted by anotherpanacea at 2:06 PM on February 14, 2011


I was just joshing around. Was that not clear? :(

Let's see what Joe Beese thinks. You'd have to be an idiot to not agree with Joe Beese.

posted by Marla Singer at 2:24 PM on February 14, 2011


I understood you were joking! But it's important to recognize that we all fall into these traps, even people who are aware of the trap. One of my particular faults is to use proper names as a shorthand for explanations (Foucaultian approach to history, Hegelian solution to the is-ought gap, Socratic method, Aristotelian virtues), so you made a nice catch!
posted by anotherpanacea at 2:41 PM on February 14, 2011


What’s being discussed is a near-pathological dislike of not being on one extreme side of any argument, a state that comes into being when someone refuses to acknowledge that any opponent they might have might have any point or justification for any of their positions.

The list isn't framed that way, which is why I question the inclusion of that criterion. And I think it's rubbish from a social psychology perspective on groupthink, where one of the seminal case studies (the Challenger launch decision) centers on how engineers were pressured to moderate their opinions of solid rocket booster performance in cold weather.

And that's not getting into the problem that "discomfort" is ambiguous and vague. I feel discomfort at a rock in my shoe, I feel discomfort with the lack of a solution to the AIDS crisis, and I feel discomfort reading self-identified agnostics. That's three very different things. Only one of those things is about signaling loyalty, and as the post suggests, that's not always a bad thing.
posted by KirkJobSluder at 2:41 PM on February 14, 2011


Something tells me that the people who should read this list will never get around to reading this list.

So here's my question: is this signaling bad? It's certainly framed in a way to get people riled up -- loyalty vs. truth, WAKE UP SHEEPLE -- but the social animal necessarily discerns power structure and pertinent strategy during interaction with the group. That is the whole point of game theory.

This list, these arguments, don't have anything to say about whether that strident guy over there is "right" or "telling the truth"; what they are saying is that being right and telling the truth are not the paramount concerns. We can easily hear that as "they'll lie for benefit", but I'm not sure truth or lies really come into it, except inasmuch as one is willing to use the coded hyperbole of the group to elevate their own fidelity to a paragon level. That doesn't mean they don't believe what they're saying, it just means they might not believe it as much as they appear to claim to, and in the gap between appearance of claim and actual belief is the game-theoretical "move".

But why do we interpret stridency as a testimonial to strength of belief? We have these signaling cues built into us; we recognize instinctively the expenditure of resource and react to it the way any social creature does. Some of us observe the trappings of power and move in that direction; some of us observe the opportunity to stake a claim to power and move in the opposite direction, summoning our own coteries. Other people claim to reject both poles and move in a third direction, thus following the essential multidimensionality of the power relationship: resource and territory, wherever one can find them. Why else does MSNBC rebrand itself as the progressive alternative to FNC, while CNN subsequently claims the moral authority of the neutral objectivist? Power, resource, and territory, on an amorphous semiotic grid, virtually all the time. We're lucky to find little bits of truth as often as we do, if we even do, amidst all that.

But I don't think that makes this signaling "bad", nor does it indicate that one following the coded conventions of game-theoretical signaling is disingenuous. They're just playing a different game, and in many ways a superior one to the pursuit of truth: at least in this game, they can tell when they're winning.
posted by Errant at 3:53 PM on February 14, 2011
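
For anyone who wants the game-theoretical reading pinned down, the "expenditure of resource" account above is essentially costly signaling. A minimal sketch in Python, with stylized payoff numbers that are assumptions for illustration only:

    BENEFIT = 10  # status gained within the group by sending the signal
    COST = {"committed": 3, "pretender": 12}  # assumed cost of the display, by type

    def sends_signal(member_type):
        # A rational member sends the signal only when its benefit exceeds its cost.
        return BENEFIT > COST[member_type]

    for member_type in ("committed", "pretender"):
        print(member_type, "signals:", sends_signal(member_type))
    # committed signals: True; pretender signals: False.

The only thing doing any work here is the cost asymmetry: if pretending were as cheap as sincerity, the display would carry no information, which is why the hoisting of the tribal flag tends toward the expensive and the extreme.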


my question: is this signaling bad?

Bad only in that it results in a stalemate, where neither side is able to agree with the other, or even to understand the other's position. Why?

Because each side is having a completely different argument, and neither is fully conscious of it. For the one who uses loyalty signaling, their mind has joined ranks not just with the point of discussion in question, but with almost every value and lesson that person holds dear. An attack on one part of the system flags the whole system as being under attack. The argument is no longer about one single issue, but about a need to defend or justify their entire worldview. The one who doesn't use those flags is on a fool's errand arguing with someone who does, and both sides will just end up believing that compromise with the other is impossible. It reinforces each side's negative stereotype of the opposition. Both are spinning their wheels, and not much good will come of it.

It is possible for arguments to be productive, and persuasive, for both sides. It requires a change in tactics for the non-loyalty-signaling side: carefully laying a bit of groundwork at the beginning so that the loyalty-signaling side can separate out a single concept for discussion, and not start off from a protected, defensive position. This gives both sides a level of abstraction to work with, and helps reduce the miscommunication and increase the chance that both come away with at least a modicum of understanding of the opposing side.

It's not about tricking or conning the other into 'winning'; it's about getting both parties onto the same field of play, where the ideas and merits of either side can actually compete. Otherwise arguments of this kind will be about as effective as a jousting match where both the horses and the riders are blind. Look at the arguments on most TV news debates: fun to watch for a while, eventually frustrating, and everyone goes home unsatisfied.

This process isn't quick or easy, and it's certainly not a magical cure for disagreements, but knowing and recognizing these signals is a very helpful tool for having a good back-and-forth argument.

You rarely see discussions like that on TV: it's too risky for either side to show even the slightest sign of agreement in the abstract, it requires a lot of active listening from the viewer, and it is very sound-bite-unfriendly. But when it works, real, actual discussion can happen. It's much easier to do this on the web, in forums like MetaFilter, where one can flesh out one's points and reasoning. Additionally, a user here has all these other members, the moderators, and MetaTalk to keep things from going off the rails.
posted by chambers at 5:55 PM on February 14, 2011 [1 favorite]


chambers: I guess the question is how you have a pluralistic society without limited forms of loyalty signaling, especially when it comes to issues where the truth value of a proposition is largely irrelevant.

Let's use genre communities as a relatively neutral frame of reference. Science fiction fans share a belief that science fiction is valuable. Mystery fans share a belief that their fiction is valuable. How they define that value differs within and across the community, but as a general case, that's what unites them. Furthermore, that value is largely subjective and intersubjective (socially constructed). There's no body of evidence that allows us to objectively define value.

What loyalty signaling does is establish a common frame of reference within a community of practice. This isn't necessarily hostile to outsiders (although with communities of resistance, such hostility is likely justified). So this allows more advanced conversations to take place. (Metafilter does it.)

Because we're talking propositions that are largely subjective and intersubjective here, it strikes me as perfectly legitimate to say, "I'm a science-fiction fan. I reserve the right to define that on my own terms." In a pluralistic discussion, it should be reasonably possible to say "I am a ___" and yet have a civil and productive discussion in which that statement of identity is taken for granted as a starting point.
posted by KirkJobSluder at 6:44 PM on February 14, 2011 [1 favorite]


To grind this axe further down, one of the things I find extremely frustrating is discussions that run along the lines of:

A) "I'm ..."
B) "No you're not, because ..."

Note that B is often launched as something of a pre-emptive strike. The discussion really can't go anywhere after that beyond beanplating individual meanings of subjective and intersubjective statements of identity.

Having done sexuality education for a few years -- which included sitting in front of an audience and letting people who had never met a queer person ask you stupid questions -- I found the best discussions went something along the lines of:

A) "I'm ... "
B) "What do you mean by that?"
C) "How does that relate to your politics/religion/family/relationships ... ?"

(Even better when we had people who disagreed with each other on a panel.)
posted by KirkJobSluder at 7:51 PM on February 14, 2011 [1 favorite]


All jokes and snark aside, Sarah Palin does not lie 75% of the time, and she almost certainly doesn't lie about coin flips where she has nothing to gain or lose either way. Palin doesn't live in one of those Smullyan-esque logic problems where one tribe always lies and one tribe always tells the truth.

No, but she most certainly is part of a tribe.

In general, people express opinions based on their own self-interest. This self-interest may not be hostile to my well-being. However, when it's been demonstrated that someone's self-interest may mean they are hostile to me, it would be stupid of me to ignore that fact.

That's not being close-minded or illogical. That's basic threat evaluation. Truth comes after survival, unless you're a robot.
posted by winna at 9:06 PM on February 14, 2011
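
As a footnote to the coin-flip aside: Bayes's rule puts a number on what a report from a biased source is worth. A quick sketch in Python; the 75% lie rate is just the joke's premise upthread, not an estimate of anyone's actual behavior.

    def p_heads_given_says_heads(p_lie, p_heads=0.5):
        # P(coin was heads | source reports heads), for a source who
        # reports the opposite of the true outcome with probability p_lie.
        p_report_if_heads = 1 - p_lie
        p_report_if_tails = p_lie
        numerator = p_heads * p_report_if_heads
        return numerator / (numerator + (1 - p_heads) * p_report_if_tails)

    print(p_heads_given_says_heads(0.75))  # 0.25 -- believe the opposite
    print(p_heads_given_says_heads(0.50))  # 0.50 -- the report is pure noise

A source known to lie three-quarters of the time is still informative (you bet the other way); one who lies exactly half the time is worthless. A known bias is itself usable information, which is the threat-evaluation point in miniature.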


Truth comes after survival, unless you're a robot.

Or a romantic. (Funny how the extremes meet sometimes.)
posted by saulgoodman at 9:30 PM on February 14, 2011


So here's my question: is this signaling bad?

Doesn't it depend on whether we're allowing ourselves to be deceived about important facts? Here's an example you probably recognize: many conservative Christians have to disbelieve in global warming because that's how they signal their commitment to a loving benevolent God. But they won't say that up front: they'll claim there's not enough evidence to decide, or that scientists are biased, or that...

But there are also examples you would probably quibble with, just like the conservative Christian climate-skeptic would. I don't even want to list the things that you and I probably believe as a way of signalling loyalty and ability, because people get fighty when their loyalty-sustaining beliefs are questioned. But here are some things that I think I believe too strongly: that there's definitely *not* a God, for instance. (Rather than *probably* not.) That a college education is good for almost everybody. (A belief I'm growing suspicious of as the evidence mounts.) That Obama is basically a well-meaning progressive liberal forced by political necessity to keep Guantanamo open and lower taxes on the rich. (Ok, I'm pretty agnostic on this question anymore; but I used to believe it strongly!) That we can stimulate our way out of the recession. That deficits don't really matter.

I don't really want to start a fight on liberal platitudes that mirror conservative ones. So even here I'm cautious about which of my beliefs I submit as possibly over-estimated in truth value given what I know about bias and heuristics. So that's a cost to loyalty-signaling. I think you're right: it doesn't have to matter, except when the loyalty-signalers end up voting or purchasing or making other important decisions under conditions that amount to self-deception: the elections and decisions that emerge won't be as likely to be accurate in those cases. So it doesn't have to matter, except when the truth matters. Often, the truth doesn't matter, much. But sometimes it does! Yet we usually stride into those truth-making moments with the same commitments and attitudes we have when we're showing people we like them by parroting their opinions.
posted by anotherpanacea at 7:23 AM on February 15, 2011


Doesn't it depend on whether we're allowing ourselves to be deceived about important facts? Here's an example you probably recognize: many conservative Christians have to disbelieve in global warming because that's how they signal their commitment to a loving benevolent God. But they won't say that up front: they'll claim there's not enough evidence to decide, or that scientists are biased, or that...

Sure, on the other hand, my father has a belief in the beauty of ecclesiastical English Handbell music which can also be identified as a loyalty-signaling belief. Is that necessarily a self-deception, given that the values in question are subjective and intersubjective?
posted by KirkJobSluder at 7:52 AM on February 15, 2011


Now you're asking whether a person can be wrong about their aesthetic values. Clearly, that's a tough question, but I think we can at least make sense of the claim that one could be wrong about such things. I mean, you probably can't be wrong about your preferences: your father is the best judge of what he likes. But since liking is a complicated judgment, it's possible that your father is wrong about some of the constituent parts of his liking -- the evaluations of (for instance) the originality of a composition or the mastery of a performer -- and in that sense self-deceptive. But there are other ways in which a particular preference could be mistaken as well.

A little background: there are two kinds of error theories. The standard kind of error theory simply tries to explain characteristic biases and mistakes, and is generic across the disciplines. You need an error theory to explain the way the meniscus affects measurements of liquid volume, for instance.

But in philosophy, and specifically in value theory, there's an error theory that says that judgments about some matters are mistaken merely because those making them believe that there's a fact of the matter when there really isn't. If there really isn't a fact of the matter about the best kind of music, then believing that your preferences are justified is itself a kind of error. It ignores the contingency of those preferences, which could just as well have been otherwise. And it ignores the universality of aesthetic preferences: by liking something, I implicitly believe that it is likeable, that others ought to like it as well, and that they are mistaken when they disagree.

I personally argue that this second kind of error theory is used overmuch. On my view, there is always a truth-maker for any judgment, when that judgment is correctly formulated. (And when it's incorrectly formulated, then we're in error until we formulate it correctly.) So for instance, we might be wrong to say that a kind of music is objectively the best, but not incorrect to say that because we grew up with it, this is the music we enjoy the most.

But in many cases, just because I'm wrong doesn't mean that the truth matters very much. I rather suspect this is the case with ecclesiastical English Handbell music, although I'm reminded that many fans of instrumental music also evince a horrible snobbery when it comes to other people and the music they prefer. So if your father is like Theodor Adorno, whose love of certain kinds of music caused him to detest and harshly judge the emerging genre of jazz and all who participated in it, then yes, I think he could be wrong. But notice, it's not the preference itself that goes wrong: it's the incorrect judgment about the scope of other things that must be true for our preferences to be justified.
posted by anotherpanacea at 9:17 AM on February 15, 2011


anotherpanacea: And it ignores the universality of aesthetic preferences: a) by liking something, b) I implicitly believe that it is likeable, c) that others ought to like it as well, d) and that they are mistaken when they disagree.

It's the gap between b, c, and d that I'm finding to be problematic here. I don't like the music of Bela Bartok. I understand the music of Bela Bartok enough to recognize what he's doing in his string quartets. I like Arvo Pärt and Jackson Pollock, but I understand that many people don't. I like sex with men. Heterosexual men and lesbian women are certainly not mistaken if they say they don't.

The fact that many people do improperly jump from a) and b) to c) and d) doesn't mean that descriptive statements of preference and identity demand prescriptive statements as well.
posted by KirkJobSluder at 10:10 AM on February 15, 2011


So it doesn't have to matter, except when the truth matters. Often, the truth doesn't matter, much. But sometimes it does! Yet we usually stride into those truth-making moments with the same commitments and attitudes we have when we're showing people we like them by parroting their opinions.

Certainly, and I don't think we're disagreeing on the relative value of truth or truthfulness in proposition. It just strikes me that the arguments presented frame the conflict as loyalty vs. truth, when it seems much more accurate to say loyalty vs. doubt. We're conditioned by post-Enlightenment thought to think of doubt and skepticism as necessary governors on the search for truth; we value the ability of a person to question their own beliefs and look askance at someone who does not seem to. But I think it's too easy to take proof of doubt as proof of good-faith argument. Someone can doubt their propositions, come to a conclusion, and be wrong; someone can wholly and stridently align with their propositions without questioning them and be right (for given values of right and wrong, of course, which are possibly outside the scope of this debate). But we don't tend to believe that someone is right -- or I guess I should say forthright, or righteous -- without proof of doubt, and we mistrust any argument that seems to come from a place of absolute certainty; at least, those arguments that don't align with our own, of course.

I guess I just wonder if overt evidence of doubt as a precondition for earnestness (viz. "I have struggled long and hard with this question, but finally I come to realize that X") is not in itself a kind of signal to the skeptically-trained.
posted by Errant at 12:44 PM on February 15, 2011


I just wonder if overt evidence of doubt as a precondition for earnestness (viz. "I have struggled long and hard with this question, but finally I come to realize that X") is not in itself a kind of signal to the skeptically-trained.

Well, there are lots of kinds of signals. Some are truth-tracking and some self-deceptive; some are honestly self-deceptive and some are dishonest (other-deceptive); some are costly and some cheap. Saying "I've struggled long and hard with this question" is certainly intended to signal something, specifically expertise or insight. If someone has honestly struggled in that way, that's not proof that they're right, but it is a good indicator that they've got something worth saying. There will always be folks who abuse this kind of language to dishonestly signal ability (Glenn Beck struggles with things all the time, allegedly). But it doesn't seem particularly self-deceptive. Long years of study, combined with humility and fallibilism, look like the kinds of signals that are both costly and difficult to fake. Why should I believe you have insight into something (climate change, say) if you can't respond to counter-arguments? "Struggling with something" is an expensive but easily evaluated signal that you can respond to objections and counter-arguments, because you've thought about them before on your own. Since it's so easy to test, it's hard to fake! And since it's hard to fake, it's more likely to estimate truth.
posted by anotherpanacea at 3:14 PM on February 15, 2011 [1 favorite]

