"Facts just twist the truth around, facts are living turned inside out."
March 3, 2017 5:27 PM

Why We Believe Obvious Untruths. "The situation is vexing because it seems so easy to solve. The truth is obvious if you bother to look for it, right? This line of thinking leads to explanations of the hoodwinked masses that amount to little more than name calling: “Those people are foolish” or “Those people are monsters.” Such accounts may make us feel good about ourselves, but they are misguided and simplistic: They reflect a misunderstanding of knowledge that focuses too narrowly on what goes on between our ears."
posted by storybored (22 comments total) 17 users marked this as a favorite
 
So. I don't KNOW the earth orbits the sun, because I myself cannot articulate or express all the observational data and physics that fully shows and explains it. I only "know" because someone else knows all the details. I know because they know. They seem to claim in this article that knowing is only a communal property and not a quality of an individual. I'm glad I know that someone else knows that this article is probably on the weak end of being worthwhile and insightful.
posted by njohnson23 at 5:39 PM on March 3, 2017 [3 favorites]


I think a recent feature on Dan Sperber offers a nice complement to this piece. He also co-authored a paper [PDF] that appeared previously on Metafilter: "Reasoning is generally seen as a means to improve knowledge and make better decisions. However, much evidence shows that reasoning often leads to epistemic distortions and poor decisions ... Our hypothesis is that the function of reasoning is argumentative ... A wide range of evidence in the psychology of reasoning and decision making can be reinterpreted and better explained in the light of this hypothesis."
posted by Wobbuffet at 6:02 PM on March 3, 2017 [4 favorites]


Ooh, the limits of knowledge and understanding? Let's see what they have to say!
Recently, for example, there was a vociferous outcry when President Trump and Congress rolled back regulations on the dumping of mining waste in waterways. This may be bad policy, but most people don’t have sufficient expertise to draw that conclusion because evaluating the policy is complicated. Environmental policy is about balancing costs and benefits. [...]
Sorry, but fuck these authors and their prevaricating nonsense five ways from Sunday. If your nice philosophical point is best demonstrated by trying to excuse blatantly unethical behavior, maybe we should take a closer look at why you're making it in the first place. Someone called in a bomb threat to my four month old nephew's day care this week, but, you know, do we even know what, like, facts are, man? Maybe we're all just brains floating in jars on the dark side of the mooooooon.
posted by phooky at 6:54 PM on March 3, 2017 [39 favorites]


I think the example of dumping waste into waterways is a terrible choice. The writers should have chosen one that better fit their argument. Overall the piece was kind of useful to me, but I don't understand how the conclusion, that we should demand nuanced policy from our leaders, actually fits with what they've been saying. Apart from the fact that some of us have been demanding simply that our members of Congress show up at a town hall meeting in person, and have met with failure.
posted by Bella Donna at 7:13 PM on March 3, 2017 [1 favorite]


Also, where was the "obvious" part of the untruths? There was no example of anything actually untrue, except the phenomenon the researchers made up, which people were willing to believe. In summary, for me this post is a land of contrasts.
posted by Bella Donna at 7:16 PM on March 3, 2017


I'm also pretty comfortable with classifying antivax and climate change denial and whatnot as "foolish" or "monstrous," honestly. There's, like, one extra step involved in evaluating the quality of one's sources before one decides which ones are trustworthy.

I might not be able to fully describe all of the history and astronomical data that led us to conclude that the sun was a star around which our planet orbited, but I've certainly read several accounts of the process. I understand how they reach their conclusions and am therefore comfortable endorsing the theory.

I mean, I understand the general thrust of the article, which is that we rely on experts to understand things in detail and tend to conflate what we learn from trusted experts with our own understanding. But this does not strike me as a particularly penetrating insight, and it does not seem to have much explanatory power for "why people believe untrue things." It seems to me that this particular problem stems from the fact that people prefer information that supports what they already believe. In an environment in which it is easier than ever to claim expertise and obtain a wide platform for distributing a message, it becomes more difficult to identify the actual experts amid the various charlatans and lunatics, and much easier to simply choose an "expert" who says what you want to be true and not bother to actually perform the work of comparing sources.

Which in turn is intellectually lazy and, yes, "foolish." *shrugs* I mean, not everyone needs to be an expert, and if someone is busy with their life and doesn't have time to investigate, say, antivaccine nonsense fully, then I don't necessarily blame them for that. I will blame them if they then argue from ignorance against the actual expert opinion because they don't want to expend the effort to research properly or experience the discomfort of learning that their prior assumptions might have been incorrect.
posted by Scattercat at 7:46 PM on March 3, 2017 [2 favorites]


The earth going round the sun seems like a bad example. Practically everyone reading this kind of article has had an education which presented them with all the facts and evidence necessary to form a pretty solid opinion about it, which is easily reinforced if you happen to look up at the stars or think about gravity. As for smoking causing cancer, plenty of us have seen that folk wisdom confirmed by experience, if only second- or third-hand: maybe like me you had an aunt who died of it.

The "mining regulations" thing comes pretty close for me. It's exactly the sort of thing it takes a little bit of effort to avoid jumping to whichever seems the obvious conclusion based on precedent and prejudice. Having formed a wrong opinion from the start it might be easy to wander along mentally defending it to yourself, then expressing it to others, then really believing it. Reading the linked NYT article about it doesn't give enough information to decide either way on this one if you think about it carefully and honestly. You might trust the opinions of others, you might trust your own prejudices when it comes to anything Trump does, but it takes slightly more to have actual knowledge of the thing. Unfortunate that it doesn't take much more in this case, there's no need to do any deep research or thinking to see that the evidence is very one-sided in this case and your knee-jerk reaction was correct if your values are anything like typical. There's no shortage of more questionable regulations they could've picked instead.

The description of people's beliefs about the glowing rocks doesn't seem exactly to the point either. A natural interpretation of the result they describe is that people rate their own understanding higher if they're told it's not a mystery to science because they assume that means it's one of the glowy things they've already heard scientists explain, like radium paint or bioluminescence. If they're told the result is being kept secret for unknown reasons, of course it appears less likely to be one of those already-known things. I don't know what the real story is with this series of glowing rocks experiments, but what's presented in this article isn't enough to make the point it's trying to make.

After all that, having failed to get hold of any real examples, it just reinforces the thing it's trying to disprove: the idea that "those people" are not well equipped to separate fact from fiction, but we NYTimes readers certainly are.

And that is my effort to think critically about what I've read; I heard somewhere that's a good thing to do.
posted by sfenders at 8:12 PM on March 3, 2017 [1 favorite]


Huh. I don't get the pushback on this essay, which strikes me as both true and important, though perhaps also rather obvious to anyone who's thought about it before. The gist of the argument isn't in supporting weak or secondhand knowledge, much less Trump's policies, but in attempting to explain why some people will take what the Trump regime says as a likely truth while others will take it as a destructive lie.

The essay is limited by not offering any suggestion of how best to assess truth or how to move people from one view to another, or even suggesting, other than mildly, which "side" is right in the mining waste issue, so I can see why some may read it as somehow accepting or endorsing that perspective. But the tone strikes me as that of someone trying to avoid specific disagreements in order to address what the writers see as the larger issue shaping our responses to ideas in general. It's easy, of course, to point to some belief or truth we will all agree on and hold it up as an obviously correct value that anyone who disagrees with is in error about, but even many scientific truths of today have not always been so self-evidently correct, as the knowledge of the time held sway over belief and favored other knowledge as true.

This isn't to say that truth is relative, but that our knowledge of it is limited by our own interactions and connections. Our shared knowledge is always limited by who and what we know, directly or through secondary sources, and by how capable we are of assessing the veracity and meaning of the information we are provided. I, for example, believe climate change poses a real and grave threat to the planet due to the amount of information I have read from a large variety of sources over many years, but my assessment of that information is only weakly connected to the actual scientific facts, as I've not examined them directly myself, nor do I have the background necessary to test that info for support or to counter it.

Instead I rely on the sheer numbers of people who hold positions of significance in that field to assess the data for me and report their findings accurately, which I trust backs up their claims strongly enough to be considered meaningfully accurate to the best of their knowledge. It's relying on the strength of numbers of supporters and my trust in the work they do as validated by accomplishments in other areas that have proven verifiably true previously. (An example of verifiable truth being things like landing a man on the moon, or developing cures for disease; things that have visible proof of their existence and success.)

Now, while I may prove to be in error in placing my belief in numbers and past history, in instances where new, previously unknown information arises or where a minority assessment eventually comes to seem the more accurate, others may not have had the same access to information I have, or have placed their trust in different sources, or have different biases, so they will hold as true something at odds with my perspective. The importance of this is in how we, as larger collectives of individuals, band together around our beliefs and sources of trust and information, and how we use those sources and challenge the ones opposed to them. While those here will likely almost completely agree on things like the scientific method as a foundation of how we assess factual information, we still must rely on some level of trust in those who claim to use it to conduct their studies, at least until the point where sheer volume makes distrust untenable, save for mass conceptual error in understanding the data.

The problem is that trust isn't always warranted: we are likely to accept information without further examination because it confirms our own biases, and good-faith efforts are sometimes simply in error, so we can give our trust too quickly to ideas later shown to be inaccurate. More difficult still is the trust we place in others regarding things that aren't so readily subject to scientific method for verification. We trust self reportage and general concepts that seem reasonable and that support values we hold or would like to believe. Some of these issues are not verifiable in any concrete sense; they are social agreements about the way we would like to see society function. But we often hold them as truths in a more definitive sense, which can make it hard to expand those ideas in the face of opposition from those who do not share the same value base or sets of relationships.
posted by gusottertrout at 8:49 PM on March 3, 2017 [7 favorites]


Why ~~We~~ They Believe Obvious Untruths
posted by Segundus at 8:50 PM on March 3, 2017 [2 favorites]


A problem with listening to too many podcasts is that this entire article reads like a poorly summarised precis of an interview I listened to recently.

Another problem with listening to too many podcasts is that for the life of me I can't remember which one it was, where I heard all of this (right down to the examples), but argued much better by the academic who actually put the original argument together.

TL;DR: if you want to write for the NYT, maybe on a lazy week just rehash something you listened to on your commute, but badly.
posted by UbuRoivas at 8:55 PM on March 3, 2017 [3 favorites]


It's a fine observation, one I make occasionally myself, so I'd be hypocritical to completely trash it.

And what's the actual point with their framing around the partisan debate? Are they arguing both sides are equally wrong on scientific facts? Clearly not. But then it still seems perfectly justified to get really mad at a side that seems indifferent to truth and does a far worse job than I do at self-correcting their world view.

I mean, if you have these heuristics involving what to believe, and they lead you to make the right decision about polluting waterways, that's kind of super relevant, right? And if the other guys reach the wrong decision and have the power to carry it out it's an issue. I'm in the same situation as before. Just instead of saying "they are stupid and don't know anything" I say "they are stupid and don't have any good techniques for figuring out what's true or false."

Could've been a straight science article I suppose but not sure how interesting that would be either.
posted by mark k at 9:31 PM on March 3, 2017


It's a fine observation, one I make occasionally myself, so I'd be hypocritical to completely trash it.

And what's the actual point with their framing around the partisan debate? Are they arguing both sides are equally wrong on scientific facts? Clearly not. But then it still seems perfectly justified to get really mad at a side that seems indifferent to truth and does a far worse job than I do at self-correcting their world view.


Yeah, I agree with this. The only value of the essay seems to be in laying out an explanation of where disagreements can come from in somewhat equitable terms, and perhaps shedding some light on the notion for anyone who hadn't looked at it, but beyond that it doesn't add much of anything in terms of what to do about the situation.
posted by gusottertrout at 9:51 PM on March 3, 2017


More difficult still is the trust we place in others regarding things that aren't so readily subject to scientific method for verification. We trust self reportage and general concepts that seem reasonable and that support values we hold or would like to believe.

Well, that's not a problem with how we approach knowing and understanding but a requirement, because people's interior lives and subjective experience are consequential and meaningful facets of reality and have real predictive and explanatory power.

Einstein once made the following remarks on the epistemological limits of scientific observation, but I think the quote is just as relevant to thinking about meaning and human affairs more generally:

Physical concepts are free creations of the human mind, and are not, however it may seem, uniquely determined by the external world. In our endeavour to understand reality we are somewhat like a man trying to understand the mechanism of a closed watch. He sees the face and the moving hands, even hears its ticking, but he has no way of opening the case. If he is ingenious he may form some picture of a mechanism which could be responsible for all the things he observes, but he may never be quite sure his picture is the only one which could explain his observations. He will never be able to compare his picture with the real mechanism and he cannot even imagine the possibility or the meaning of such a comparison. But he certainly believes that, as his knowledge increases, his picture of reality will become simpler and simpler and will explain a wider and wider range of his sensuous impressions. He may also believe in the existence of the ideal limit of knowledge and that it is approached by the human mind. He may call this ideal limit the objective truth.

Our observations of reality, our understanding of the meaning of events, depend on an awareness and acknowledgement of the reality of subjective experience, beliefs, intentions, etc. Meaning is partly social and depends on some baseline assumptions of cultural and social values. You can't really understand the meaning of others' behaviors without having at least some working understanding of their persistent beliefs and intentions over time. The less intimately we know each other on a personal level, the less able we are to trust and get along, and the less common a basis we have for reasoning through problems and reaching consensus.

It's tempting to want to take a hard behaviorist line and dismiss all self-reporting as invalid and misleading because people's beliefs and attitudes can be so idiosyncratic and bizarre and unreliable and self-rationalizing. But people are not philosophical zombies. There's ample evidence, as Hofstadter and Dennett and others have shown, that consciousness as a macro level phenomenon can exert downward causal influence on the lower level organizational structures of the brain in a strange loop effect.

Imagine trying to construct an accurate summary of a scene shown in a random still image on a computer display without any other contextual or social cues about the image. Let's say it looks like a fairly typical American family outdoor cookout but with one potentially disturbing detail: a small child looking on with an expression of terror as an older boy licks his chops and seems to be preparing to carve into a crying baby with an electric carving knife.

Is it a joke about how "sweet" the family's newest addition is? A future Florida Man headline in the making? A glimpse into the daily routines of a terrifying secret baby eating cult?

It's not hard to imagine any of a virtually infinite number of interpretations that, in the absence of any definitive social or cultural knowledge or assumptions, might provide one or more useful and factually accurate narratives consistent with the evidence the digital image seems to show. Suppose that, after carefully considering the statistical probabilities of likely human behaviors and performing other kinds of independent, scientifically informed analysis, including determining the likely location of the photo from environmental clues and other objectively factual features of the scene, you still make no reference to the subjective dimensions of the scene: the individual people and their inner lives and thoughts and feelings. Maybe, over time, you manage to develop a really elaborate, nuanced story about what's really being shown in the picture, built on indirect evidence and reasonable, statistically supported assumptions drawn from the people's apparent demographic profiles.

Before long, you might start to feel pretty confident that your interpretation of the picture's meaning is close to the true mark, or in any case at least a reasonable interpretation that isn't so far off as to be uselessly disconnected from reality or seriously misleading.

But suppose what you and all the other test subjects in this experiment don't know is that this particular digital image, photorealistic and seemingly meaningful as it appears, is completely fake, a randomly generated digital image created using a computer program that analyzes billions of real photos and recombines and recomposes them into photorealistic mashups.
posted by saulgoodman at 4:03 AM on March 4, 2017 [4 favorites]


The Stone is a much better NYTimes column on this sort of topic, Gray Matter can be hit or miss. These professors took some otherwise interesting empirical results about the concept "knowledge is maintained through sociocognitive extension" and turned it into absurd ideology:

"A better understanding of how little is actually inside our own heads would serve us well."

This is just analytically bad. We don't need more technocracy.
posted by polymodus at 4:14 AM on March 4, 2017


If it helps, it could be pointed out that this article doesn't have comments (The Stone articles do, and lots of good comments), because it is an op-ed. One can get a sense of NYTimes op-eds by reading many of them over time. This op-ed has as one of its authors a marketing professor in a school of business, so I'll leave the obvious critique as an exercise.
posted by polymodus at 4:20 AM on March 4, 2017


"Plenty of liberals believe that G.M.O.s are poisonous" might be a good example of something easily believed by NYT readers and writers that's probably wrong for most definitions of "plenty". Haven't most of the anti-GMO people moved on to other arguments since the 1980's? Surely there are some better candidates in that area for widespread beliefs that aren't backed up with facts and reasoning.

My personal opinions on GMOs are not really well-formed: They may be the only way to feed the world, they may promote the already-present tendency to large-scale monocultures dependent on chemicals with poorly-understood environmental effects, they can be of use in adapting to global warming, they might exacerbate some counterproductive tendencies of a hypercapitalist agricultural system, they are perfectly safe to eat, and there is very little risk they're going to come up with a ferocious three-legged, four-winged chicken that escapes the lab and leads to the extinction of housecats.
posted by sfenders at 4:40 AM on March 4, 2017


there is very little risk they're going to come up with a ferocious three-legged, four-winged chicken that escapes the lab and leads to the extinction of housecats.

Have you considered a world without housecats?

Because the terrorists have.
posted by saysthis at 5:39 AM on March 4, 2017


ferocious three-legged, four-winged chicken that escapes the lab

I do hope that someone with connections to the SyFy Animal Planet network reads this forum!
posted by sammyo at 5:56 AM on March 4, 2017


The truth is never objective, as it never exists on its own. It exists in people's minds and becomes vocal through their thoughts, expressions and actions. Therefore, the truth is always subjective, as different people have different views, opinions and perceptions of the world. Some facts, on the other hand, do exist on their own: not in written sentences or spoken words, but in events and phenomena.
posted by HelenShepl at 9:56 AM on March 4, 2017


In West Virginia, they used to dump mine tailings, i.e., waste, down the side of the mountains, where it would, what with gravity and all, wind up in streams. Over time, it would create a dam, a crappy, non-engineered dam. Water builds up behind a crappy dam, and the dam gives way. The resulting flood is likely to kill people downstream. So, I know a little bit about it, and the environmental regulations are actually really great. Unless you prioritize profits over people, which mining companies have a solid history of doing.

I can't keep up with the Trump threads on MeFi, let alone make informed judgements about every regulation. So I vote for people who seem to have the needs of people at heart, and who seem to be decent, ethical, common-sense, non-assholes. Ergo, I didn't vote for Trump or any of the GOP fuckers who are cheerfully intent on pillaging the US while they have the chance.

All I can say in every one of these threads is WTF? We are just so, so screwed.
posted by theora55 at 12:29 PM on March 4, 2017 [1 favorite]


The truth is obvious if you bother to look for it, right?

My philosophy prof called that 'naive realism'.

Au contraire, mon ami. 'Truth', like bubble tea, comes in many flavors. Any 'truth' worth looking for is usually hard. Try convincing new physics students that waves aren't 'water travelling across the surface'. But oh, it's so obvious.
posted by Twang at 2:47 PM on March 4, 2017 [2 favorites]


One of the things you can't help noticing as an analyst is that most "simple ideas" really aren't all that simple once you deep dive on them. When people simplify, they just let a lot of details and complications they aren't interested in seeing go unnoticed. Even the straightest-looking lines in the world start to look crooked and more elaborately structured the closer you look. That's why we have to learn to communicate very precisely and clearly, while being less inclined to ignore details and dismiss other points of view.

Personal narratives matter in the bigger scheme of things. When people report publicly on their narratives and expose them to scrutiny, it allows us to gauge how reliable their inner narratives are. For public figures and leaders, that's important to know.

It's true a lot of genuinely nasty manipulators use personal narrative--often wildly distorted and embellished accounts of their own personal history--to sell themselves and their political snake oil. But that's not because storytelling is a mark of a manipulator, per se, but because it's human nature and perfectly ordinary social habit to be persuaded by emotionally appealing personal stories.

An analogy might be that the narrative manipulators have learned to hack the routine function of certain channels along which normal, healthy human social connections form.

Undoing that requires a very careful, measured approach that recognizes we can't just dispense with telling and interrogating each other's stories completely to protect against the unreliable narrators and narratives. We need to know the stories of the people we depend on and relate to, we need to know them and their characters as individuals to understand their motives and evaluate their narratives. Otherwise we're just choosing to be led by and relate to strangers whose motives and personal agendas we never even make an attempt to understand.

Maybe we're all sick of and skeptical of personal stories by this point, but that doesn't mean we've got any better medium for forming the kinds of deeper personal connections and bonds of reciprocal care and support that it takes to make a society durable and stable enough to sustain.
posted by saulgoodman at 7:30 AM on March 5, 2017 [1 favorite]



