Facts are Getting the Best of them.
July 6, 2020 1:35 PM

Why Facts Don't Change Our Minds. "Sloman and Fernbach see in this result a little candle for a dark world. If we—or our friends or the pundits on CNN—spent less time pontificating and more trying to work through the implications of policy proposals, we’d realize how clueless we are and moderate our views. This, they write, “may be the only form of thinking that will shatter the illusion of explanatory depth and change people’s attitudes.”"

Another article with the same title has a different focus.

" If you divide this spectrum into 10 units and you find yourself at Position 7, then there is little sense in trying to convince someone at Position 1. The gap is too wide. When you're at Position 7, your time is better spent connecting with people who are at Positions 6 and 8, gradually pulling them in your direction."

---

"I have already pointed out that people repeat ideas to signal they are part of the same social group. But here's a crucial point most people miss: People also repeat bad ideas when they complain about them. Before you can criticize an idea, you have to reference that idea. You end up repeating the ideas you’re hoping people will forget—but, of course, people can’t forget them because you keep talking about them. The more you repeat a bad idea, the more likely people are to believe it. "
posted by storybored (62 comments total) 36 users marked this as a favorite
 
> “Once formed,” the researchers observed dryly, “impressions are remarkably perseverant.”

The thing I really don't understand about this phenomenon is that *once formed* a view cannot be changed, but the original formation of the view itself? It can just waft in on the softest of breezes and latch itself to your brain forever. A lot of people are apparently so functionally incapable of processing information, new or otherwise, that it should be impossible for them to form any views in the first place.
posted by The Card Cheat at 1:55 PM on July 6, 2020 [8 favorites]


Be kind first, be right later.
posted by meinvt at 1:58 PM on July 6, 2020 [21 favorites]


That's why they're desperate for authoritarians to just tell them what to do.
posted by seanmpuckett at 1:59 PM on July 6, 2020 [10 favorites]


I've seen several articles in this vein now, but the one thing missing from each seems to be the connection between knowledge and identity. Attachment to ideas is reinforced by community, but communities are selected based on the acceptance of certain identities.

I don't believe that an identity based on, say, an economic philosophy is as important as one based on, say, gender, but I do think it would be useful to consider how traumatic it is when we try to deny and extinguish identities. I think it's why we get so emotional when our beliefs are attacked.

I know that there are a lot of ideas that I'm basically stuck on -- that I identify with -- but it really changes the way you interrogate your own worldview when you say, "why do I identify with this idea?" rather than, "why do I believe this?" I think maybe the reason we don't do it more often is because everyone is implicated.
posted by klanawa at 2:13 PM on July 6, 2020 [18 favorites]


Very possibly an evolutionarily developed trait; it should not take too many repetitions or a 'proof' for the idea of big tiger --> run fast to become an ingrained meme.

Is there a good reference on how to change minds? There's a great scene in the new Star Trek movie where Pike is warping to their destruction; young Kirk, totally disregarded at that point, explains a couple of facts, and Spock and the rest agree, change their minds, and the ship is saved.

In the real world that is much harder without a dramatic cut that shows the truth. Perhaps some templates or mini-scripts would be helpful for us non-super-rhetoricians to work individual discussions away from fake memes.
posted by sammyo at 2:20 PM on July 6, 2020 [1 favorite]


This is a fascinating and excellent article all the way until about 2/3rds in, when my head exploded:
“As a rule, strong feelings about issues do not emerge from deep understanding,” Sloman and Fernbach write.
This is one of the conclusions they draw from a study where these researchers asked US Americans for their opinion on what the USA should do about Russia annexing Crimea, and found a correlation between the test subjects' support for military intervention and their inability to locate the Ukraine on a map.

It's a breathtaking statement when you think about it. Their conclusion is (approximately) THE OPPOSITE of what their study showed (viz., "as a rule, blithe, cavalier, thoughtless support for deadly violence against others does not emerge from a deep understanding of these others").

Perhaps they meant to say: "As a rule, strong feelings about the issues do not necessarily emerge from deep understanding". But even this modified, technically correct statement is a dangerous one unless they also note that strong feelings which DO emerge from deep understanding are the only credible sources of understanding for the rest of us.

I don't think it's an accident that these researchers made an egregious error in their wording (to take the most generous view) and their error just so happens to reinforce the male-supremacist and white-supremacist dogma that unconnected "objective" "rational" third parties are the best judges of everything.
posted by MiraK at 2:21 PM on July 6, 2020 [13 favorites]


as a wise man once said: "facts are meaningless. you can use facts to prove anything that's even remotely true."
posted by entropicamericana at 2:39 PM on July 6, 2020 [12 favorites]


"As a rule, strong feelings about issues do not emerge from deep understanding"

can be accurately restated as

"As a rule, deep understanding does not generate strong feelings"

Which I hope makes the problem with their statement a little more obvious. They obviously did misspeak! That statement is ridiculous.

The Tolstoy quote is more like, "As a rule, strong feelings about issues emerge from falsely believing oneself to possess a deep understanding," which is not implied by either of the above. The researchers are telling us where strong feelings as a rule DON'T emerge from ("deep knowledge"); they make no claims at all about where strong feelings DO emerge from, as the Tolstoy quote does (falsely believing you have deep knowledge). And I mean, Tolstoy is a white dude also. He too is encouraging us to think that people who have strong feelings are necessarily deluded, never considering that hullo, I have strong feelings because it affects me and it's a matter of life and death for me and I've lived all my life experiencing this and I know it?!
posted by MiraK at 2:41 PM on July 6, 2020 [9 favorites]


Assuming the researchers are correct, how do they plan to convince everyone else?
posted by Faint of Butt at 2:48 PM on July 6, 2020 [10 favorites]




Politically speaking, part of where strong feelings emerge from is the sports-team mentality that modern politics have evolved into.

If there's a gritty, claw-and-scratch kind of player on another hockey team, let us say, someone who regularly bends the rules, drops cheap shots in at every opportunity and revels in the boos from the crowd... he's a scumbag. He's dirty. He should be thrown out of the league. He's what's wrong with sports today. Oh, wait, he just got traded to OUR team? Huh. You know... there's something admirable about him. He always gives 110%. Look at what he just did to their center! hah! He's gutsy. He's our kind of guy.

Which helps explain how many a politician morphs from public enemy number one to Our Hero with very little provocation, and often right back with the next vote. It's all about what you have done for me lately.
posted by delfin at 2:53 PM on July 6, 2020 [5 favorites]


The sports team mentality is not the only possible source of strong feelings. In fact, genuine deep knowledge of a subject almost invariably gives people strong feelings about it. It's only white- and male-supremacists who think strong feelings are always irrational!

To put it another way, does anyone here think that
Crimeans themselves have zero strong feelings either way about whether or not the USA should launch a military intervention in Crimea?

"As a rule, the more deeply to know Ukraine and Crimea, the more detached and apathetic you will feel about this subject"? LOL
posted by MiraK at 3:00 PM on July 6, 2020 [12 favorites]


accurately restated as

"As a rule, deep understanding does not generate strong feelings"


Not a terribly good rule. For example, the more I come to understand about the ways in which the brutal truths about the colonization of this country have been and for the most part continue to be whitewashed in the educational systems of the colonizers, the more disgust and dismay I feel about smug useless parasitic fuckwits who promote the lie that colonization and civilization mean the same thing.

Holding deep understandings that appear to be deliberately avoided by the majority of one's compatriots is a painful and disturbing experience.
posted by flabdablet at 3:06 PM on July 6, 2020 [9 favorites]


flabdablet, when I said "accurately restated", I meant the restatement was logically equivalent to the original, not that my restatement corrected the wrong ideas in the original!
posted by MiraK at 3:12 PM on July 6, 2020


i'm sensing a little heated agreement here
posted by flabdablet at 3:16 PM on July 6, 2020 [7 favorites]


"As a rule, strong feelings about issues do not emerge from deep understanding,” Sloman and Fernbach write.

Their conclusion is (approximately) THE OPPOSITE of what their study showed (viz., "as a rule, blithe, cavalier, thoughtless support for deadly violence against others does not emerge from a deep understanding of these others").

"As a rule, the more deeply to know Ukraine and Crimea, the more detached and apathetic you will feel about this subject"? LOL


The quote is not in any way specifically about war or about studies they performed; in fact, if it's about any specific policy, it's about food labelling. The previous paragraph has one sentence about the Crimea example and is mostly concerned with a study that showed that 80% of people approved of genetically engineered food being labelled, but that the same 80% of people approved of food "containing DNA" being labelled (which is of course all plant or animal matter). The paragraph before that discusses a survey around a Supreme Court ruling, where 76% of the people had an opinion of the ruling, and only 55% actually correctly identified what the ruling was (and it was a binary choice - to overturn or uphold a law).
How seriously should we take the vote to label genetically modified foods if it comes from the same people who believe we should label all foods that contain DNA? It does seem to reduce their credibility. Apparently, the fact that a strong majority of people has some preference does not mean that their opinion is informed. As a rule, strong feelings about issues do not emerge from deep understanding. They often emerge in the absence of understanding or, as the great philosopher and political activist Bertrand Russell said, “The opinions that are held with passion are always those for which no good ground exists.” Clint Eastwood was more blunt: “Extremism is so easy. You’ve got your position, and that’s it. It doesn’t take much thought.”
posted by Homeboy Trouble at 3:16 PM on July 6, 2020 [3 favorites]


> The quote is not in any way specifically about war or about studies they performed

You are mistaken, Homeboy.

FT(New Yorker)A:
Where it gets us into trouble, according to Sloman and Fernbach, is in the political domain. It’s one thing for me to flush a toilet without knowing how it operates, and another for me to favor (or oppose) an immigration ban without knowing what I’m talking about. Sloman and Fernbach cite a survey conducted in 2014, not long after Russia annexed the Ukrainian territory of Crimea. Respondents were asked how they thought the U.S. should react, and also whether they could identify Ukraine on a map. The farther off base they were about the geography, the more likely they were to favor military intervention. (Respondents were so unsure of Ukraine’s location that the median guess was wrong by eighteen hundred miles, roughly the distance from Kiev to Madrid.)

Surveys on many other issues have yielded similarly dismaying results. “As a rule, strong feelings about issues do not emerge from deep understanding,” Sloman and Fernbach write.
posted by MiraK at 3:19 PM on July 6, 2020 [1 favorite]


Well, far be it from me to disagree with great philosopher and political activist Bertrand Russell, or noted academic Clint Eastwood.

Except, of course, when they're talking complete bollocks.
posted by flabdablet at 3:20 PM on July 6, 2020 [1 favorite]


As someone who does research in cognitive psychology, I have to say I find the manner in which many researchers report the implications of these findings deeply frustrating. On the one hand, it's unambiguous that a lot of cognition is highly motivated; that is, we are predisposed to favor certain kinds of evidence and argument if it conforms to our present understanding or our current objectives. However, from this, the authors conclude that "reason" is, effectively, nonexistent, and that only sentiment exists. It's a curious thing to try to argue quantitatively, because it's self-defeating and, if pushed to its extremes, morphs into a kind of cognitive relativism.

One of the motivations behind this kind of excessive generalization is, of course, to sensationalize the results in order to provoke a reaction from their peers, the press, and the public. But a more subtle component of the "evidence for irrationality" is that studies demonstrating it very often involve either low-stakes problems, low-stakes ideas, or both. If I'm participating in a study (and especially if I'm a college student doing it for credit), chances are good that I'm not terribly invested in my responses. There is not, so to speak, going to be a test later, and my responses are even freer of accountability than my ordinary day-to-day social interactions (because the experimenter is, as a rule, a stranger with whom I do not expect to have regular future interactions). If what these studies are measuring is "how people perform when their behavior is kind of half-assed and has no grounding," it shouldn't be a surprise that people often give answers that are somewhat arbitrary, which in aggregate will produce a mean that appears irrational. If you hand someone a book of puzzles and tell them to kill time, they'll approach the problem differently than if you hand them the same book and tell them it will determine whether they can graduate.

Another issue academics of this stripe routinely fail to consider is that things like the logical robustness and internal consistency of a set of ideas can have both moral and aesthetic dimensions. People are powerfully motivated by taste and morality, and many academics have cultivated an identity that includes the beliefs that "arguments that are consistent are more likely to be true" and "axioms that conflict with empirical evidence should not be relied upon." These are values that can be taught and learned, and they can establish powerful social norms. Importantly, although all norms are subjective, rigorous norms about logic and proof can give rise to objective results, as demonstrated nowhere more remarkably than in mathematics. Regardless of the aesthetic and intuitive path a mathematician may take in building their proof, a solid proof remains true independent of anything beyond its axioms. This sort of careful, deliberate thinking is most likely not how humans behaved as nomadic hunters, and is not how the vast majority of humans alive operate either. It is a highly bizarre cognitive approach, but it is a tightrope onto which a person can choose to climb and can work hard to stay atop.

The demonstration that "ordinary people" (or, again, more likely, college students) don't reliably apply such criteria to their own reasoning may have been surprising in part because academics often confuse their own norms for humanity's essential character. Americans, especially, seem to me to be predisposed in favor of arguments about "talent" and suspicious of arguments about "skill." Reason, it seems to me, is a skill, and it's quite unnatural (in that we fool ourselves all the time and have to work constantly to undo our own mistakes), but it is possible to employ, albeit imperfectly. We are, after all, just apes. Seen from this perspective, studies like those described in the article stop seeming like mind-blowing revelations and become much more pedestrian. They shift the focus from "Can reason be counted on?" (answer: certainly not in people who never learned the skill) to "Can truth and consistency be encouraged as societal values?" (answer: possibly, but not without putting energy toward doing so).

As an aside, I'm very aware that both academics and the broader swath of people who fetishize "reason" are often guilty of blind spots, bad argumentation, and overt prejudice. Just because someone values reason does not mean they are very skillful at it, and no amount of skill manipulating information is going to keep someone working with bad facts and bad axioms from drawing erroneous conclusions. Nevertheless, I think there's a lot of harm in conflating "default human cognition" with "reason" because it (to most people) appears to undermine the possibility of approaching truth, rather than illuminating how much distance most of us have to cover.
posted by belarius at 3:21 PM on July 6, 2020 [57 favorites]


It's actually a very important survival skill to not change your beliefs/actions just because someone appears to be smarter, is a better talker than you, and seems to be using facts to make a convincing argument. No matter how clever you think you are, there's almost always a con-artist who is slicker and more convincing than you are.

People with less education probably develop this skill more quickly and hold on to it with more certainty, but people with lots of education need it as well.

I have a family member who is always sending me these articles by some doctor or some scientist making a contrarian case about nutrition or COVID or whatever. And my response is always "It's pointless for me to read this. As someone who has more education in this field than you do, what I know is that I'm unqualified to figure out whether this person's argument is valid or not."

If people did regularly change their opinions because you present "the facts" and make a good case, that would be a bad and dangerous thing.
posted by straight at 3:28 PM on July 6, 2020 [15 favorites]


I don't believe this article and nothing is going to change my mind.
posted by PhineasGage at 3:30 PM on July 6, 2020 [3 favorites]


One way to look at science is as a system that corrects for people’s natural inclinations. In a well-run laboratory, there’s no room for myside bias; the results have to be reproducible in other laboratories, by researchers who have no motive to confirm them.

I expect this draws gales of academic titters; there's so much discussion of poor-quality papers, sketchy p-values, and irreproducibility resulting from the actual goal of "science" being to secure the largest and most grants, tenure, lab size, and status. Which very much follows from the social nature of the article's thesis.
posted by sammyo at 3:32 PM on July 6, 2020 [4 favorites]


flabdablet: "i'm sensing a little heated agreement here"

Yeah, you two should just agree to agree on this one.
posted by chavenet at 3:34 PM on July 6, 2020 [6 favorites]


(I just wanted to note your title put an impression of Talking Heads in my brain, and I appreciate that because I have strong feelings for their music.)
posted by rewil at 3:34 PM on July 6, 2020 [2 favorites]


I've been thinking about this stuff lately - not with much success, but rolling it around in my mind - and I found this quote from David Graeber's "Utopia of Rules" interesting:
The term “rationality” is an excellent case in point here. A “rational” person is someone who is able to make basic logical connections and assess reality in a non-delusional fashion. In other words, someone who isn’t crazy. Anyone who claims to base their politics on rationality—and this is true on the left as well as on the right—is claiming that anyone who disagrees with them might as well be insane, which is about as arrogant a position as one could possibly take.
I've also been browsing the anti-Rationalist-with-a-capital-R SneerClub Reddit forum recently, and someone made this point, which I also found interesting:
You can't really talk about rationality without talking about values, but they don't talk about them because most values sit on an ultimately irrational foundation. So they stumble around optimizing rationality with respect to their shared unexamined values.
Since I read that, I've been wondering whether the extremely narrow demography of capital-R-Rationalism (95% white male nerds) has something to do with that. If your prejudices and fears end up becoming your values because you can't admit to yourself that there's anything irrational about your values, you'll end up thinking (without realizing it) that only people who share your prejudices and fears are rational thinkers. You end up forming a group in the usual identity-based way while thinking that you're not.

These are not organized thoughts for me yet, though, and I look forward to some of y'all's thoughts sinking into my brain.
posted by clawsoon at 3:38 PM on July 6, 2020 [6 favorites]


look it's really easy to figure out what motivates people it's basic materialist theory see first you have to observe that our understanding of the world is contingent and constructed in relation to the dominant mode of production and that under our current modality there are two circuits through which people relate to the commodity system first there is the c-m-c circuit in which most of us live then there is the m-c-m' circuit pronounced em-cee-em prime which
posted by Reclusive Novelist Thomas Pynchon at 3:42 PM on July 6, 2020 [3 favorites]


It's not that facts don't change our minds, it's that sometimes we have various reasons for distrusting what is called a "fact".
posted by Liquidwolf at 3:50 PM on July 6, 2020 [4 favorites]


delfin: He's what's wrong with sports today. Oh, wait, he just got traded to OUR team? Huh. You know... there's something admirable about him.

This effect should be named after Dennis Rodman being traded to the Chicago Bulls.
posted by clawsoon at 3:56 PM on July 6, 2020


You are mistaken, Homeboy.
FT(New Yorker)A:


Yes, I've read the New Yorker article, and from the passage you quote it's a reasonable conclusion that the authors are making some sort of claim entirely about a study (theirs or someone else's - in fact, here's the cite from the book in the WaPo) concerning military intervention in the Ukraine.

Which is why I then found the book, found the section, and summarized and quoted it above. The book itself only uses the word 'Ukraine' twice and 'Crimea' never; Kolbert did additional research into the study and added it to her New Yorker article. I'm hesitant to post too much because of copyright law and wall-of-text, but the two 'Ukraine's in further, more explicit context are: [note: the previous two paragraphs, including the Supreme Court survey I mentioned above, were about the ACA, public opinion and public understanding thereof]
The Affordable Care Act is just one example of a much broader problem. Public opinion is more extreme than people’s understanding justifies. Americans who most strongly supported military intervention in the Ukraine in 2014 were the ones least able to identify the Ukraine’s location on a map. Here’s another example: A survey out of Oklahoma State University’s Department of Agricultural Economics asked consumers whether the labeling of foods produced with genetic engineering should be mandatory. Some 80 percent of respondents thought it should. This seems like an excellent rationale to support such a law. The people deserve the information that they want and they have a right to it. But 80 percent also approved of a law stating that there should be mandatory labels on foods containing DNA. They believe that people have the right to know if their food has DNA. If you are scratching your head right now, note that most foods have DNA, just like all living things. According to the survey respondents, all meats, vegetables, and grains should be labeled “BEWARE: HAS DNA.” But we would all die if we avoided foods that contain DNA.
The next paragraph, which I posted above, contains the sentence. The next quotes Socrates at length, from Plato's Apology.

I suppose we can each make up our own mind about whether the best determination of the intent of the authors of a book is from a single sentence quoted in a New Yorker article partially about that book or whether it's from the actual text of the book itself.

But it's not lost on me that we're having this discussion in the context of strong feelings, facts and people changing their minds.
posted by Homeboy Trouble at 4:07 PM on July 6, 2020 [11 favorites]


When people have no skin in the game it does seem to me that this holds true. I've had some very useless discussions with other Americans about, like, foreign elections, where nobody knew anything but everybody had a strong opinion. Someone from the country in question with a strong opinion would have been worth listening to, but I honestly think most of them would be less confident in their views than we were.

I think it is an easy tendency to slip into and one worth resisting.

I have no idea how to evaluate the quality of the actual studies, though. Unfortunately my default position right now is to assume any study in psychology with a simple conclusion that I already agree with could be p-hacked or otherwise be bogus unless I know otherwise. Looking forward to the near future as research standards continue to improve and a level of trust can be restored.
posted by vogon_poet at 4:08 PM on July 6, 2020 [1 favorite]


And I do see on rereading where in my initial post I didn't make explicit that I was quoting the book directly; I thought that it should have been clear from context, but I apologize if I miscommunicated.
posted by Homeboy Trouble at 4:11 PM on July 6, 2020 [2 favorites]


Convincing someone to change their mind is really the process of convincing them to change their tribe. If they abandon their beliefs, they run the risk of losing social ties. You can’t expect someone to change their mind if you take away their community too. You have to give them somewhere to go. Nobody wants their worldview torn apart if loneliness is the outcome.
I've read very similar things about how to effectively convert people to Christianity and start new churches, so the idea has at least some support from the practical experience of people who are very interested in finding ways to change people's minds.

One side effect is that it's easiest to change the worldview of lonely people. Befriend them, invite them to dinner and Bible study for a few months, and before you know it they're believing that somebody rose from the dead. The problem with that for a church builder is that you end up with a church filled with low-status people. That makes it harder to grow your church.
posted by clawsoon at 4:13 PM on July 6, 2020 [3 favorites]


How does this study (which I seem to remember was carried out by a couple of magicians) play into all of this?
American politics is becoming increasingly polarized, which biases decision-making and reduces open-minded debate. In two experiments, we demonstrate that despite this polarization, a simple manipulation can make people express and endorse less polarized views about competing political candidates. In Study 1, we approached 136 participants at the first 2016 presidential debate and on the streets of New York City. Participants completed a survey evaluating Hillary Clinton and Donald Trump on various personality traits; 72% gave responses favoring a single candidate. We then covertly manipulated their surveys so that the majority of their responses became moderate instead. Participants only noticed and corrected a few of these manipulations.

When asked to explain their responses, 94% accepted the manipulated responses as their own and rationalized this neutral position accordingly, even though they reported more polarized views moments earlier.
I remember hearing the researchers being interviewed on the radio. It was something like: Someone might mark Clinton with a 0 for foreign policy on the survey. The researchers would pretend to look at the response, then show the interviewee a survey on which Clinton had been given a 6 for foreign policy. They'd ask, "Why did you give Clinton such a strong foreign policy score here?"

...and the interviewees would completely forget that they'd just given Clinton a score of 0. They'd give a completely rational explanation as to why they believed Clinton was okay at foreign policy. Same thing, of course, for similar questions about Trump.
posted by clawsoon at 4:21 PM on July 6, 2020 [5 favorites]


As people invented new tools for new ways of living, they simultaneously created new realms of ignorance; if everyone had insisted on, say, mastering the principles of metalworking before picking up a knife, the Bronze Age wouldn’t have amounted to much. When it comes to new technologies, incomplete understanding is empowering.

Where it gets us into trouble, according to Sloman and Fernbach, is in the political domain. It’s one thing for me to flush a toilet without knowing how it operates, and another for me to favor (or oppose) an immigration ban without knowing what I’m talking about.
I don't see much of a difference between the two examples. All of us base our opinions about immigration on the opinions of (and facts brought forward by) people we trust. If you really wanted to say you had a 100% full understanding of immigration, you'd surely have to talk to all immigrants, all the people they knew in the places they came from, and all the people they got to know in the place they came to. You'd have to follow all the immigrants and all the people affected by them for years. That's obviously a ridiculously high bar, one that no human is going to achieve. Instead we use some heuristics to cobble together information from people we trust into an opinion.
posted by clawsoon at 4:35 PM on July 6, 2020 [1 favorite]


in my initial post I didn't make explicit that I was quoting the book directly

Yep, that was the source of my misunderstanding, I didn't realize you were looking at the original source. Thank you for informing me!
posted by MiraK at 5:08 PM on July 6, 2020 [3 favorites]


I do want to note: even stripped of this made-up reference to Crimea and contextualized to whatever else from the original book, that particular statement remains ridiculous and my comments here still apply to it unchanged.

As I suspected, it's just accidental bad wording - they clearly didn't mean what they literally wrote - but as I said, a mistake like this one isn't meaningless. It got published and then singled out for a New Yorker quote in spite of being utterly ludicrous only because it aligns so well with the white- and male-supremacist popular thought that we've all been conditioned to accept as normal. The opposite mistake would never have been made.
posted by MiraK at 5:28 PM on July 6, 2020 [2 favorites]


rewil: (I just wanted to note your title put an impression of Talking Heads in my brain, and I appreciate that because I have strong feelings for their music.)

"Facts are simple and facts are straight
Facts are lazy and facts are late
Facts all come with points of view
Facts don't do what I want them to
Facts just twist the truth around
Facts are living turned inside out"
posted by storybored at 5:54 PM on July 6, 2020 [3 favorites]


So, trying not to be defensive and not wanting to start a derail, I would modestly suggest, with what little privileged awareness I have managed to gain from (3rd?) eye-opening discussions here over the last several years, that "white- and male-supremacist popular thought" may also be something of an example of the current discussion.

(the struggle for understanding is really really hard sometimes always, don't drink the blue dye)
posted by sammyo at 7:07 PM on July 6, 2020


Honestly the first story I hear explaining some complicated thing becomes the one I defend until I realize what I am doing. I suspect that this is true for most everyone; I also suspect that it is very hard to realize what you are doing and let go.
posted by Pembquist at 7:30 PM on July 6, 2020 [10 favorites]


Pembquist: Honestly the first story I hear explaining some complicated thing becomes the one I defend until I realize what I am doing.

"Beware the man of one book", especially if that man is myself.
posted by clawsoon at 7:56 PM on July 6, 2020 [3 favorites]


sammyo, are you asking me to clarify what that phrase meant? If so -

The idea that strong feelings are always the product of ignorance or irrationality is part and parcel of white- and male-supremacist popular thought. It is the cornerstone of male supremacy: that women are overly emotional, hysterical, and irrational is the justification men use to hoard power over women to this very day. Similarly, the notion that only white (male) brains are capable of cool reason and brilliant science, and therefore must tame the uncivilized, animal-like darker peoples who are incapable of reason or science, has always justified white supremacists' wars, genocide, colonialism, slavery, etc.

The belief is also very popular among intellectuals (your assorted Leo Tolstoys, Clint Eastwoods, Richard Dawkinses, Bertrand Russells, Ricky Gervaises, Steven Pinkers, et cetera) who are devoted to rationality, reason, science, logic, etc. They abhor emotionality, which they equate with irrationality. Their standards of civil discourse demand that you (well, I) should maintain cool detachment and speak in impersonal terms, even if the "discussion" is actually me pleading for my life and a white man saying he should kill me for my own good. (This is 100% real, you know that, right? See: BLM.)

White male intellectuals impose these standards of reasonable discourse on everyone because they're safe in the knowledge that their life will never be in my hands the way mine is in theirs - so they find it easy to be detached, impersonal, ergo "rational and objective". But I will likely have some strong feelings as I argue - and BAM! He's got me! "As a rule, strong feelings about the issues do not emerge from deep knowledge." Having strong feelings makes me wrong by definition. I'm out of the debate.

Only cool, detached, rational, and objective white males now remain. Some of them believe I should be killed for my own good and others may disagree, but come now, old chap, one respects one's opponents, civilized disagreement is the mark of high culture, pass the cigars, where's the brandy, what's the news from our war overseas, and did you hear the one about the #%$&, the @#^$, and the %& who walk into a bar?

.... And that's how the claim that strong feelings always come from ignorance or irrationality maintains white supremacy and male supremacy.
posted by MiraK at 8:14 PM on July 6, 2020 [15 favorites]


Just to follow up on rewil and push the curtain aside for anyone interested - and thank you storybored.
Talking Heads, Crosseyed and Painless.


Facts are simple and facts are straight
Facts are lazy and facts are late
Facts all come with points of view
Facts don't do what I want them to
Facts just twist the truth around
Facts are living turned inside out
Facts are getting the best of them
Facts are nothing on the face of things

posted by Meatbomb at 10:04 PM on July 6, 2020 [1 favorite]


For those, like me, wondering 'but wait, "deep understanding does not inspire strong feelings" can make sense if it's personal engagement that inspires strong feelings, which some people with deep understanding can have', the argument is that without some kind of personal engagement you can't have deep understanding.

This is something I keep thinking there's an obvious argument against but hey, that's privilege for ya. Everyone knows you can't be natively fluent in a language without living in a country where they speak it.
posted by Merus at 11:39 PM on July 6, 2020


MiraK, I don't think the authors were saying deep understanding is never accompanied by strong feeling. I think they were just saying that strong feeling is usually not accompanied by deep understanding.

Just say there was a particular topic, and a large number of people had strong feelings on it. Now say that only ten percent of those people actually had a deep understanding of it. Someone might observe all these people with strong feelings and say, where do all these strong feelings come from? Surely these people with such strong feelings must have a deep understanding? And the answer would be "In general, no." This doesn't mean that people with a deep understanding don't have strong feelings, it just means they're outnumbered by people with shallow understanding who somehow also feel entitled to have strong feelings.

To me this interpretation of the (admittedly ambiguous) statement fits better with the point being made in that part of the article.
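
A toy head-count (a few lines of Python with invented numbers; only the ten-percent figure echoes the scenario above) makes the arithmetic concrete:

deep = 100                            # people with deep understanding of the topic
shallow = 900                         # people with only shallow understanding
strong_deep = int(deep * 0.9)         # suppose 90% of the deeply informed feel strongly
strong_shallow = int(shallow * 0.9)   # and 90% of the shallowly informed do too
total_strong = strong_deep + strong_shallow   # 900 people with strong feelings
print(strong_deep / total_strong)             # 0.1 -- only 10% of strong feelings rest on deep understanding

So the claim can hold "as a rule" simply because the deeply informed are outnumbered, not because deep understanding fails to produce strong feelings.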
posted by mokey at 12:52 AM on July 7, 2020 [4 favorites]


I think I've derailed this discussion quite enough with this particular thread :) here goes my last note on this topic.

1. I'm quoting verbatim and going by the literal meaning of the words, so, no, that was not a statement about anything "in general"; that statement began with the words "as a rule", a near-absolute claim. It was ridiculous. Not ambiguous in the least.

2. I've noted multiple times here that I believe this was accidental bad wording, a case of researchers saying the second cousin of what they meant to say. Homeboy Trouble provided the entire original paragraph from the book which proved my hunch correct.

3. Nevertheless, I believe this mistake is meaningful because the opposite error would be much less likely to have been made at all by the researchers, and it would never have slipped past all the book's editors, nor been singled out for a quote in the New Yorker later without correction. It would have been blatantly obvious to every single person that "As a rule, strong feelings emerge from deep understanding" is a mistake. Why is it so hard for everyone to notice the mistake now? Answer: we have all been taught to think it's true by white male supremacist intellectuals.
posted by MiraK at 2:35 AM on July 7, 2020 [2 favorites]


It is rather amazing how blind we are to our own thought when we obviously know so much about how people thought "on the savannas". A clear appeal to unclouded reason if there ever was one.

The grotesque over-simplification of all the concepts involved makes this article hard to take very seriously, other than as a cute conversation-starting "think piece", which is unfortunate since that seems to be the most common way this gets reported on: a bit of everyone-does-it/both-sidesism, based around some vague notion of "facts" in a "scientific" sense, absent associated values and the range of meaning around what could constitute "right" and "wrong", or even better and worse, or more or less informed answers to any of it.
posted by gusottertrout at 3:05 AM on July 7, 2020


Language question, bear with ze furriner please... To me, "as a rule" is mostly used in the sense of "by and large"... i.e. the rule would be assumed to be valid on average, but have exceptions and outliers. Almost like a weasel phrase that softens the claim. Is this an academic thing? Or have I been misreading this?
posted by kleinsteradikaleminderheit at 3:07 AM on July 7, 2020 [1 favorite]


"as a rule" is mostly used in the sense of "by and large"... i.e. the rule would be assumed to be valid on average, but have exceptions and outliers.

I believe this is how it is generally used.

Convincing someone to change their mind is really the process of convincing them to change their tribe. If they abandon their beliefs, they run the risk of losing social ties. You can’t expect someone to change their mind if you take away their community too. You have to give them somewhere to go. Nobody wants their worldview torn apart if loneliness is the outcome.

Exactly. People are incredibly attached to the identities of those around them. It's why someone might identify as Republican or Democrat despite having personal viewpoints completely in opposition to those practiced by the party. You ask them their political party, what their view on, say, immigration is, and then what the views of the party are--and they'll say the party shares their views irrespective of what laws the party is trying to pass or the rhetoric used by its leaders. It's why you have Trump voters in Kentucky convinced he's not trying to take away the ban on excluding coverage based on pre-existing conditions despite all the evidence to the contrary. Their community is Republican, they are told respectable, smart, independent, freedom-loving people are Republican, so they identify themselves as Republican. Identification as a Democrat is "bad". It would make them "bad" and exclude them from their community. So why do it?
posted by Anonymous at 4:41 AM on July 7, 2020


One way to look at science is as a system that corrects for people’s natural inclinations. In a well-run laboratory, there’s no room for myside bias; the results have to be reproducible in other laboratories, by researchers who have no motive to confirm them.

This one got a laugh from me. Even in the supposedly more objective bench sciences you'll be hard pressed to not find people advancing various sides, along with the attached political dimensions. The study of science as a social institution isn't exactly new, and probably bears some consideration here.

There are multiple reasons that people distrust science that don't have a whole lot to do with whatever happened on the savannah six million years ago.

One of those reasons is a widescale societal effort to discredit scientific findings. If you look at responses to COVID you can see that this is rooted in culture and also in societal attitudes. Some communities around the world were perfectly fine having their minds changed by epidemiological facts. Even within modern American history we can see a shift in valuation of scientific experts over time, especially given the focused disinformation campaigns of the right-wing media.

Another reason is that the supposedly apolitical and rational domains of science are actually very harmful to people. I remember reading an article of long-form interviews and profiles of anti-vaxxers which showed that many of them found their way to anti-vax as an ideology because they were constantly disrespected and disbelieved by their doctors, which is a verifiably true phenomenon among women.

I agree with much of the critical response up-thread: the FPP link is an article that is oddly rooted in both evopsych (which itself is a discipline that is far from being uncontested, to put it politely), and in small-scale WEIRD findings which draw samples from poor college kids looking for extra money. For an article that seems to be premised on fetishizing moderate political thought and castigating trump (granted, being written shortly after 2016), it also spent very little time looking at the cultural and political dimensions of why trump was able to so thoroughly capture 30% of the American populace's mind. Here's a hint: it's not just biology.
posted by codacorolla at 7:22 AM on July 7, 2020 [4 favorites]


OK, I RTFA, and I read (well, scanned) all the comments so far. Which was an exhausting exercise: there's a whole lot of quibbling about inconsequential stuff there, along with some trying to make universal human traits out to be really about white male fragility.

It's possibly worth noting something that dovetails with the evidence presented in TFA, namely the sizable body of work demonstrating that most of what we think we perceive about the world is confabulation by our brains, and most of what we say about our motivations (even to ourselves) is after-the-fact rationalization about stuff that our bodies were going to do anyway.
posted by Aardvark Cheeselog at 8:02 AM on July 7, 2020


Here's the thing about that: the article didn't really present compelling evidence. It listed a few one-off psych studies and then postulated about "life on the savannah". That's not evidence. I won't argue that evidence for this particular line of thought doesn't exist, but it's certainly not conclusive, and is absolutely arguable. To pretend that this line of thought also doesn't have a cultural context (white male fragility as you blithely put it) is asinine.
posted by codacorolla at 10:39 AM on July 7, 2020 [1 favorite]


"As a rule, strong feelings about issues do not emerge from deep understanding"

can be accurately restated as

"As a rule, deep understanding does not generate strong feelings"
"

No, those are different statements, and that is the source of your confusion, though I recognize that I will likely be unable to persuade you of such.

The first says that the source of strong feelings (implied: about a given subject) is not deep understanding of the subject. For an analogy, as a rule, eggs do not become chickens — most eggs are not chicken eggs; the statement is misleading to the extent that chicken eggs are assumed. Or, most people with strong opinions about Berlin do not live in Berlin — also fairly trivially true, because there are so many more people who do not live in Berlin compared to those who do. This does not imply that Berlin does not generate strong feelings in those who live there, or that living there precludes strong feelings about Berlin. Similarly, there are likely to be many more with strong, shallow opinions on a given topic than there are people with strong, expert opinions, because there are far fewer experts and opinions are cheap.

You might think of it as related to the conjunction fallacy of probability.
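
To put rough numbers on the Berlin example (all of these figures are invented, a sketch purely to illustrate the asymmetry between the two conditional claims):

berliners = 3_700_000                      # people who live in Berlin
outsiders = 500_000_000                    # people elsewhere who have heard of Berlin
strong_berliners = int(berliners * 0.8)    # assume most residents have strong opinions about the city
strong_outsiders = int(outsiders * 0.05)   # assume few outsiders do -- but they are legion
print(round(strong_berliners / (strong_berliners + strong_outsiders), 2))  # ~0.11: most strong opinions about Berlin are not residents' opinions
print(round(strong_berliners / berliners, 2))                              # 0.8: yet living in Berlin very much generates strong opinions

The first statement constrains only the first ratio; it says nothing about the second.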

"To put it another way, does anyone here think that
Crimeans themselves have zero strong feelings either way about whether or not the USA should launch a military intervention in Crimea?

"As a rule, the more deeply to know Ukraine and Crimea, the more detached and apathetic you will feel about this subject"? LOL
"

The more you know about Ukraine, the less likely you are to endorse American military intervention?
posted by klangklangston at 10:40 AM on July 7, 2020 [2 favorites]


anyway. i think i bomb into all of these threads to say the exact same thing every time — never a good sign — but screw it let's go let's do this:

i think the focus on beliefs as derived from one's group identifications rather than facts per se is correct but also incomplete, and also that moving toward (what i see as) a more complete understanding can usefully inform our praxis as folks who believe we are right about things...

... and this is where i pause for a really important diversion. you'll notice that i have described us as "folks who believe we are right about things." i am here making a point: i am going to willfully ignore whether or not we are actually right about things. sure, yes, most of us here at least implicitly believe that our beliefs are better-grounded than the beliefs of antimaskers or antivaxxers or flatearthers, and we likely have good reason to consider those beliefs better-grounded... but that simply doesn't matter.

we have an epistemological strategy, founded in an admixture of the "believe scientists" epistemological strategy and the "also believe science studies people who point out the failures in the methodological practices of science as it actually exists in the world" strategy. we think this epistemological strategy is well-grounded, or at least better-grounded than the epistemological strategies of others. we would like others to use our epistemological strategies. others would like us to use their epistemological strategies. there is a struggle happening. we would like to win this struggle.1

diversion over; let's return to paragraph two. people on this thread have correctly identified that many humans derive their beliefs from the social formations in which they operate, rather than from "facts" or "rational debate" or whatever. i would like to broaden this: let's say that people derive their beliefs from their material condition. that people, including you and me, believe the things we believe because in whatever terms we believe what is good for us. if you get social benefits from an identification with flateartherism, if you're in a culture that holds flatearther beliefs, considers non-flatearthers as outsiders, preferentially socializes with flatearthers over non-flatearthers, preferentially hires flatearthers over non-flatearthers, most likely you'll do one of the following things:
  1. become a flatearther (the more common response, because in at least the short term presenting as a flatearther is materially good for you, and the easiest way to present as a flatearther is to actually believe in flateartherism)
  2. develop a sense of cognitive dissonance — maybe you've gone to the ocean and held up a meter stick to the horizon and understood the implications of what you've seen — and reach a point where this cognitive dissonance causes you material pain that outweighs the material goods you receive for presenting as a flatearther. your options now become 2a) to either present as a flatearther as best you can and write down your real thoughts in a private diary, or else 2b) leave your community behind and risk starving to death in the woods or on the streets.
you might be thinking: ah, i hope i am someone who is brave enough to choose option 2b! i love 2b-ers! not-2b-ers suck! let us all strive to become 2b-ers and to encourage everyone to be 2b-ers and we'll be a whole society of galileos saying e pur si muove2 and a new age of enlightenment will be on us.

there are problems with this, though. first, it's not realistic for us to all become 2b-ers — almost no one has such a well-honed sense of cognitive dissonance that they'll reliably take 2b in all situations. moreover, most 2b-ers are kind of crap at persuading others without that well-honed sense of cognitive dissonance, because most folks see the material suffering of the 2b-ers and (reasonably) decide that they want no part of that. and, let's be frank — most people who automatically choose 2b are total loons. a reflexive application of the 2b methodology is more likely to turn you into a sovereign citizen basketcase than it is to turn you into the next galileo.

and here's the point, and here's why i militate for broadening our understanding beyond just "humans are more likely to derive our beliefs from our group identifications than we are to derive them from 'facts'", and to "humans believe what's (materially) good for them." if we want to change peoples' minds, if we want to win that epistemological struggle that i talk about in the diversion above, instead of devising methods of rational debate in order to convince others of our facts, we need to establish a new fact. we need to make it more materially good for people to believe what we believe than it is to believe what our ideological opponents believe.

this is a dangerous idea. when we acknowledge that beliefs are derived from material circumstances rather than rational debate, a range of strategies open up to us — and only some of them are moral. some of them are wildly immoral and in the long run will lead to disastrous consequences. we could make it materially better for others to believe what we believe through leveraging whatever power we have to make it materially very bad indeed for others to have the wrong set of beliefs. we could enforce our beliefs at bayonet-point. we could take up the worst version of maoism and enforce our beliefs through establishing reëducation camps then forcing our struggle-opponents into them. we could enforce our beliefs by systematically denying access to resources to people who believe the wrong things — this is how, for example, capitalist systems enforce bourgeois ideology, since everyone knows in their bones that it's a big big risk to threaten capitalist hegemony by forming a union or a socialist party. we could enforce our beliefs by throwing a good old fashioned inquisition, even though we all know that an auto-da-fé is what we oughtn't to do but we do anyway.

... but we can also make it materially better for others to hold our beliefs by building societies, of whatever scale, and inviting others into those societies as best we can, and just by generally showing through our actions that being around folks who believe the way we do is just good for them. and they'll start to come around. there's at least a couple of ways. first is the concept from leftist ideologies of "prefigurative politics." you build miniature versions of the world you want to see around you, and folks start to see how good it is, and eventually (you hope) we reach a threshold where it becomes feasible to turn the world upside-down and make your prefiguration the dominant system. another way of thinking about it is expressed in the old christian chestnut that (how does it go?) "don't tell 'em you're a christian, do good works and let 'em figure it out for themselves." i'm not a christian, i'm a socialist (not to say that christianity and socialism are necessarily fundamentally incompatible), and so the thing that resonates for me is "don't tell 'em you're a socialist, do good works and let 'em figure it out for themselves" — of course, i'm saying this in a context where i'm explicitly telling you i'm a socialist, so ¯\_(ツ)_/¯

i guess the tl;dr version of this longwinded comment goes something like:

look it's really easy to figure out what motivates people it's basic materialist theory see first you have to observe that our understanding of the world is contingent and constructed in relation to the dominant mode of production and that under our current modality there are two circuits through which people relate to the commodity system first there is the c-m-c circuit in which most of us live then there is the m-c-m' circuit pronounced em-cee-em prime which

1: ah, you're thinking, this pynchon guy is being reductive. i don't just want to win an epistemological struggle — i want to whenever possible or necessary refine or revise my own epistemological strategy when others point out how i'm failing to meet my own standards. i am in a dialogue, not a fistfight! congratulations: you have identified an aspect of the epistemological strategy that you and i share, and that you and i would like others to adopt. this isn't a contradiction of the struggle model of epistemological dispute — what you've found is one of the things we're struggling for.
2: he didn't say it, though, that's a myth.

posted by Reclusive Novelist Thomas Pynchon at 10:45 AM on July 7, 2020 [5 favorites]


No, those are different statements, and that is the source of your confusion, though I recognize that I will likely be unable to persuade you of such.

... But I've been making the very point all along which you believe you can't persuade me of. Phew, semantic arguments are apparently very difficult to make in an understandable way! My argument IS that nobody has any basis for making claims about the source of strong feelings: neither that they do arise from deep (or shallow) understanding, nor that "as a rule" they do not arise from deep (or shallow) understanding.

But the statement's implication is that we ought to disregard opinions which are accompanied by strong feelings: the implication is proven by the many, many commenters on this thread who are quoting Tolstoy and Clint Eastwood and Bertrand Russell because that's how they understood it. It's not merely a statement about source. It has an agenda.

The more you know about Ukraine, the less likely you are to endorse American military intervention?

I suspect they feel pretty strongly about it, too!
posted by MiraK at 2:09 PM on July 7, 2020 [1 favorite]


That's a very cool comment, Thomas Pynchon. Makes me wonder, how might the idea of building an acceptance of heresies as a non-negotiable dogma of our culture fit into it? (no this is not self contradictory come on) 2a and 2b, sure, but what about option 2c, where your flat earther culture tries to embrace the silly fools and obviously wrong heretics and even encourages people to challenge flat earthism? I don't just mean it in a reddity "let's debate everything" way. It's just that religion has been on my mind recently and I appreciate so much the tolerance for doubt - even the embrace of doubt. And of course science is built on this idea of tolerating or encouraging challenge.
posted by MiraK at 2:30 PM on July 7, 2020 [1 favorite]


mirak: eh i’m bored of doing insightful long-form replies if anyone needs me i’ll be in the starship troopers thread posting emoji-laden comments about space orgies.

okay okay you’ve sucked me back in i’ll stop posting eggplants for a second. one thing your comment reminded me of was something the whelk said a while back about faction struggles within the dsa: he said, if i recall correctly, that even though he’s not in the anarchist faction he supports the anarchists, because quote the anarchists keep us honest end quote. i like that statement a lot; that’s why i remember it like years later.

we can generalize that statement to the argument that a society that makes space for heresy is more honest than a society that doesn’t. ... but okay, not to get all dialectical or anything, but also consider that there are ideologies that appear to be wildly anti-adaptive for societies that harbor them. most pressingly, we are learning that we cannot live alongside antimaskism, antivaxxism, and white supremacy without putting millions upon millions upon millions of lives at risk. the liberal impulse to err toward accommodating heresy is admirable, but sometimes we have to fall back into the relatively illiberal position of suppressing murderously bad ideas by whatever means are most effective.

part of my journey from liberalism to leftism was acknowledging that the idea of the liberal public sphere, where we try to hash out ideas through abstract reason rather than material force, has sharp limitations. and may be misbegotten from the start, since meaningfully participating in the liberal public sphere has always required that the speaker or their statements be in some way endorsed by those with material (read: financial) power.

there comes a point where one must in some way make the dangerous jump into genuinely having the courage of one’s convictions, even though one can never be certain that one’s convictions are right. making this jump into ideological struggle rather than rational debate means admitting that although one desires to make debate space for some types of heresy and living space for some types of heretic, in some cases material forces must in some way or another be arrayed in support of one’s own beliefs and against the beliefs of some others.

this is a terrifying claim. this is the reasoning deployed by inquisitors who have (they think) good reason to believe that tolerating infidels causes earthquakes and plagues and threatens the mother church. it’s the reasoning that led lenin and trotsky to turn around and murder everyone to their left — to decide that anarchists were undermining the revolution rather than keeping it honest. it’s the reasoning crocodile-tearfully deployed by eugenicists.

the world is not safe and the world is not certain and we have no way to tell what ideologies we are morally and socially required to tolerate and what ideologies we are morally and socially required to struggle against. sometimes being nice is not being good. sometimes truly following what appears to be good results in staggering evil. and there is no certain rubric for telling in advance which situation is which.
posted by Reclusive Novelist Thomas Pynchon at 2:57 PM on July 7, 2020 [3 favorites]


tl;dr and maybe more to the point i think i’d bundle 2c into footnote 1. we can ourselves decide for whatever reason to be flatearthers or not be flatearthers, but we have little direct say in whether the flatearther society we find ourselves in tolerates non-flatearthers. we’re back in the position where we must persuade others toward our beliefs, and if we have the courage of our convictions we have to figure out the best way to actually carry out that persuasion.

thinking that society will be better if it tolerates heresy is (more or less) part of our shared epistemological frame. but i think that pushing society toward tolerating heresy requires material persuasion of some kind or another — actually demonstrating in one way or another that it is better to be tolerant, prefiguring a tolerant society, making in whatever way life better for people who choose to be tolerant — rather than just rational debate.

(and also, there appear to be some ideologies we really really truly can’t harbor, that we’ve got to fight against by whatever means are most effective. the effects of antimaskism are so deadly that we absolutely must have the courage of our convictions. if we don’t, we will become complicit in the deaths antimaskism causes.)
posted by Reclusive Novelist Thomas Pynchon at 3:37 PM on July 7, 2020


sometimes being nice is not being good. sometimes truly following what appears to be good results in staggering evil.

See: 1932 and liberals vs fascism. Fascists will use whatever liberal sympathies they can to maintain their legitimacy in the "MARkEtPlACE OF idEaS".
posted by Your Childhood Pet Rock at 3:43 PM on July 7, 2020


but also see what the bolsheviks did in kronstadt and to nestor makhno’s ukrainian anarchist army (i knew i could bring it back around to ukraine somehow!)

to quote el-p from run the jewels: life’s a shitnado.
posted by Reclusive Novelist Thomas Pynchon at 3:46 PM on July 7, 2020


But the statement's implication is we ought to disregard opinions which are accompanied by strong feelings

I think there are two constructions which can be put on the statement:

1. When people have strong feelings, usually they didn't get them from a deep understanding.

2. When people have a deep understanding, usually it doesn't create strong feelings.

How much you think each of these interpretations was intended probably would affect how you feel about the statement. I agree that the second interpretation is unjustifiable, but to me, it doesn't seem at all like that was the intended meaning, from the context. The subject under discussion isn't "how do people with deep understandings feel?". It's "do people with strong feelings know what they're talking about?". Which strongly suggests the first interpretation. And the first interpretation does not imply deep understanding acts against strong feeling. It could still be true even if every person with deep understanding felt strongly, as long as they were significantly outnumbered.
posted by mokey at 2:21 AM on July 8, 2020 [1 favorite]


The framing itself is the mistake because it comes from a set of values, a belief in Scientism of a sort, which as part of its belief set denies it is in fact a value and considers itself a detached "objective" point of view free of prejudice, which it is most emphatically not.

You can see that in the article where the author switches from talking about people allegedly not knowing about how toilets work, as if that itself wasn't a relative concept, to talking about the Affordable Care Act. The reason people feel strongly about the ACA has far less to do with any specific clause it contains and everything to do with the values that frame the discussion the ACA is just one part of.

The competing desires for social justice and for "free markets" are the animating concerns, which is why the ACA itself becomes a flashpoint for drawing out the collision of underlying ideologies. It isn't as if two Senate staffers, each completely informed of every line of the ACA, couldn't have strong emotions about it, or that someone who didn't like some elements of it couldn't still feel strongly that it is at least better than some alternative, or that someone who didn't know much of anything about the specific wording couldn't still be informed enough about the general values to make a good choice between supporting the ACA or keeping things as they were.

The framing and assumptions around underlying values and perspectives on how the world works are everything and often are necessarily based on desired outcomes of uncertain likelihood rather than "facts" alone, but Scientism likes to hold all that at a remove and pretend its adherents can sit in distant impartial judgement and come up with answers that aren't affected by bias or desire, which is frankly ridiculous. (Not that taking a more detached view doesn't sometimes have benefits of course, just that pretending that is how things best work is the problem.)
posted by gusottertrout at 3:08 AM on July 8, 2020 [2 favorites]


Aardvark Cheeselog: there's a whole lot of quibbling about inconsequential stuff there

This strikes me as an unhelpful comment. It's vague and dismissive of other (unnamed) discussion participants without adding anything positive to the discussion.

along with some trying to make universal human traits to be really about white male fragility.

Followed by this statement, I suspect I could guess who the "some" you're referring to here are, and if I'm right, I think your statement is a mischaracterization of their arguments.

MiraK: their error just so happens to reinforce the male supremacist and white supremacist dogma unconnected "objective" "rational" third parties are the best judge of everything.

Agreed - it's easy to sound detached and objective if one is speaking from a place of privilege and doesn't feel that their lives, health or livelihoods (or those of their loved ones) are at stake.

It's also easy for someone in a place of privilege to use the statement in question as grounds for ignoring or discounting the strongly-held positions of those who are less privileged, simply because the positions are strongly held / passionately communicated.

MiraK: I have strong feelings because it affects me and it's a matter of life and death for me and I've lived all my life experiencing this and I know it?!

Thanks for the work you did here. It was painful to read some of the responses to your repeated explanations. I heard you, and I wish more people in this thread would have simply listened and thought about what you had to say.
posted by syzygy at 2:55 PM on July 8, 2020 [1 favorite]



