"Be skeptical. But when you get proof, accept proof."
August 27, 2013 5:08 PM

 
Before you begin reading, take note of the authors and their institutional affiliations. Some institutions (e.g. University of Texas) are well-respected; others (e.g. the Discovery Institute) may appear to be legitimate research institutions but are actually agenda-driven. Tip: google “Discovery Institute” to see why you don’t want to use it as a scientific authority on evolutionary theory.

Always worth bearing in mind - provenance is important and the authors may have an agenda based on their funding.
posted by arcticseal at 5:15 PM on August 27, 2013 [3 favorites]


This is great - thank you! The tip about reading the abstract last was a good one. I read lots of things published on SSRN (which are not "hard" science papers) and various research papers, and coming from a non-scientific background, I found this post very helpful.

It reminded me of a paper I printed out a while back about statistical literacy and have only gotten halfway through. It's long but worth the read (in my non-expert opinion):

Helping Doctors and Patients Make Sense of Health Statistics (PDF)
posted by triggerfinger at 5:31 PM on August 27, 2013 [7 favorites]


Oh man, I need to save this link. It shall be the new "Let Me Google That For You."

I disagree with one of the commenters at that link about the value of non-scientists learning to deep-read scientific papers. I don't think that critical thinking is a magic trick taught in grad school - it's something anyone can develop with time. I often want to dissect papers in fields I have absolutely no knowledge in, like genetics, especially when they are in the news. I don't think that doing so makes me capable of being a geneticist, but I can do it well enough to get a general sense of how much "spin" is being applied by the press release. That seems valuable to me.
posted by muddgirl at 5:36 PM on August 27, 2013 [2 favorites]


Any time the words “significant” or “non-significant” are used. These have precise statistical meanings. Read more about this here.

Good piece. I remember running into a situation here on Metafilter where a paper's findings were misinterpreted until I had to explain the difference and how that changed understanding of the entire paper. We could use more basic statistics training in school.
posted by Blazecock Pileon at 5:41 PM on August 27, 2013 [3 favorites]


Good piece. I remember running into a situation here on Metafilter where a paper's findings were misinterpreted until I had to explain the difference and how that changed understanding of the entire paper. We could use more basic statistics training in school.

Was it this?
posted by John Cohen at 5:49 PM on August 27, 2013


No, it was another thread.
posted by Blazecock Pileon at 5:59 PM on August 27, 2013


This is a good link for people who are already pro-science and interested in learning more but it won't do anything to change the minds of anti-climate changers or anti-vax dummies. They didn't arrive at their positions through reason and reason won't convince them to change their minds.
posted by Justinian at 6:14 PM on August 27, 2013 [1 favorite]


I so need to go snarf up a sockpuppet with some variant of "sciencebarbie" to make the appropriate eponysterical comment:

Science is Hard!
posted by sammyo at 6:26 PM on August 27, 2013


As you read, write down every single word that you don’t understand.

Whew, yes, really excellent blog entry, next week an example! But in a field outside of significant expertise, just a lot of (incredibly worthwhile) work.

(although I expect even this good advice is moot for most serious math or theoretical physics papers)

((and actually even knowing words you don't actually know the meaning of (scientifically or in a particular field) could be a bunch of research, the example of "significance" being a significant example.))
posted by sammyo at 6:36 PM on August 27, 2013 [3 favorites]


sammyo: (although I expect even this good advice is moot for most serious math or theoretical physics papers)

I'm not sure it'll help with data papers in most hard sciences. I can't imagine understanding them without a strong working knowledge of whatever subfield they are in, although you might still learn something by reading them. Generally it feels like non-scientists would be best served by secondary sources in most fields, although there are certainly exceptions where they would want to turn to the primary literature.
posted by Mitrovarr at 6:42 PM on August 27, 2013


Any time the words “significant” or “non-significant” are used. These have precise statistical meanings. Read more about this here.

Yep. Precise statistical meanings, relative to an arbitrarily chosen and hotly contested p-value...
posted by Jimbob at 6:44 PM on August 27, 2013 [5 favorites]
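
Jimbob's point is easy to make concrete with a quick sketch. The coin-flip experiment and the numbers below are invented for illustration, not from the linked piece; the same data comes out "significant" or "non-significant" depending entirely on which threshold you pick:

```python
import math

def two_sided_p_value(heads, flips, p_null=0.5):
    """Two-sided p-value for a coin-flip experiment, using a
    normal approximation to the binomial (a standard z-test)."""
    mean = flips * p_null
    sd = math.sqrt(flips * p_null * (1 - p_null))
    z = abs(heads - mean) / sd
    # erfc(z / sqrt(2)) is the two-sided tail probability of a standard normal
    return math.erfc(z / math.sqrt(2))

# 59 heads in 100 flips of a supposedly fair coin
p = two_sided_p_value(59, 100)
print(round(p, 3))   # ~0.072
print(p < 0.05)      # False: "non-significant" at alpha = 0.05
print(p < 0.10)      # True:  "significant" at alpha = 0.10
```

Same coin, same flips; only the arbitrarily chosen alpha changed.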


I can hear my non-sciency friends going "TL;DR" when I consider sending this to them.

I believe #s 5 & 6 are expecting waaaaay too much from non-scientists. Most attacks I see within the various communities come from these two points e.g., "That method/study design requires a larger sample size" or "They should have used a bi-linear Markovian non-linear blah-blah instead to analyze those data." Heck, most scientists would struggle with number 6....

Rule number one: Show me the data.

Everything else follows....
posted by CrowGoat at 6:49 PM on August 27, 2013


I dunno. I guess I did take basic science classes, but I've successfully used this technique to analyze "data papers" in hard sciences that I'm not at all familiar with, such as this paper which explored which genes determine whether a dog is long-haired or short-haired. Before I started reading closely it looked like a jumble of nonsense jargon. After analyzing it, I couldn't definitively state, say, that it's unlikely that anyone will breed a long-haired greyhound through selective inbreeding, but I have a better understanding of the factors involved.

I think that most people, if they really wanted to, could figure out many "data papers." I think the impediment isn't any specific scientific knowledge, it's desire or motivation.
posted by muddgirl at 6:53 PM on August 27, 2013 [2 favorites]


I would add paying attention to base rates and effect sizes. This tends to be the biggest confusion / omission in popular press stories on science.
posted by srboisvert at 6:54 PM on August 27, 2013 [2 favorites]
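
The base-rate point can be made concrete with Bayes' rule. The screening-test numbers below are hypothetical, in the spirit of the Gigerenzer PDF linked earlier in the thread, and chosen only to show why a "90% accurate" test can still mean a positive result is usually a false alarm when the base rate is low:

```python
def posterior_given_positive(base_rate, sensitivity, false_positive_rate):
    """Bayes' rule: probability of actually having the condition,
    given a positive test result."""
    true_pos = base_rate * sensitivity                 # P(condition) * P(+ | condition)
    false_pos = (1 - base_rate) * false_positive_rate  # P(no condition) * P(+ | no condition)
    return true_pos / (true_pos + false_pos)

# Hypothetical screening test: 1% base rate, 90% sensitivity,
# 9% false-positive rate
p = posterior_given_positive(0.01, 0.90, 0.09)
print(round(p, 2))   # ~0.09: a positive result still means only ~9% odds
```

A press story that reports "90% accurate" without the base rate omits exactly the number that matters.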


(Not that I think more people should do this - it does take quite a bit of time that could probably be better spent elsewhere. I just wish they would do it in some cases, such as on Wikipedia when people try to use scientific papers as support for "facts" that the paper doesn't actually support.)
posted by muddgirl at 6:55 PM on August 27, 2013


Rule number one: Show me the data.

Rule Zero is "give the IEEE or ACM $20", sadly.
posted by mhoye at 7:22 PM on August 27, 2013 [4 favorites]


The abstract? May or may not have anything to do with any of the experimental findings.
posted by telstar at 7:43 PM on August 27, 2013 [1 favorite]


I would love to see such discipline applied by the climate change crowd, instead of the typical emotional "there's a consensus, you duped denier."
posted by Home Treadmills at 7:49 PM on August 27, 2013 [1 favorite]


I would love to see such discipline applied by the climate change crowd

The "climate change crowd"? You mean the people who are actually...you know...doing scientific research and publishing papers on climate change? I don't understand what you're trying to say here...

I agree, however, that science isn't advanced by "consensus". It's advanced by evidence supporting hypotheses. Which climate science has plenty of.
posted by Jimbob at 7:53 PM on August 27, 2013 [5 favorites]


I wish a lot of science 'journalists' would follow this advice.
posted by empath at 8:12 PM on August 27, 2013 [4 favorites]


Also: by my rough estimation anywhere from 1/6 to 1/4 of papers will have fundamental errors, often of interpretation or statistics or even arithmetic, which lead to different conclusions than the ones they have put forward.

The point about understanding the method is very important too. Far too often steps are left out, sometimes intentionally.
posted by bonehead at 8:17 PM on August 27, 2013


I would love to see such discipline applied by the climate change crowd, instead of the typical emotional "there's a consensus, you duped denier."

Perhaps rule 0.1 should be understanding the differences between scientific papers and editorials.
posted by srboisvert at 8:38 PM on August 27, 2013 [5 favorites]


muddgirl: I think that most people, if they really wanted to, could figure out many "data papers." I think the impediment isn't any specific scientific knowledge, it's desire or motivation.

I guess it depends on the paper. That paper seemed pretty simple and straightforward to me (but then again, I specialize in genetics). However, not every study is like that. I know I've tried to read astrophysics papers and bounced off when it was clear that I had neither the vocabulary nor the math to make head or tail of it. And a lot of papers in biology assume a high level of background knowledge of the organism/molecular pathway/ecological system, without which the meaning of the results will be lost.
posted by Mitrovarr at 10:29 PM on August 27, 2013


This is a good link for people who are already pro-science and interested in learning more but it won't do anything to change the minds of anti-climate changers or anti-vax dummies. They didn't arrive at their positions through reason and reason won't convince them to change their minds.

This is possibly too harsh - I've encountered various anti-(insert hobby horse) people who are otherwise sane and logical, in addition to those who are going on gut feeling alone.

The problem is they have different premises. They don't trust or believe 'big pharma' and/or the government, usually due to some personal event which may or may not have been misinterpreted.
So when a consensus is available that conflicts with their personal view, no matter the evidence, it is dismissed as they simply don't trust the sources not to be manipulating the evidence. At all. They go to their trusted small band of 'mavericks' who reinforce their belief that they're being lied to by Big Science/government/pharma.

Think of it this way; you see a new paper that claims to poke big holes in vaccination theory. You see it being touted by Jenny McCarthy, and it was funded by an anti-vaccination think tank. Your go-to science blog (say, Bad Science) calls it garbage. Are you really going to bother reading, researching and dissecting the paper yourself? Even if you did, are you going to try and disprove its claims as you go?

Now, in this case, you'd have plenty of reasons to ignore it; vaccination theory is pretty well understood, has a wealth of evidence to back it up, and the vast consensus is that vaccination is a good thing. Extraordinary claims require extraordinary evidence etc.

But if we're honest, most of us go by argument by authority by default, simply because we wouldn't have enough time in the day to do anything else if we didn't do that for the vast majority of what we accept as true.

So in amongst the gut worshippers, the woo merchants and the lunatic fringe, there are a significant number who at least think they are using reason. Poorly, maybe, but they're not going to be reachable by calling them loons either, because then they just dismiss YOU as another sheep fooled by Big Science. They may well be manipulated by some wealthy industry with an agenda to protect its own profits (tobacco and oil being two such examples), but since they trust the wrong sources, merely whipping out science articles we found compelling won't do it.

And this is a problem, not because I care about what they think particularly, but because they often are a roadblock to achieving sufficient political consensus to actually tackle our big problems. Well-funded anti-conventional science is itself a big problem as a result, and we need to find a way to tackle it before we all drown together.
posted by ArkhanJG at 12:52 AM on August 28, 2013 [9 favorites]


Actually, I think one of the best things about scientific literacy is that it lets you smell a rat in papers you didn't suspect. Once you learn some things about the basics of the way science works and read a fair number of scientific papers, you start to be able to catch these things. You start to detect the stink of woo, you catch things like small sample sizes or circular reasoning, and you spot data sets that are 'too perfect' to have actually existed in the real world. I've definitely seen papers where a person could go in with only routine caution, and see the problems well enough to relegate it to the circular file.

So more scientific literacy would probably help, since people would read the papers in their field and realize that all of them are either suspicious or garbage. At the very least it could help rescue the sincere advocates that exist at the medium levels, below the bribed ex-scientists and outright con-men that make up the publishing tier, and the idiots who fill up the lower end without really knowing anything about the issue. Clearing up the middle level would probably help a lot, because they add a lot of credibility, and form a lot of the support for the movements.
posted by Mitrovarr at 1:18 AM on August 28, 2013 [1 favorite]


Rule number one: Show me the data.

If you don't know the methods, the data is useless.
posted by eriko at 5:51 AM on August 28, 2013 [3 favorites]


Always worth bearing in mind - provenance is important and the authors may have an agenda based on their funding.

I was thinking about this earlier because I do this instinctively yet it seems to rub uncomfortably against the issue of provenance by authority that Sokal and others caution against when evaluating claims.

I think within a field of experts with the training and experience necessary to grok the methodological soundness of an experiment or theory and weigh its implications against current understanding, that caution is absolutely valid and necessary. Scientists should evaluate scientific claims based on the standards and rigors of scientific inquiry and not on reputations.

However, I'm a layman, and often unqualified or incapable of making merit-based evaluations about subjects above my head. As such, my responsibility is to understand as much as I can about the subject at hand to the best of my ability and to distinguish rigorous thinkers/writers/experimenters from their less rigorous peers and give their evaluations more weight, as long as those evaluations are always evidence-based and clearly articulated.

The second part is key. I can't learn to understand the math at the core of many of my favorite subjects, but I can teach myself to evaluate theoretical arguments in a rigorous manner and learn to spot specious reasoning.
posted by echocollate at 6:48 AM on August 28, 2013


I'm surprised the author (of the linked piece) did not suggest reassessing the data/methods in light of whether the reader agrees with an author's interpretations or recommendations based on their presented findings.

Because nothing helps me see the flaws in an article/book like reading the main body and thinking 'Hmmm, well, this is a very interesting piece of that puzzle--you could interpret this a number of ways.' And then reading the conclusion and thinking 'WTF, Dr Smith? You're drawing grand conclusions and making sweeping policy recommendations based on shaky assumptions and evidence not in the record.'

And then I go back and can see where they are (consciously or not) choosing a methodology or data to support a foregone conclusion.
posted by K.P. at 10:08 AM on August 29, 2013


Man, I can't help thinking that the previous FPP is the natural bookend to this post.

I've been reading a bunch of articles about climate change and the denial thereof, and just finished a lengthy conversation about an "alternative" therapy and why it's not evidence-based medicine. It seems clear that the folks who believe the stories about why climate change is a hoax, or who fight tooth and nail to hold on to their faith in a baseless therapy, will never be convinced by the fruits of the scientific method. On the other hand, there's the whole discussion about Science as a belief system and the pitfalls of scientism. I have faith in science as a kind of Platonic ideal - it's a belief system that tries to provide its believers with the tools to test their own truths. But reading all these arguments on the Internet makes me wonder if anyone is ever going to be convinced by the acres of verbiage posted, the endless carefully thought out responses, and the breathtaking wingnuttery around every corner.
Borges imagined the takeover of reality as a nightmare. Having seen the world consumed by the "intoxicating symmetries" of Ideology in the thirties and forties, he had every right to. But what if we took Tlön as a guidebook instead of a warning? More and more, memoirs feel spent and novels exhausted. Why not try a third way? Call it Tlön, after Borges’ imaginary planet. Tlön isn’t really a genre, but a set of strategies.
Is there a third way, or are we just waiting for these two worlds posts to cancel each other out?
posted by sneebler at 5:19 PM on August 29, 2013


I like the link to disreputable journals... but they forgot the Journal of Universal Rejection!
posted by chapps at 9:25 PM on August 29, 2013


"Yep. Precise statistical meanings, relative to an arbitrarily chosen and hotly contested p-value..."

...Which is always explicitly stated with the statistical tests used, so that you can judge for yourself the significance of the author's significance if you know how.
posted by Blasdelb at 8:18 AM on August 30, 2013


"I like the link to disreputable journals.. but they forgot the Journal of Universal Rejection!"

There must be a typo in here somewhere, the Journal of Universal Rejection can be quantitatively demonstrated to be the most reputable journal. It only meets a couple of the criteria used, and only in trivial ways, while unlike other journals one can trust absolutely that it has never published research that is false, doctored, unethical, unsuitable, dangerous, or even honestly misleading.
posted by Blasdelb at 8:25 AM on August 30, 2013 [1 favorite]


How to Read and Understand a Scientific Paper: The Musical

(Sung to the tune of "developers, developers, developers" by Steve Ballmer).

"Methodology, methodology, methodology, methodology! Methodology, methodology, methodology, methodology! Methodology, methodology, methodology, methodology!"
posted by gd779 at 2:38 PM on September 9, 2013




This thread has been archived and is closed to new comments