
Loss of critical faculties
August 24, 2012 1:06 PM   Subscribe

Oxford University neuroscience professor Dorothy Bishop delivers a scathing lecture (text version) about the overselling of weak neuroscience, both in the news and within the scientific literature.

Bishop blogs at BishopBlog.
Neuroscience controversy previously, previously.
posted by overeducated_alligator (22 comments total) 30 users marked this as a favorite

An hour long lecture and a 17 page PDF are a bit much to digest, especially for a layperson... is there a summary? The abstract isn't particularly scathing.
posted by desjardins at 1:54 PM on August 24, 2012

"is there a summary"

There's a conclusion that has some mild scathe:

"In our current state of knowledge, it would be better to spend research funds doing well designed trials of behavioral treatment to establish which methods are effective, rather than rushing headlong into functional imaging studies of unproven treatments."

posted by Phyllis Harmonic at 2:29 PM on August 24, 2012 [2 favorites]

From the intro, for an idea:

A review of six studies of neuroimaging correlates of language intervention found recurring methodological problems: lack of an adequate control group, inadequate power, incomplete reporting of data, no correction for multiple comparisons, data-dredging, and failure to analyse treatment effects appropriately. In addition, there is a tendency to regard neuroimaging data as more meaningful than behavioural data, even though it is behaviour that interventions aim to alter.

A bit like what that fellow was doing at the New Yorker, right? Scraping for data to support a provocative premise and overselling legitimate science as applying more broadly and deeply than it should?
posted by BlackLeotardFront at 2:29 PM on August 24, 2012 [3 favorites]

I just read the paper (not every word, but I got the gist). It's essentially a takedown of six neuroimaging studies of interventions for language development. It does a good job of showing why the selected papers don't hold up to scrutiny. There is also some interesting evidence presented about how people respond to neuroimaging images, but that's actually mostly from the work of different folks (Weisberg et al, McCabe).

The point is well-taken, but I'd love to have seen a stronger paper. People are gaga for neuroimaging, and seem to think we know a whole lot of stuff we don't, just because someone somewhere was given an fMRI.
posted by OmieWise at 2:34 PM on August 24, 2012 [1 favorite]

This reminds me of the most awesome, insightful, and hilarious poster ever presented in the history of science (PDF)

Goddamn is it beautiful
posted by Blasdelb at 2:54 PM on August 24, 2012 [21 favorites]

How can I tell whether I agree with this article if I can't see what parts of my brain are lighting up while I read it?
posted by escabeche at 3:08 PM on August 24, 2012

This reminds me of the most awesome, insightful, and hilarious poster ever presented in the history of science

Actually that's when she really starts getting into the meat of it...after she mentions the fish.
posted by AElfwine Evenstar at 3:35 PM on August 24, 2012

is there a summary?

Yes: The Science of Bad Neuroscience
posted by homunculus at 3:36 PM on August 24, 2012 [1 favorite]

Already I've learned that the development of CT scanning was due to The Beatles. Money from Beatles record sales helped fund research by an EMI engineer.
posted by Obscure Reference at 4:16 PM on August 24, 2012

There was actually quite a bit of bad-neuroscience takedown in Delusions of Gender: How Our Minds, Society, and Neurosexism Create Difference.
posted by Foosnark at 4:32 PM on August 24, 2012 [3 favorites]

Good call on Delusions of Gender. It's full of quite specific examples of poor research, worse assumptions, and their negative impact on the real world.

It's also a great read.
posted by spectrevsrector at 4:44 PM on August 24, 2012 [1 favorite]

More previously. The dead fish study.
posted by Obscure Reference at 4:47 PM on August 24, 2012

Bishop is very polite, and she is cautious to avoid painting an entire field with the same brush, but the problems she is pointing out are widespread and she does an uncommonly good job of conveying clearly why those problems need to be taken seriously (regression to the mean is an especially pernicious phenomenon). Her talk is a blistering takedown not because it adopts a fire-and-brimstone tone, but instead because she's quietly pulling back one curtain after another, revealing the Great & Mighty Oz for the prestidigitator he is.
posted by belarius at 5:15 PM on August 24, 2012 [3 favorites]

Back in the 60's, my father, a psychiatrist, a behaviorist in that brief interval between Freud and Pharmaceuticals, sometimes sighed about the tendency for laymen to be impressed with brainwave studies. At the time, looking at Alpha states, etc., was considered to be pretty impressive stuff. A SCIENTIFIC analysis of consciousness.

Fast forward a half-century: same thing. Some of my best friends are journalists, but there are certain stories that sell, and they write these stories. If it bleeds, it leads: truer than ever.

Now: if the words neuroscientist or fMRI or MRI or neuroimaging (even better with pics!) are in there, it gets a kind of print cred that is good as gold.

Yes, I am also a sucker for this kind of evidence, sometimes. I can't help it. It just lights up my prefrontal cortex in sync with my limbic system in the same kind of way that dropping a Valium and a Vicodin can do.
posted by kozad at 5:42 PM on August 24, 2012 [1 favorite]

I wish people would get over the salmon study. It's ironic that it's trotted out as an example of how we shouldn't over-interpret neuroimaging, when it itself was wildly oversold and over-hyped. All it shows is that if you take one session of pure noise and use a somewhat liberal threshold, you'll find junk. Most studies involve multiple subjects, with noise being approximately randomly distributed and therefore highly unlikely to replicate across subjects. And the other point it was trying to make, about multiple-comparison correction (and using the false discovery rate in particular), was made much more cogently, and with less silliness, in the very well cited paper on FDR in neuroimaging some 8 years earlier (2,200 citations and counting).

As for Bishop's talk, I also think it's being over-hyped. It's very easy to take any field, pick papers published in low-tier journals* and then say "hey, look at the suck!". The trouble is that Bishop is in that field and knows exactly where to find the rubbish bin.

*Of the 6 papers, only 2 were in good journals. One was in PNAS: it was the first of its kind and frankly did have issues, which one of the authors (now a very respected neuroscientist) acknowledged on Bishop's blog, calling them rookie mistakes due to the early state of the field. The other paper in a good journal (Cerebral Cortex) Bishop actually finds to be ok.

The reality is there are shitty studies in shitty journals in every field. Digging through the detritus is like wading through a sewer and then complaining about how dirty the rest of the city is.
posted by Smegoid at 6:44 PM on August 24, 2012 [2 favorites]
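[The multiple-comparisons point under discussion — the salmon study's and the FDR paper's — can be sketched in a few lines. This illustration is mine, not from either paper: under the null, p-values are uniform, so an uncorrected p < .05 threshold "activates" about 5% of pure-noise voxels, while a Benjamini-Hochberg step-up procedure at the same level rejects essentially nothing:]

```python
# Sketch (mine, not from the cited FDR paper): uncorrected thresholding vs.
# Benjamini-Hochberg FDR control on pure noise.
import random

random.seed(0)
n_voxels = 100_000
# Under the global null, each voxel's p-value is uniform on (0, 1).
pvals = [random.random() for _ in range(n_voxels)]

# Uncorrected p < .05: roughly 5% of pure-noise voxels "light up".
uncorrected_hits = sum(p < 0.05 for p in pvals)

def benjamini_hochberg(pvals, q=0.05):
    """Indices rejected by the BH step-up procedure at FDR level q."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    k = 0
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= q * rank / m:
            k = rank  # step-up: keep the largest rank passing its threshold
    return order[:k]

fdr_hits = len(benjamini_hochberg(pvals))
# fdr_hits is (almost always) zero here: with nothing but noise, even the
# smallest of 100,000 p-values rarely clears its rank-adjusted threshold.
```

[That gap — thousands of uncorrected hits versus essentially none after correction — is the whole dead-fish punchline, minus the fish.]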

Many studies are not conducted with enough intellectual rigor, so the claimed statistical significance of the results is actually spurious. She goes on to explain, with examples, how studies can be flawed by poor experimental protocol. In her presentation she rips a few new assholes in existing studies that had been peer reviewed and cited in other papers.
One paper was cited over 240 times; when the researcher examined 50 random citing papers, she found only one that recognized the weakness of the study.

That is one hell of a lot of bullshit based on bullshit. At best intellectual laziness, at worst out-and-out incompetence and/or fraud. Research monies wasted.
posted by pdxpogo at 8:11 PM on August 24, 2012

Smegoid, the poorly refereed journals for shitty neuroscience are ubiquitous. However, the grants for shitty neuroimaging somehow keep getting funded, the PI clowns and their students somehow keep getting access to those fantastically expensive machines to abuse, and somehow they keep not getting fired despite inspiring profoundly shitty news articles that often aren't even misrepresenting them. There certainly aren't shitty journals in every field, and I can't think of many fields with neuroscience's scale of shitty journals for cranks who couldn't statistically distinguish a quarter in a wind tunnel from pi.

Maybe the stink seems normal if you sit in it long enough, and I don't envy the many folks doing good work, but the shit levels being complained about here are not normal - even for experimental psychology.
posted by Blasdelb at 8:36 PM on August 24, 2012 [1 favorite]

Lite-Brite phrenology, folks, Lite-Brite phrenology.
posted by benito.strauss at 10:24 PM on August 24, 2012

This is more of an indictment of functional imaging studies. Painting all of neuroscience with such a broad brush is provocative, but not necessarily accurate.

Overselling isn't only confined to neuroscience.

Also, science stuff about the brain and stuff is sexy and easy to write provocative layperson/tabloid articles about.

Smegoid brings up an excellent point. But at PNAS, if the principal (as opposed to first) author is a member, the peer review process is far, far less rigorous, at least according to the grapevine. I've not submitted a paper to PNAS myself.

As for Cerebral Cortex (CC): it has an impact factor of 6.8-ish, which is pretty darned good (compared to the Journal of Neuroscience at 7.2-ish). But coming from an R1 academic neuroscience research centre that publishes in high-impact journals far more often than comparable universities, I can't recall anyone submitting to CC. I can't recall a CC article being presented at any journal club, or any visiting speaker presenting data that was published in CC.

Dicey science published in dicey journals? Dicey lay journalists misinterpreting/misunderstanding scientific literature?

Old news. How has Baroness Greenfield been doing lately, after her downfall? She was invited to give a presentation and lunch at my research centre. Her distaste for the pizza aside (I don't blame her), she seemed extraordinarily close-minded in response to my questions about her belief that neural systems are necessarily "fast." Otherwise, I was not impressed with her science, except that she was able to get funding for it (and get it published?).

The problem with translating neuroscience research for non-specialists (much less non-scientists) is that it is incredibly complicated. It takes an incredible amount of background to understand why new findings are important, more so, I feel, than in other fields. Another problem is that we simply don't know the answer when people ask us how memories are stored. Cognition/memory/behavior is incredibly complex and no, there are no easy explanations, partially because we don't understand what's going on at the physiological level and partially because of the harmful meme that "we might one day zap memories away."

I'm having a hard time giving my "job talk" to non-neuroscience entities where they limit the talk to 20 minutes. I end up spending as much time explaining the biology as my own work, both in the presentation and during the question period. I need to work on that.

The structure of memory is completely unknown. We have no fucking idea (if you say LTP/LTD, I'm going to punch you in the face, because you don't know either). Stop asking us that; we're asking it ourselves every day. It's a hard problem, please stop bugging us about it. No, we can't "zap away" memories, and even if we knew enough to try, "zapping away" individual memories is likely not possible. Yeah, there are the beta-blockers and PTSD, but I'm not entirely sold on the mechanism (especially the PKMzeta story); I think there are better ways of achieving similar aims with different approaches.

Paradoxically, I think that if there were more funding and a greater percentage of projects funded, there'd be less of this bad science being published.

The major cause of a certain segment of bad science may be that people (who think they are good) aren't being funded. Some might feel the need to fake it in order to get the funding to do proper science. I disagree strongly, but it's a thing. It's really hard to do good science without money; money used to come from the government, and if the government isn't spending on science, and what it does spend comes with demands that your results coincide with the granting agency's agenda, well...
posted by porpoise at 11:18 PM on August 24, 2012 [2 favorites]

The reality is there are shitty studies in shitty journals in every field.

Yes, and what can happen is that someone reads the abstract, skims the paper, decides that it supports their argument (and besides, one wouldn't want to give room for reviewers to critique you for NOT citing it!) so they cite the bad study. Then other people cite it too. So "bad study in bad journal" becomes "Johnson et al. (2003) found that..." and the paper becomes "truth". So, when you write something that touches on the same topic, reviewers ask "but what about Johnson et al?" and you have to figure out how to accommodate the reviewer, because if you fight them, your paper might get delayed or not published at all.

Bad papers in bad journals ARE a problem. Also, bad papers in good journals.
posted by Philosopher Dirtbike at 11:28 PM on August 24, 2012 [1 favorite]

Lite-Brite phrenology, folks, Lite-Brite phrenology.

I can't remember who I heard this from, but my favorite nickname for functional imaging research is "haemophrenology". It's the term going around behavioral research circles that haven't drunk the kool-aid.
posted by Philosopher Dirtbike at 11:33 PM on August 24, 2012 [1 favorite]

Blasdelb: "This reminds me of the most awesome, insightful, and hilarious poster ever presented in the history of science (PDF)

Goddamn is it beautiful"

See, I thought you were going to link to Japan's Phillips Curve Looks Like Japan.
posted by pwnguin at 9:36 AM on August 25, 2012 [3 favorites]


This thread has been archived and is closed to new comments