Problems at a Food Science Laboratory
February 12, 2017 5:57 AM

Brian Wansink has been mentioned previously on the blue. In a post on his own blog, he described the genesis of a series of four articles on pizza consumption in terms that sound very much like p-hacking. This has led to some problems.
posted by Pararrayos (36 comments total) 28 users marked this as a favorite
 
Color me very surprised that the original blog article has remained up (albeit with two heavy addenda at the top).
posted by the antecedent of that pronoun at 6:09 AM on February 12, 2017 [2 favorites]


Is it just me, or do the two addenda to the blog post just make these publications appear worse? Collecting a bunch of data to look for correlations is an OK technique to work towards a testable hypothesis, but you can't then back-test (or what he calls "pressure test") that hypothesis on your original data set, for obvious reasons.

Also, if they didn't go into data collection with a hypothesis, as claimed in addendum II, then why did he originally state that the study had "null results"?
posted by muddgirl at 6:41 AM on February 12, 2017 [2 favorites]
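
(To make the point above concrete, here is a minimal simulation sketch in Python - invented numbers only, nothing to do with the actual pizza data. If you dredge twenty pure-noise measures for the best-looking correlation and then "pressure test" that winner on the same dataset, it comes up "significant" most of the time; only a fresh dataset brings you back to the nominal 5% false-positive rate.)

# A minimal sketch of why re-testing a data-dredged hypothesis on the same
# data is circular. All names and numbers are illustrative. Requires numpy/scipy.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_diners, n_measures, n_sims = 100, 20, 1000
same_data_hits = 0
fresh_data_hits = 0

for _ in range(n_sims):
    outcome = rng.normal(size=n_diners)                   # pure-noise "outcome"
    measures = rng.normal(size=(n_diners, n_measures))    # 20 unrelated measures

    # Exploratory step: keep whichever measure happens to correlate best.
    best = max(range(n_measures),
               key=lambda j: abs(stats.pearsonr(measures[:, j], outcome)[0]))

    # "Pressure test" on the SAME data: the winner almost always looks real.
    if stats.pearsonr(measures[:, best], outcome)[1] < 0.05:
        same_data_hits += 1

    # Honest test on FRESH data: back to the nominal 5% false-positive rate.
    new_outcome = rng.normal(size=n_diners)
    new_measures = rng.normal(size=(n_diners, n_measures))
    if stats.pearsonr(new_measures[:, best], new_outcome)[1] < 0.05:
        fresh_data_hits += 1

print(f"'significant' on the same data:  {same_data_hits / n_sims:.0%}")
print(f"'significant' on fresh data:     {fresh_data_hits / n_sims:.0%}")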


Data issues aside, Wansink sounds like a huge tool.
posted by TheWhiteSkull at 6:48 AM on February 12, 2017 [14 favorites]


Oh god, looking at the actual article from the stats checkers, it looks even worse (if that's possible).

Page 5 has them talking about inclusion/exclusion across the four studies, trying to figure out why some subjects were included, and others weren't. It's not pretty.

This leads us to believe that the criterion for being “included in the analysis” of Article 4 was not eating at least 1 piece of pizza, but rather, eating no more than 3 pieces of pizza. This appears to be confirmed by the sum of the number of diners in each condition in Table 3 of Article 4. However, this criterion seems strange since the stated purpose of the study in Article 4 was to examine the extent to which “how much consumers pay for their food influences their perceptions about satiety, feelings of guilt and overeating” (p. 3). It seems rather illogical to exclude from such a study precisely those people who ate the largest amount of pizza. (We also wonder, in passing, why the authors assigned a special status to pizza, while apparently ignoring the effect that consumption of another high-calorie food such as pasta might have on diners’ feelings about overeating. That is, those who ate no or just one pizza slice might have eaten much more of other food types, which seems relevant for a study on overeating.)
posted by damayanti at 6:51 AM on February 12, 2017 [2 favorites]
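
(A quick illustration of why that exclusion rule matters, as a hedged sketch with simulated slice counts rather than the lab's data: truncating the sample at 3 slices removes exactly the diners a study about overeating is supposed to be about, and it also attenuates any real difference between the price conditions.)

# Hedged sketch: simulated slice counts only, not the study's data.
import numpy as np

rng = np.random.default_rng(1)
n = 10_000
# Hypothetical slice counts: the cheap-buffet group eats a bit more on average.
cheap = rng.poisson(2.6, n)
pricey = rng.poisson(2.0, n)

print("Full-sample difference (cheap - pricey):",
      round(cheap.mean() - pricey.mean(), 2))

# Apply the inferred inclusion rule from the critique: 1 to 3 slices only.
kept_cheap = cheap[(cheap >= 1) & (cheap <= 3)]
kept_pricey = pricey[(pricey >= 1) & (pricey <= 3)]
print("After excluding >3-slice diners:        ",
      round(kept_cheap.mean() - kept_pricey.mean(), 2))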


Data issues aside, Wansink sounds like a huge tool.

Yes, I would love to see an interview with this supposedly lazy post-doc who was watching Game of Thrones and going to spin class while only(!) publishing 2-3 papers in one year.
posted by muddgirl at 6:58 AM on February 12, 2017 [8 favorites]


This is everything that is wrong with science today. From the way he talks about his unpaid postdoc's race and gender to the way he derides the paid postdoc who decided to leave academia (with a mentor like him I have no questions about why she left) to the blatant disregard for the scientific process.... This shit is so pervasive, though. I was walking by a PhD stats class the other day and heard my colleague telling students to p-hack. He didn't use those terms, but he told them that if their hypotheses didn't yield significant results that they should "keep looking" for significance so that they could publish something. "You worked hard to get that data," he said. "There has to be something there."

This is why all the smart, talented academics are fleeing in droves. I don't know where they are going, but the academy is no longer the place for scientists who are occasionally wrong about their hypotheses. Publish or perish.
posted by sockermom at 7:05 AM on February 12, 2017 [24 favorites]


This - "This is why all the smart, talented academics are fleeing in droves." - is a bit too strong. Certainly some talented folks are leaving, or not even starting. But many are staying and starting, too. It's these folks who are yelling loudly enough so that established researchers consistently sound defensive, both in social media and in official outlets. The way stats courses are being taught is changing (source: I teach graduate statistics in a psychology department, and talk to others who do so as well). Change is slow and generational, but it's happening.

But yes, Wansink is terrible. There are hints of similar stats problems with earlier research from his lab, which I'd dig up if the wifi connection here at this bagel joint weren't so crummy.
posted by anaphoric at 7:33 AM on February 12, 2017 [9 favorites]


Some fields of research are more corrupted than others, some institutions are worse than others, and even within a university there can be good and bad institutes.
This is really an ugly case, and I'm really disappointed with Cornell.
posted by mumimor at 8:01 AM on February 12, 2017 [2 favorites]


Very not surprised. A lot of these types of studies are clearly garbage even from the methods sections of their peer-reviewed papers; it's easy to believe some are even less sound when you consider the selection bias and cherry-picking that can be done and left out of the methods section. I always assume the worst of even food-related epidemiology unless it's been repeated a thousand times.
posted by midmarch snowman at 8:23 AM on February 12, 2017 [3 favorites]


The last time Brian appeared on the blue, I wrote a post defending the type of research he does. While I was at Cornell, I knew Brian and thought highly of him. However, this time I can't defend him and I won't. He should release the data, correct the data, or retract the studies. If there are errors in his other studies, they should also be corrected or retracted. He won't be the first behavioral researcher with statistical problems, and he should accept the lumps and move on with a focus on being more careful next time. (Also, his posts about the different post-docs are disappointing and don't reflect the person I once knew.)
posted by fremen at 8:36 AM on February 12, 2017 [8 favorites]


P. Z. Myers on Wansink here.
posted by TedW at 8:40 AM on February 12, 2017 [2 favorites]


I feel sorry for the PhD student whose name is on these papers. Even if she knew the papers were fatally flawed - which is far from a given, for a student - she was advised badly, by someone with a lot more experience than her. And the way that Wansink writes about the post-doc who refused the data set makes it clear what kind of pressure she might have faced. In that kind of situation, it's really easy to second-guess yourself.

And he's quite possibly tanked her career with his bad advice. This is what will come up when people search her name, now.

I don't know, I just get the feeling that she was used.
posted by Kutsuwamushi at 8:45 AM on February 12, 2017 [20 favorites]


the wifi connection here at this bagel joint weren't so crummy.

More like crumb-y amirite?

I got nothin'.
posted by howfar at 8:49 AM on February 12, 2017 [2 favorites]


The problem with PZ Myers's take is his fixation on FAILED data.

The data are the data; they are no more 'failed' than a human is 'illegal'. The problem is in what you do with them, not the data themselves.
posted by Dashy at 8:54 AM on February 12, 2017 [3 favorites]


I'm adding this to my "p-values are bad and science should abandon them" pile.
posted by biogeo at 9:00 AM on February 12, 2017


This is disappointing. I've enjoyed the various write-ups of that lab's research that I have read (largely from links here, I think), but now I am wondering which of those were based on solid research and which weren't.
posted by Dip Flash at 9:00 AM on February 12, 2017 [1 favorite]


The data are the data; they are no more 'failed' than a human is 'illegal'. The problem is in what you do with them, not the data themselves.

I agree with the sentiment that "the data are the data." But there is such a thing as a failed experiment/study. Not one that produces a negative result; that is simply a result, one that unfortunately is much harder to publish in today's scientific climate despite being potentially valuable. Rather, some (unfortunately many) studies are simply poorly designed, such that the datasets they produce are fundamentally incapable of providing useful information, positive or negative, to test a hypothesis. The most common reason for this is probably a lack of sufficient statistical power. Data collected from such studies are "failed" in that they are essentially useless: one might as well never have collected them in the first place.

I think moving away from rewarding researchers for producing "significant" p-values, and instead creating an expectation that there will be a discussion of confidence intervals, effect sizes, and model comparisons, shifts the focus away from whether the study produces a "positive" or "negative" result under traditional hypothesis testing, and toward whether the study is "successful" in the sense of providing meaningful information about the possible hypotheses in question. It also encourages researchers to be more explicit about whether they are doing exploratory science or discriminatory science, rather than pretending that even exploratory studies are driven by well-posed hypotheses.
posted by biogeo at 9:22 AM on February 12, 2017 [19 favorites]
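
(For the curious, a compact sketch of both points above, with made-up numbers: an underpowered design rarely detects even a real effect, so the data it produces are close to useless, and a Cohen's d with a confidence interval says far more than a bare p-value. The 0.3 "true effect", the group size of 20, and the normal-approximation interval are all illustrative assumptions, not anyone's actual study.)

# Hedged sketch of statistical power and effect-size reporting. Requires numpy/scipy.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
true_effect, sd, n = 0.3, 1.0, 20     # modest real effect, small groups

# (1) Simulated power: how often does n = 20 per group even detect the effect?
detections = 0
for _ in range(2000):
    a = rng.normal(true_effect, sd, n)
    b = rng.normal(0.0, sd, n)
    if stats.ttest_ind(a, b).pvalue < 0.05:
        detections += 1
print(f"Power at n = {n} per group: ~{detections / 2000:.0%}")   # roughly 15%

# (2) One such study, reported as an effect size plus a 95% interval rather
# than a bare p-value (Cohen's d with a normal-approximation standard error).
a = rng.normal(true_effect, sd, n)
b = rng.normal(0.0, sd, n)
pooled_sd = np.sqrt((a.var(ddof=1) + b.var(ddof=1)) / 2)
d = (a.mean() - b.mean()) / pooled_sd
se_d = np.sqrt(2 / n + d**2 / (4 * n))
print(f"Cohen's d = {d:.2f}, 95% CI [{d - 1.96*se_d:.2f}, {d + 1.96*se_d:.2f}]")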


Chiming in here to agree that bagging on a former postdoc is monstrously shitty. Speaking as someone who never broke out of the postdoc cycle and eventually left academia because of that, I see at least three huge issues here:
(1) You just don't know what will cause someone to leave academia. In my case, I had a good job at a good institution for my second postdoc, but I also had a 4-4 teaching load and nobody else in my field to talk to nearby. I didn't leave because of "Facebook, Twitter, spinning class (??)" - I left because I wasn't able to get research done in that environment.
(2) Even if there is reason to believe that someone left academia because they were too focused on outside life, is that an inherently bad thing? Not everyone wants to spend 12 hours a day teaching, grading, and writing papers. If you do, great! More power to you. I didn't.
(3) The inherent power imbalance in a well-known researcher mocking a former postdoc who left academia is obvious, and it speaks very poorly of this asshole. I hope that anyone who was planning to work with him reads this and makes another decision, because the kind of person who would write a public post dragging a postdoc is not the kind of academic you ever want to associate with.
posted by Frobenius Twist at 11:52 AM on February 12, 2017 [22 favorites]


Wansink's behavior merits a Bonferroni-style punishment: the nth time he recalculates a p value, it should be multiplied by n.
posted by Mapes at 12:23 PM on February 12, 2017 [4 favorites]
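
(In case the joke is opaque: the Bonferroni correction really does work by multiplying each p-value by the number of tests, capped at 1, before declaring anything significant. A toy example with invented p-values:)

# Toy Bonferroni adjustment; the p-values below are invented, not from the papers.
p_values = [0.012, 0.030, 0.049, 0.20]
n_tests = len(p_values)
adjusted = [min(p * n_tests, 1.0) for p in p_values]
print(adjusted)   # -> [0.048, 0.12, 0.196, 0.8]; only the first survives at 0.05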


If anyone here is wondering why female scientists disproportionately leave the field, consider how you might expect a boss like this one, for whom exercise and recreation apparently indicate a lack of career seriousness, to treat an employee who decided to have kids...
posted by en forme de poire at 2:02 PM on February 12, 2017 [13 favorites]


Also, foreign-born grad students and postdocs are particularly susceptible to being exploited by PIs because often they 1. lack the cachet of a North American/European degree, often necessary in order to be taken seriously in the field, and 2. can't easily leave a lab or program because of visa issues.
posted by en forme de poire at 2:09 PM on February 12, 2017 [10 favorites]


I would never treat a postdoc or a graduate student like this, nor would I talk about them this way in public. I expect them to have a life outside of mathematics and I don't expect them to work on a problem they don't think is interesting or important just because I like it.

I've had three postdocs and every one of them now holds a tenure-track job in a Ph.D.-granting institution. I've had 12 Ph.D. students and every one of them is employed, all but one in academic jobs.

You don't have to live this way to succeed in academia.
posted by escabeche at 3:08 PM on February 12, 2017 [10 favorites]


Let's see who he follows on Twitter:
Other people
Peter Thiel
Other people

Ok I'm done
posted by Yowser at 3:57 PM on February 12, 2017


Wansink sounds like a complete ass. On the other hand, can people stop treating transphobic, anti-science "science journalist" Jesse Singal as The Best Of The Web?
posted by adrienneleigh at 5:39 PM on February 12, 2017 [1 favorite]


Jesse Singal has been one of the only journalists to chart the rise of the alt-right. How bad is he really?!

I'm genuinely curious. I follow both you and him on Twitter, for what it's worth.
posted by Yowser at 5:52 PM on February 12, 2017


Yowser,

I am looking for an accessible, non-inside-baseball roundup for you. But the short version is: yes, he's really bad. He has used his considerable platform for the benefit of TERFs and in favor of conversion therapy for trans people, and he blocks basically every trans person on Twitter who tells him that this is hurtful and wrong and unscientific. (And most of the cis people who do the same thing. Unless they're other Important White Dude Journalists. Then he just ignores them.)

When I find the roundup, I will MeMail you so as not to further derail the thread?
posted by adrienneleigh at 6:03 PM on February 12, 2017 [7 favorites]


Wansink has blogged about his recent experiences:

http://www.brianwansink.com/phd-advice/statistical-heartburn-and-long-term-lessons
posted by anaphoric at 10:08 AM on February 13, 2017 [2 favorites]


Wow, anaphoric, that follow-up post almost makes you feel sorry for the guy:
For 20 years I’ve tried to give helpful guidance to grad students based on the mistakes I made when I was in their shoes: lost my PhD funding, wrote a “Mickey Mouse” dissertation (my advisor’s words), and was “a job market embarrassment” to my department.

I’ve also tried to give helpful guidance to junior profs – again, based on the mistakes I had made when I was in their shoes: 11 consecutively rejected papers, lowest teacher ratings in the school, gay rumors of me on page 1 of the MBA newspaper, denied tenure, lived out of suitcases for 3 years as a visiting professor, rejected again for a tenure track gig, and so on.
And now he can add "object lesson in bad science" to the list. I wonder how much the early label of "job market embarrassment" twisted his approach to mentoring students. He seems to have latched onto "It's publish or perish, so publish any way you can" as his mantra.

And now, in the face of yet more criticism about yet another thing he's done, he says he's determined to learn yet another lesson and fix the methods of his lab. (And, he hopes, the methods of the whole field of behavioural research. Ambitious.) The man is a sucker for punishment. Or, as he puts it, "painful lessons".
posted by clawsoon at 11:01 AM on February 13, 2017 [3 favorites]


So it turns out that the featured blog post was his very first in his PhD Advice series. The second post was The Craziest Writing Secret No One's Told You, about a journalist-turned-researcher who succeeds in getting papers written and published in "two months start to finish," in part because he avoids "detailed analyses that were superfluous to the main story line."

I never used to know why scientists looked down on "pop" scientists. This One Weird Blog is opening my eyes.
posted by clawsoon at 12:10 PM on February 13, 2017 [2 favorites]


I can only feel sorry for this guy in the same way I feel sorry for the sympathetic villain in a melodrama. Maybe he had experiences that pushed him towards doing shoddy science - but the way he's treated his post-doc and graduate student is just terrible, and there's no excuse.

Plus, an update on Andrew Gelman's blog recounts a past incident showing that his promises to fix the methods of his lab aren't really worth much.
posted by Kutsuwamushi at 6:53 PM on February 13, 2017 [4 favorites]


I keep thinking about the unselfconscious way that Wansink presented his advice. He sincerely believed that he was doing good science, and that this was good advice to pass on. What's compelling about that is that he seems to have learned everything he knows about doing science from what's rewarded by the system.

To do good science, you: Work your ass off. Keep trying and trying; eventually you can squeeze some insight from your data. Do research that gets attention. Above all, publish. Publish, publish, publish. Anything that leads to more publications is good science.

Do all those things, and you can lead a respected lab at a name university and get appointed as the Executive Director of the USDA's Center for Nutrition Policy and Promotion.

Because Wansink's views have been so purely shaped by the incentives offered by the system - by tenure committees, peer-reviewed journals, and granting agencies - his understanding of what makes good science is, if the system has an intelligence of its own, a clear reflection of the system's understanding of what makes good science. Most individual scientists recognize it as bad science, but the system itself (if you're willing to consider it a sort of superintelligence) thought Wansink was doing wonderful science.
posted by clawsoon at 7:55 AM on February 14, 2017 [4 favorites]


That follow-up post once again reveals more about this academia bro than he realizes. If your advisor tells you that your dissertation is "Mickey Mouse," that's a failure on the part of your advisor, not you. It also means that your advisor is an asshole. Crucially, though, it seems that Wansink has internalized the toxic mentality of his advisor and has decided that failures by grad students and postdocs lie entirely on their shoulders and not on the people advising them. This is a terrible mindset and, once again, indicates that nobody should ever work with this dude.
posted by Frobenius Twist at 8:04 AM on February 14, 2017 [4 favorites]


omg people are calling it "Pizzagate", BYE
posted by en forme de poire at 11:21 AM on February 14, 2017


For those who stop by to browse again, the hits just keep coming:

http://steamtraen.blogspot.fr/2017/02/a-different-set-of-problems-in-article.html

It's an interesting time in history to be a scientist, that's for sure.
posted by anaphoric at 6:16 AM on February 16, 2017 [2 favorites]


For repeat viewers, this just keeps getting worse:

https://www.theguardian.com/science/head-quarters/2017/mar/02/fresh-concerns-raised-over-academic-conduct-of-major-us-nutrition-and-behaviour-lab

Nick Brown has done the lion's share of the detective work here. It's pretty depressing.
posted by anaphoric at 8:42 AM on March 2, 2017

