Can a single conversation change minds on divisive social issues? No.
May 20, 2015 6:43 AM   Subscribe

A field experiment conducted by UCLA Political Science graduate student Michael LaCour made big news (including a This American Life episode) when LaCour and Columbia professor Donald Green published their paper in Science about how a 20-minute conversation with gay canvassers changed many people's minds and led them to support same-sex marriage. It turns out, though, that LaCour made the whole thing up.

Anomalies in the data were discovered by two graduate students at Berkeley. Their report is here (PDF warning).

After being notified of these issues, and after LaCour was unable to explain them away and admitted to falsifying the data, Donald Green submitted a retraction to Science.

The Berkeley graduate students, David Broockman and Joshua Kalla decided to contact the survey firm that LaCour used in the field experiment. When they did, what they found shocked them:

"The survey firm claimed they had no familiarity with the project and that they had never had an employee with the name of the staffer we were asking for. The firm also denied having the capabilities to perform many aspects of the recruitment procedures described in LaCour and Green (2014)"



Writeup from Buzzfeed.

Green's retraction letter to Science:
" I write to request a retraction of the above Science report. Last weekend, two UC Berkeley graduate students (David Broockman, and Josh Kalla) who had been working on a research project patterned after the studies reported in our article brought to my attention a series of irregularities that called into question the integrity of the data we present. They crafted a technical report with the assistance of Yale professor, Peter Aronow, and presented it to me last weekend. The report is attached. I brought their report to the attention of Lynn Vavreck, Professor of Political Science at UCLA and Michael LaCour’s graduate advisor, who confronted him with these allegations on Monday morning, whereupon it was discovered that he on-line survey data that Michael LaCour purported to collect could not be traced to any originating Qualtrics source files. He claimed that he deleted the source file accidentally, but a Qualtrics service representative who examined the account and spoke with UCLA Political Science Department Chair Jeffrey Lewis reported to him that she found no evidence of such a deletion. On Tuesday, Professor Vavreck and Michael LaCour for the contact information of survey respondents so that their participation in the survey could be verified, but he declined to furnish this information. With respect to the implementation of the surveys, Professor Vavreck was informed that, contrary to the description in the Supplemental Information, no cash incentives were offered or paid to respondents, and that, notwithstanding Michael LaCour’s funding acknowledgement in the published report, he told Professor Vavreck that he did not in fact accept or use grant money to conduct surveys for either study, which she independently confirmed with the UCLA Law School and the UCLA Grants Office. Michael LaCour’s failure to produce the raw data coupled with the other concerns noted above undermines the credibility of the findings.

I am deeply embarrassed by this turn of events and apologize to the editors, reviewers, and readers of Science."

Previously.

This American Life transcript.
posted by MisantropicPainforest (196 comments total) 48 users marked this as a favorite
 
Politico link.
posted by MisantropicPainforest at 6:53 AM on May 20, 2015


I don't understand how he thought he wasn't going to be caught, let alone the huge ethical hole needed to completely falsify a study.
posted by demiurge at 6:54 AM on May 20, 2015 [5 favorites]


People are so weird sometimes. If you are going to lie and cheat, why not be smart about it?
posted by Dip Flash at 6:55 AM on May 20, 2015 [2 favorites]


I always wonder why people do that stuff. Part of me feels sorry for the grad student, even though I understand that he made those choices every step of the way - it's not like one little decision among many was bad.

It also makes me wonder just how many other similar papers are fake - what if this guy felt empowered to fake his stuff because there's more of a culture of fakery than we know?
posted by Frowner at 6:56 AM on May 20, 2015 [7 favorites]


I can assure you there is absolutely no culture of fakery.

That's part of the reason this got as far as it did--it simply would not enter someone's mind that their colleague, co-author, student, professor, etc. was a pathological liar.
posted by MisantropicPainforest at 7:00 AM on May 20, 2015 [22 favorites]


There is always the complaint that nobody in academia ever tries to replicate anything, because there's not much reward. Even in this case, will this help these grad students' careers?
posted by vogon_poet at 7:02 AM on May 20, 2015 [1 favorite]


I always wonder how many actually continue to get away with it.
posted by notyou at 7:02 AM on May 20, 2015 [4 favorites]


See Retraction Watch for more juice. "Culture of fakery" is a bit strong, but it's not an isolated incident either.

People respond to rewards... All I can say is... the world rewards fakers. Science and academia are no different; if I were to hazard an educated (!) guess, there is more fakery than people assume, but much less than in most other human enterprises.
posted by lalochezia at 7:02 AM on May 20, 2015 [12 favorites]


is that bad
posted by (Arsenio) Hall and (Warren) Oates at 7:12 AM on May 20, 2015 [1 favorite]


Well, we know that TAL is really good at doing retraction episodes, so I guess I look forward to this one.

It is a shame that all of this fell apart so soon after the airing of the story, obviously a lot of the investigation was happening at the same time that TAL was reporting/prepping the story for air. I wonder what caused TAL to miss this grumbling -- were the investigative efforts kept that far under wraps?
posted by sparklemotion at 7:12 AM on May 20, 2015 [3 favorites]




Massaging the numbers a little, that I can imagine, but concocting a whole study out of thin air?
posted by Flashman at 7:13 AM on May 20, 2015 [1 favorite]


This makes me unspeakably sad & angry, though I'm glad it came out so quickly.

On the bright(?) side, that PDF by Broockman/Kalla/Aronow reporting the irregularities they found in LaCour's data was produced with R Markdown / knitr, which means that the PDF isn't just text; it is itself the output of the analyses it reports. The source code for producing the PDF is available on Broockman's website & linked from within the report. This means readers don't need to blindly trust Broockman/Kalla/Aronow's figures; they can reproduce them themselves. That's a lovely bit of icing.
posted by Westringia F. at 7:19 AM on May 20, 2015 [51 favorites]


There's also been a movement to require study authors to publish their raw data, at least in the life sciences and physical sciences. It seems like the only obstacle there is the fear of getting "scooped" on a finding.

But here there's a mention of a survey firm called Qualtrics. I don't know much about it, but are there paid licensing agreements or something that would make open data more difficult?

On preview: knitr is fantastic, so I guess at least some people are interested in this.
posted by vogon_poet at 7:24 AM on May 20, 2015 [2 favorites]


I thought even the false story (as reported in TAL) was weird. In the fake story, the guy who was being interviewed said that on a scale of "zero to 10, where 10 is definitely vote for gay marriage and zero is definitely vote against, he's a five". So even in the fake story what we saw was that someone squarely on the fence about an issue can be persuaded to one side or the other? It struck me as a strange example to open with if the theme is that people's beliefs can be swayed.
posted by Sangermaine at 7:25 AM on May 20, 2015 [4 favorites]



I can assure you there is absolutely no culture of fakery.


Maybe not, but after this and the illegal Montana experiment also out of Stanford, the university's IRB and general counsel should be having a general "come to Jesus" meeting with the poli sci department there.
posted by Jahaza at 7:26 AM on May 20, 2015 [1 favorite]


Maybe not, but after this and the illegal Montana experiment also out of Stanford, the university's IRB and general counsel should be having a general "come to Jesus" meeting with the poli sci department there.


? The authors of the original study were from Columbia and UCLA; one of the debunkers was from Stanford (with the other two from Berkeley and Yale).
posted by damayanti at 7:30 AM on May 20, 2015 [19 favorites]


Maybe not, but after this and the illegal Montana experiment also out of Stanford, the university's IRB and general counsel should be having a general "come to Jesus" meeting with the poli sci department there.


I see Columbia and UCLA, no Stanford affiliation among the authors.

(One of the debunkers is now at Stanford GSB.)
posted by grobstein at 7:30 AM on May 20, 2015 [3 favorites]


Jinx
posted by grobstein at 7:31 AM on May 20, 2015 [1 favorite]




I can assure you there is absolutely no culture of fakery.

Massaging the numbers a little, that I can imagine, but concocting a whole study out of thin air?

I know at least one person who is accused of faking an entire study. It hasn't been resolved yet, as far as I know, so I don't want to say very much about it. But although I think faking is wrong, and I dislike the person accused on a personal level, I find it hard to be upset/outraged/saddened/angry/disappointed/whatever by what this person may have done. In my field, there isn't a culture of fakery, but there is a culture of promoting positive results that encourages fakery, and some people give in to that pressure.

Pretty much all my studies have ended in something like "Well, that didn't work how I wanted it to, but here's what we can learn from the failure anyway." It's important, theory-building work in a design-research style. From my point of view, every study we can learn something from is a success. But studies like mine (let's learn from what didn't work) are a lot harder to get published than studies that make headlines. Every time I submit something, the reviewers and editors each individually need to be convinced of the value of reporting a "failed" study. From other points of view, a study is only a success if the "treatment" "works."

It's not that I blame the journals for it either, really. Journals care about their reputation for publishing groundbreaking research, too. They have to if they want to be taken seriously. Nothing is more important for the success of a journal than reputation.

But explaining the value of my work over and over again does get tiring. I like what I do, and I don't really want to change it, but I can see how it would be less exhausting to fake a study and get all sorts of recognition, and a fancier job, with it. Some people give in to that temptation. Not the temptation of prestige (although that's part of it), but the temptation to not feel tired all the time. To feel valued instead of dismissed.

If there's any blame to go around, it's on publish or perish culture. As long as prestige, salary, job security, etc, and most importantly self-image depend primarily on publications and grants, fakery is going to happen.
posted by yeolcoatl at 7:32 AM on May 20, 2015 [51 favorites]


? The authors of the original study were from Columbia and UCLA; one of the debunkers was from Stanford (with the other two from Berkeley and Yale).

Ah, yes... my confusion.
posted by Jahaza at 7:33 AM on May 20, 2015


Wow.
posted by latkes at 7:42 AM on May 20, 2015


Not that Green should be held responsible for LaCour's malfeasance but it's surprising to me that someone can "co-author" a paper without having seen or participated in the data analysis.
posted by Matt Oneiros at 7:45 AM on May 20, 2015 [7 favorites]


without having seen or participated in the data analysis

Where did you see this?
posted by MisantropicPainforest at 7:47 AM on May 20, 2015


Holy fuck, wow. This is actually kind of scary. What an ASSHOLE.
posted by a hat out of hell at 7:54 AM on May 20, 2015 [1 favorite]


> There's also been a movement to require study authors to publish their raw data, at least in the life sciences and physical sciences. It seems like the only obstacle there is the fear of getting "scooped" on a finding. But here there's a mention of a survey firm called Qualtrics. I don't know much about it, but are there paid licensing agreements or something that would make open data more difficult?

In fact, open-data is what allowed this to be uncovered so thoroughly in such short order. The LaCour dataset had been openly available on openICPSR, a repository for social & behavioral science data. Broockman/Kalla/Aronow's report states that the data has since been removed, though they have both a screenshot of when it was active and copies of the data they retrieved as part of their source bundle.
posted by Westringia F. at 7:56 AM on May 20, 2015 [10 favorites]


Not that Green should be held responsible for LaCour's malfeasance but it's surprising to me that someone can "co-author" a paper without having seen or participated in the data analysis.

I don't know what happened on this paper. But in my experience, when a grad student and a faculty member coauthor a paper like this, it's common for the senior author to participate in the data analysis in a fairly hands-off way.

The senior collaborator's role often ends up being to play devil's advocate or sort of kick the tires on the project — a little like a miniature version of a dissertation advisor's role. So they look for flaws in the methods that the junior collaborator plans to use, they suggest more appropriate statistical techniques, they push them to consider alternate interpretations and refine their argumentation, and so on. All that stuff counts as ways of contributing to the data analysis in a paper, which means that it's reasonable and not dishonest to make the senior collaborator a coauthor. But it can all be done without the senior author actually doing any of the data-gathering or performing any of the calculations themself. And it means that if the junior author just flat-out lies about the methods they're using or the results of their calculations, the senior author may well not be in a position to notice.
posted by nebulawindphone at 8:01 AM on May 20, 2015 [23 favorites]


This always seemed like a weird claim. Consider all of the closeted right-wing pols who remained unwilling to support gay marriage even after hours and hours of vigorous gay sex.
posted by DirtyOldTown at 8:05 AM on May 20, 2015 [56 favorites]


Kieran Healy:

"As my cousin Ronan Palliser reminded me this morning, this week sees Ireland enter into the final week of campaigning on a constitutional referendum that, if it passes, will make Ireland the first country in the world with a constitutional right to same-sex marriage enacted by popular vote rather than judicial decision. Things are looking good for the Yes campaign at the moment, which is remarkable considering how historically conservative the country has been... As reported in the Irish Times, the results of the LaCour and Green paper were used as a template by those organizing the Yes campaign"
posted by MisantropicPainforest at 8:06 AM on May 20, 2015 [4 favorites]


Also, I am delighted to learn that Retraction Watch is a thing that exists, and sad to see that it appears to be down at the moment.
posted by a hat out of hell at 8:06 AM on May 20, 2015 [1 favorite]


Wasn't there another experiment, with Planned Parenthood on abortion? Were those findings faked too? (Sorry if that's addressed in the first link, it's not working for me)

I have to admit, as someone who works in advocacy, I'm both vindicated and disappointed by these results. Vindicated because Ira Glass was typically smug about why campaigns don't use tactics like this (because it's "easier to run an attack ad" which is such an oversimplification) which was annoying. Disappointed because I had really wanted to try to find a way to try this tactic on my current issue campaign.
posted by lunasol at 8:07 AM on May 20, 2015 [1 favorite]


Fuck this dude.

In the very beginnings of western science one was often required to swear an oath before God and on your honor as a Catholic gentleman attesting to the good faith and honesty of your reports before they would be published, and obviously the specifics have changed but the basic concept remains intact. Science still inherently relies on the idea of a deeply sacred fucking trust, one that absofuckinglutely must be sacrosanct if only for what happens when it's violated.

This piece doesn't really have any bitter human interest angles, but the wreckage that these fuckers always leave behind can get immense, especially when they get this big. When fuckers falsify exciting data like this, people often instantly dedicate their lives to replicating the methods that created it hoping to expand on it, and careers shift instantly to support them. Earnest replication of excitingly falsified data leads to wasted years for nothing of value, nothing groundbreaking, no publications, and only technical skills that remain marketable, which derails innocent promising young academic careers forever, breaks the backs of innocent established careers, destroys marriages, and creates alcoholics. Can you imagine what it must be like to be the kind of young researcher who does the real work of science these days, spending years making the sacrifices inherent to a career in science, living at the ass end of what can often only euphemistically be called a work/life balance, studiously documenting your endless frustration, only to find out that your hell - because real despair cannot exist without hope - was dictated from the beginning by the blind pointless vanity of some asshole right as you're just getting started? Can you imagine being a Principal Investigator responsible for those kinds of fucked young careers? This shit seriously fucks people up in ways that don't always mend.

I once knew a post-doc who, as a graduate student, had his whole lab totally fucked over by attempting to follow up on the fraudulent results of a graduate student in a different lab, which they closely collaborated with. The two PIs were totally devastated, but the graduate students decided they needed some kind of ritual to process the event. They printed out the offending emails and took the two fraudulent papers out of their files, and made some kind of paper mache doll thing to burn it in effigy in the woods. I'm told it was cathartic.
posted by Blasdelb at 8:07 AM on May 20, 2015 [49 favorites]


Wasn't there another experiment, with Planned Parenthood on abortion? Were those findings faked too?

Yes, and probably. It should be noted that Broockman and Aronow, who are authors on the report exposing this, are co-authors with LaCour on other projects.
posted by MisantropicPainforest at 8:09 AM on May 20, 2015 [6 favorites]


People fake studies just often enough to make trouble for the profession. Replication really is academia's nerdy, but perfect best friend who always tells the truth and never steers you wrong, as we have seen here.

Journalism has fact-checkers and I always thought academia could vastly improve on that concept and have their own version of it.

People may not always flat out make things up like this case, but you need someone to keep you just humble and honest enough to prevent the profession's credibility from taking a hit.
posted by Alexandra Kitty at 8:14 AM on May 20, 2015 [1 favorite]




It's interesting that TAL has been bitten by this kind of thing twice (that we know of). I wonder if it will lead to a reappraisal of what kind of stories they pursue or how they source them.
posted by chrchr at 8:21 AM on May 20, 2015 [5 favorites]


Is there a culture of fakery in Academia?

Let me put it like this:

You are 24 years old. You have taken out $150,000+ in non-dischargeable loans to become a PhD.

To become a PhD is, literally, your last chance to join the American middle class. This is literally it. If you don't do this, you will literally be in bondage to your creditors and working for slightly above minimum wage for the rest of your life.

Your future career -- not just the attainment of your PhD, but where you land a job -- is almost totally dependent upon producing new, "exciting" and interesting research that is published in the most well-known journals.

Even publishing good-quality but not-very-exciting research is bad. "We tested the following hypotheses but found them to be incorrect..." is bad. Why? It's not exciting. It's not "fresh". It doesn't pop out on a resume. It generally doesn't make the New York Times.

In this position, you literally have every motivation in the world to cheat. Literally every motivation. You literally have nothing to lose except your credibility: and credibility isn't going to pay those student loans.

People often ask: why do they do it? And why do they do it so poorly? Why was their deception so easily discovered? Aren't they supposed to be smart?

The only answer that makes sense to me is that they are too desperate to really care. Once desperation takes hold, it becomes extraordinarily easy to make objectively bad choices and do dumb things. To not even think of covering your tracks.

Young academics are very smart, very well-motivated people who are literally chasing their last chances to ever make more than minimum wage. They will do, and say, anything to stay ahead of the game. They have nothing left to lose, because the American dream was already lost long ago.
posted by Avenger at 8:22 AM on May 20, 2015 [58 favorites]


This makes me so sad. The story on TAL gave me a lot of hope that the key to people thinking I am a human being with rights was just talking to them. Being personal and vulnerable and human. I'm really sad that seems not to be the case.
posted by stoneweaver at 8:26 AM on May 20, 2015 [13 favorites]


You have taken out $150,000+ in non-dischargable loans to become a PhD.


Well this didn't happen here.

But really, your structural argument, that American academics at top-10 departments are just struggling to join the middle class and, burdened by student loans, therefore resort to gross levels of dishonesty and fraud, is silly, since now LaCour will be out of a job and out of a career.

They will do, and say, anything to stay ahead of the game.

This is a pretty gross thing to say. And how does it predict young academics being the ones to discover this fraud?
posted by MisantropicPainforest at 8:29 AM on May 20, 2015 [10 favorites]


It should be noted that Broockman and Aronow, who are authors on the report exposing this, are co-authors with LaCour on other projects.

Well, that makes their drive to re-analyze this data fucking fascinating. Was it really just this study that made them suspicious? What experiences had they had with LaCour that they figured this would be worth their time? Did he just seem sketchy? Personally or professionally? Wow.
posted by maryr at 8:31 AM on May 20, 2015 [18 favorites]


In fact, open-data is what allowed this to be uncovered so thoroughly in such short order. The LaCour dataset had been openly available on openICPSR, a repository for social & behavioral science data.

Yes, and the smoking gun here is the LaCour dataset's similarity to the CCAP dataset, which is not generally publicly available; they could only access it because another study used part of it and made that data available. LaCour may have used the CCAP data as the basis of his fraudulent data on the assumption that it would be very difficult for anyone to do this kind of re-analysis.
posted by penguinliz at 8:32 AM on May 20, 2015 [1 favorite]


But really, your structural argument, that American academics at top-10 departments are just struggling to join the middle class and burdened by student loans, therefore resort to gross levels of dishonesty and fraud, is silly, since now LaCour will be out of a job and out of a career.

Again, when people are desperate they don't consider these outcomes. If there is a 90% chance that I will be discovered and revealed as a fraud, versus a 10% chance that I won't be discovered, I'll take the 10% chance, because there is a 100% chance that I will lose everything if I don't produce amazing, New York Times headline-level research for my dissertation.
posted by Avenger at 8:33 AM on May 20, 2015 [1 favorite]


This makes me so sad. The story on TAL gave me a lot of hope that the key to people thinking I am a human being with rights was just talking to them. Being personal and vulnerable and human. I'm really sad that seems not to be the case.

We don't actually know that. All we know is that the LaCour study didn't prove that it was the case because the data appear to be faked. It's still possible that a properly performed study would demonstrate a real effect.
posted by jedicus at 8:34 AM on May 20, 2015 [7 favorites]


there is a 100% chance that I will lose everything if I don't produce amazing, New York Times headline-level research for my dissertation.

No, there isn't.
posted by maryr at 8:34 AM on May 20, 2015 [19 favorites]


No, there isn't.
posted by maryr at 8:34 AM on May 20 [+] [!]


Try telling the cheaters that? I don't believe it -- but maybe they do?
posted by Avenger at 8:35 AM on May 20, 2015


If they've been in academia for more than a couple years they should have learned that good science isn't always flashy science and they ought to be pretty damn skeptical of studies that make the front page of the New York Times.
posted by maryr at 8:38 AM on May 20, 2015 [17 favorites]


Again, when people are desperate they don't consider these outcomes.

If that's the case, then why is a guy from a top-10 department with a stellar diss committee, interest from notable people like Don Green, and a handful of publications or polished working papers (excluding the fabricated one) making data up, and the people from top-50 departments--who have a much smaller chance of getting a job--not making shit up?

there is a 100% chance that I will lose everything if I don't produce amazing, New York Times headline-level research for my dissertation.

This isn't even remotely true.
posted by MisantropicPainforest at 8:38 AM on May 20, 2015 [13 favorites]


To be fair, LaCour was adamantly against falsifying the experiment initially. I was surprised at how quickly I was able to convince him to do it.
posted by snofoam at 8:39 AM on May 20, 2015 [32 favorites]


The hell of it is, it feels like this would be true on an anecdotal level. The premise, if I remember from the TAL piece, was that the conversation wouldn't necessarily be a matter of "blah blah blah I shall impart dry facts upon you to change your mind" but was a more personal approach ("Mr. Opponent to Gay Marriage, how do you feel about your own marriage? Good, right? Would you want that for other people? People like me?"). And I do know, and have seen, people change their minds on certain issues when they consider them from an alternate perspective like that. Not all the time, but....once in a while.

Maybe that was what led everyone to accept this as plausible - it maybe wasn't happening on the scale being reported, but everyone reading this could think of at least one instance of "oh yeah, this is kind of like how my Uncle Hank changed his mind about having kids out of wedlock when cousin Marcie actually did that", and thought "yeah, I'll buy this."
posted by EmpressCallipygos at 8:43 AM on May 20, 2015 [3 favorites]


People often ask: why do they do it? And why do they do it so poorly? Why was their deception so easily discovered? Aren't they supposed to be smart?

Or maybe we only catch the ones who cheat poorly.
posted by fullerine at 8:43 AM on May 20, 2015 [7 favorites]


Blasdelb! Good to see you back. And yeah, the collateral damage from things like this is immense. Hopefully the speed with which this was overturned minimizes it.

maryr: Well, that makes their drive to re-analyze this data fucking fascinating. What experiences had they had with LaCour that they figured this would be worth their time?

It seems that they were genuinely impressed and were trying to extend it (pretty much the beginnings of the scenario Blasdelb described) when they discovered it was all a mirage. At that point, and especially if they'd previously published together, it's better to get out ahead of the story with as rigorous a take-down as possible. It's insurance, in a way.

Not to mention they were probably mad as hell.
posted by Westringia F. at 8:43 AM on May 20, 2015 [21 favorites]


Avenger: I do think you're overstating things. It is absolutely possible to get a tenure track job without doing total rockstar newsworthy research. Most people who get academic jobs are not rockstars in that sense. It's just that unless you're a total rockstar, you will not be able to count on getting a job.

And that uncertainty — the lack of a relatively solid career path for non-rockstars — is still really shitty, and still makes people do crazy fucked-up things, including (rarely) this sort of cheating. But there's no need to overstate the situation.
posted by nebulawindphone at 8:43 AM on May 20, 2015 [14 favorites]


Again, when people are desperate they don't consider these outcomes.

If that's the case, then why is a guy from a top-10 department with a stellar diss committee, interest from notable people like Don Green, and a handful of publications or polished working papers (excluding the fabricated one) making data up, and the people from top-50 departments--who have a much smaller chance of getting a job--not making shit up?


Desperation isn't always rational. What seems like a sweet position to you and me may not seem that way to someone else.

there is a 100% chance that I will lose everything if I don't produce amazing, New York Times headline-level research for my dissertation.

This isn't even remotely true.


Of course it isn't true. But if it feels that way to a young academic, it's true to them, isn't it?

I don't understand the hostility here. Academics are inspired to cheat because the system they live in severely punishes not just failure -- but also simple mediocrity. The system punishes those who even do good work, but don't make it "exciting" enough.

Also, please don't thread-sit your own thread.
posted by Avenger at 8:44 AM on May 20, 2015 [2 favorites]


Retraction Watch has now crashed due to traffic.
posted by blucevalo at 8:45 AM on May 20, 2015


I've read enough anon askmes from desperate grad students/post-docs to understand the tunnel vision that helps this kind of fakery along, but jesus fucking christ people who fake this kind of thing just make me all spluttery.
posted by rtha at 8:48 AM on May 20, 2015 [6 favorites]


Avenger, if your argument was about how things *feel* to a young academic, you should have said that. Of course it feels like the weight of the world is on your shoulders. No one is arguing with that. But you stated these things as if they were facts, not feelings. That's why they are being repeatedly refuted.
posted by maryr at 8:53 AM on May 20, 2015 [5 favorites]


The hell of it is, it feels like this would be true on an anecdotal level.

Right! Contrary to the title of this post, and as in the case of the previous TAL retraction, the story doesn't really change. It's just that the specific facts supporting the story are false. There are abuses at electronics factories in China. There's good evidence for that. The specific examples were made up. In this case, maybe you can change someone's mind by having the kind of conversation described in the study. There's still data to be gathered here.
posted by chrchr at 8:53 AM on May 20, 2015


Avenger, if your argument was about how things *feel* to a young academic, you should have said that. Of course it feels like the weight of the world is on your shoulders. No one is arguing with that. But you stated these things as if they were facts, not feelings. That's why they are being repeatedly refuted.
posted by maryr at 8:53 AM on May 20


I apologize for not being clearer.
posted by Avenger at 8:55 AM on May 20, 2015 [6 favorites]


This makes me so sad. The story on TAL gave me a lot of hope that the key to people thinking I am a human being with rights was just talking to them. Being personal and vulnerable and human. I'm really sad that seems not to be the case.

The idea that a 20-minute conversation would change anyone's mind about anything, especially something as polarizing and politicized as gay marriage, is... naive. BUT, there's ample evidence that minds have been changed gradually (to neutral if not positive) by interacting with gay people (especially friends or family) over time, coming to empathize with them, eventually seeing them as fellow human beings worthy of equal legal protections.

Of course some people will never come around, whether because of a sincerely held if superficially ridiculous moral/ethical justification, or because they're just insecure homophobic assholes afraid of being surprise-speared by roving bands of gays. But really, who needs their approval?
posted by echocollate at 9:02 AM on May 20, 2015 [1 favorite]


"Remaining Uncertainties ...

The data for the abortion study reported at
http://www.cis.ethz.ch/content/dam/ethz/special-interest/
gess/cis/cis-dam/CIS_DAM_2015/Colloquium/Papers/LaCour_2015.pdf
in LaCour (2015) is not currently publicly available"

Note to moderators: that is a direct quote. Please don't mung the included link.
posted by hank at 9:09 AM on May 20, 2015 [1 favorite]



We don't actually know that. All we know is that the LaCour study didn't prove it, because the data appear to be faked. It's still possible that a properly performed study would demonstrate a real effect.


The fact it could not be replicated points to a more definitive answer.

And that's not a bad thing. I don't want to be swayed by arguments. I want to be moved by getting to know people on a long-term basis. Science is about testing theories over the long term through experiments, not by appealing to authority or argument.

Isolation and sticking to your designated philosophical in-group warp and distort reality, and repairing that damage takes years -- it takes a long-term commitment to open perspectives. Emotional investment does more than a quick intellectual debate.

I am always wary of the instant, and I would have doubted the veracity of this study because it goes against human nature. Thoughts and feelings are intricate and complex and take years to form -- the whole "just add water" theory is an unintended insult to human nature.

If people could just stop being afraid of everyone and everything, the world could be paradise, but no quick fixes in that regard.
posted by Alexandra Kitty at 9:14 AM on May 20, 2015 [2 favorites]


There's also been a movement to require study authors to publish their raw data, at least in the life sciences and physical sciences. It seems like the only obstacle there is the fear of getting "scooped" on a finding.

When you're studying people, like in this case, there's also a real privacy concern with sharing raw data. It's surprisingly hard to anonymize data sets, especially as other data sets become available in the future. (E.g., Netflix's movie-review dataset was anonymous until researchers started matching it up with IMDb reviews.)

On the other hand, I've heard anecdotally that privacy can be a get-out-of-jail-free card to avoid data-sharing requirements. Once you've invested time to build a data set, you want to be the one to mine it for further papers, so it's tempting just to say "sorry, privacy!" if that's plausible. (No idea how prevalent that is.)

Probably you end up needing an independent panel like an IRB to balance the needs of privacy and transparency. Not sure how far publications or institutions have gone with that idea so far.
posted by jhc at 9:15 AM on May 20, 2015 [1 favorite]


I don't understand the hostility here. Academics are inspired to cheat because the system they live in severely punishes not just failure -- but also simple mediocrity. The system punishes those who even do good work, but don't make it "exciting" enough.

The hostility is because this is not a universal truth, though you have vigorously presented it as the experience of young academics; nor is academia markedly different from any number of other professions in which exciting success is the only way to get ahead.

A bar I used to frequent years ago was its owner's life's work - all his savings, his house leveraged, all that. He was found to be watering down his drinks because he wasn't making much money; the bar business is competitive.

The response from his clientele put him out of business. On a human level, I empathize with how much he lost and how hard the bar business is, but by no means am I going to sit here and say it's the system that made him choose to fuck people over. Lots don't.
posted by buoys in the hood at 9:19 AM on May 20, 2015 [3 favorites]


I have voted on more searches than I care to think about (probably more than 20?) in two different PhD-granting departments that are not in the top tier with the departments we're talking about here (Stanford, Stanford GSB, UCLA, Columbia) and are in relatively undesirable locations to most people.

I think you have the psychology of this a little wrong. The students in these top-ten-ish departments are already hot shots to some degree; that's how they got in. They should know (and in my limited experience tend to know) that students at top-ten departments who do research typical of top-ten departments are very likely to get hired by a top-50-but-not-top-ten department. These are grad students who are confident enough in their own ability to get hired that they often simply don't apply to my department(s)*, which is pretty hard to reconcile with being so desperate as to fake data. So while obviously it can happen, because it did, I think it takes a more unusually-broken person than you're describing.

*And this is a pretty reasonable thing for them to do, honestly.
posted by ROU_Xenophobe at 9:29 AM on May 20, 2015 [12 favorites]


Probably you end up needing an independent panel like an IRB to balance the needs of privacy and transparency. Not sure how far publications or institutions have gone with that idea so far.


So, for things like NSF grants (which cover the social sciences), there's been a requirement for comprehensive data management plans in the grant applications. E.g., from the linguistics requirements, emphasis mine:

All proposals must include as a supplementary document a plan for data management and sharing the products of research. The data-management plan to be submitted with a proposal must be no longer than two (2) pages in length and must be included as a supplementary document. In preparing their data-management plans, applicants should address all five of the points specified in Chapter II, Section C.2.j of the Grant Proposal Guide and the comparable section of the NSF Grants.gov Application Guide. Applicants are especially encouraged to specify how they intend to make data, software, and other products of the research readily available to potential users through institutionally based archives, repositories, and/or distribution networks so that the products may be easily accessed by others over long time periods.


And, as a study design typically includes "...and here's what we'll do with the data when we're done", IRBs are set up to review plans for making data publicly, or at least, semi-publicly, accessible, and I know all my consent forms currently have a "I consent to have my data shared with other researchers; it will be made anonymous to the extent possible" line in them.
posted by damayanti at 9:31 AM on May 20, 2015


We use Qualtrics at work for surveys and whatnot. It's pretty easy to make your data public with their tools.

I'm trying to think of something where the price has gone up as much as it has for secondary education. Other than academic journals, which are expensive beyond the bounds of reason. But universities just stop subscribing to them instead of passing the costs on to students. (That was always the saddest part of the year when I worked in a university library, what journals do we have to cut THIS year?)
posted by fifteen schnitzengruben is my limit at 9:32 AM on May 20, 2015


The idea that a 20-minute conversation would change anyone's mind about anything, especially something as polarizing and politicized as gay marriage, is... naive.

This is true in the abstract, but from my experience in persuasion campaigns, the majority of people aren't as firm in their position as they may claim. In today's information climate, most people are aware of the issues, they just have never given themselves the space to really think about it. Direct conversations are effective tools to get people to question themselves and consider new ideas. My experience is of course anecdotal, and I understand this study is not necessarily looking at the same things, but I think the past several successful same-sex marriage campaigns are, at least on the surface, clear evidence of this strategy's potency.
posted by Think_Long at 9:34 AM on May 20, 2015 [1 favorite]


Probably you end up needing an independent panel like an IRB to balance the needs of privacy and transparency. Not sure how far publications or institutions have gone with that idea so far.

That's not what the IRB is or does, though—the IRB is there to keep the university from getting sued, nothing more, and certainly not to vet research methods, design, or replication. My most recent social-science IRB reviewer, for example, has no research experience to speak of, is under 30, and holds no higher degree than a B.A. in finance. But this person is definitely "qualified," from the university's standpoint, to judge studies that s/he does not have the proper research training to understand.
posted by migrantology at 9:40 AM on May 20, 2015


I think my favorite (not really my favorite) part is how undergrad-y all his attempts to get away with it have been. LaCour: "Qualtrics lost the data! Or I deleted it! It was a glitch!" Qualtrics: Nope. LaCour: I have it all in my files! Somewhere! Probably! I will be vindicated!

I can't tell you how many first year college students, when backed into a corner, promise that if they just had time to EXPLAIN then you would UNDERSTAND and once I find the PAPER...

(My favorites are the ones in online classes, who try to insist that it is an error in the software that resulted in their work failing to come through. Fun fact: the software tells us you haven't logged in since September, not even to download the lectures, let alone the assignments.)

LaCour will show you all! (No he won't.)
posted by a fiendish thingy at 9:43 AM on May 20, 2015 [20 favorites]


I don't understand the hostility here. Academics are inspired to cheat because the system they live in severely punishes not just failure -- but also simple mediocrity. The system punishes those who even do good work, but don't make it "exciting" enough.

I think you're misunderstanding what professional dorks like me find exciting and, frankly, the level of competition for political science jobs.
posted by ROU_Xenophobe at 9:54 AM on May 20, 2015 [4 favorites]


And another maddening repercussion -- RetractionWatch says "According to his website, LaCour will become an assistant professor at Princeton University in July. [Update: As of 8 a.m. Eastern on 5/20/15, that mention had been removed from his site, but it is still available on the Google cache version.]" which means that someone was passed over for a job because of him.
posted by Westringia F. at 10:01 AM on May 20, 2015 [5 favorites]


I'm trying to think of something where the price has gone up as much as it has for secondary education.

Nit: secondary education is high school, as distinct from primary education. Colleges and universities are tertiary.

Actual substantive comment: The sticker price for university education has shot up dramatically. The actual prices students pay have been rising, but much less dramatically.
posted by ROU_Xenophobe at 10:08 AM on May 20, 2015


And if you pursue post-doctoral studies in mathematics that's quaternionary education.
posted by cortex at 10:10 AM on May 20, 2015 [11 favorites]


Don Green is on LaCour's dissertation committee. I think it is safe to say LaCour will not be granted a PhD.
posted by MisantropicPainforest at 10:11 AM on May 20, 2015 [8 favorites]


Ira Glass responds. TAL has retracted their story.
posted by shepard at 10:15 AM on May 20, 2015 [1 favorite]


Replication or it didn't happen.
posted by ethansr at 10:17 AM on May 20, 2015 [4 favorites]


I'm just glad I can go back to never talking to people I disagree with again.

(This whole thing actually makes me very sad, so I'm joking because that's what I do.)
posted by MCMikeNamara at 10:18 AM on May 20, 2015 [8 favorites]


It's simpler just to never speak to anyone ever.
posted by ROU_Xenophobe at 10:20 AM on May 20, 2015 [2 favorites]


Blasdelb! Good to see you back. And yeah, the collateral damage from things like this is immense. Hopefully the speed with which this was overturned minimizes it.

A close friend of mine had his PhD undermined, as it relied on techniques that were later shown to be based on falsified research. It derailed his career.

That said, I probably could have falsified my history research data, and no one would be the wiser. The parchments were almost too dirty to read, covered in coal dust - no one but me would be stupid enough to try to use them :)
posted by jb at 10:21 AM on May 20, 2015 [2 favorites]


TAL has retracted their story.

I don't see that this is the case, and really, I don't see how TAL has anything to "retract". They reported on the published research and that reporting accurately conveyed what was published. Now they're reporting (also accurately) that this research has been retracted. This is very different than the Apple case, where the TAL correspondent was falsifying what he represented as original investigative journalism.
posted by mr_roboto at 10:30 AM on May 20, 2015 [6 favorites]


This makes me so sad. The story on TAL gave me a lot of hope that the key to people thinking I am a human being with rights was just talking to them.

Both of my parents flipped out when my sister got engaged to another woman. My father got ahold of a pamphlet of some kind, with the "born this way" message, and did a complete 180. My mother's objections were much more intransigent.
posted by StickyCarpet at 10:36 AM on May 20, 2015 [2 favorites]


These are grad students who are confident enough in their own ability to get hired that they often simply don't apply to my department(s)*, which is pretty hard to reconcile with being so desperate as to fake data.

My experience is that sloppy work is not so much the result of desperation as of confidence - confidence so extreme it's past chutzpah. In my field (history), it's not faking data that tends to be an issue, but drawing conclusions from inadequate data - just a few sources, ignoring the details - and making big, sweeping conclusions at that.

Sadly, this is being rewarded. Those who are being hired are not always the most careful, the ones with the impressive techniques (though occasionally they slip through), but the ones who make big, news-headline-grabbing claims -- and the more prestigious the university, the worse the situation seems to be. One of the worst offenders is Niall Ferguson, professor at Harvard, who maybe did some solid research early in his career. But now he consistently bases his work on misinterpreting select sources; when people who know the subject or the sources better than he does question his work, he just dismisses them.

I've even had a full professor at an Ivy League university make historical claims that I, as a student sitting in his lecture, knew were false. He was a 20th-century specialist whose understanding of pre-1870 history was practically non-existent, but he refused to admit it. That didn't stop him publishing books with claims about the past that were wrong. One claim was that the Armenian genocide was the first genocide ever; not only do the ghosts of the Tasmanians want a word with him, but so do the people slaughtered in ~80 BCE (in Dalmatia, I believe) for being Latin speakers - killed by the anti-Roman king of the time for basically the same reason as the Armenians: they were ethnically connected to an enemy power. The other claim was that WWI was the first European "total war" - maybe true for the British (though the Civil War could be argued about), but certainly not for the French, who had the levée en masse during the Revolution.
posted by jb at 10:39 AM on May 20, 2015 [4 favorites]


Researchers who create studies out of whole cloth don't keep me up at night, since they will be found out eventually. The "nudgers" are the ones that worry me. Take a little more background in your gel quantitation, treat the control mice a little nicer, and voila, your systematic noise is now a positive result, and you have all the backing data to support your claim. If it's not reproducible, just chalk it up to strain differences or honest errors.

A biotech company where a friend worked was acquired by a larger entity who diligently went through the process of re-hiring all the employees, and subjected them to background checks. A not-insignificant number of the employees were found to have faked their PhD credentials, since they came from countries where verifying credentials was moderately difficult or expensive for an unskilled HR person.

If nobody is looking under the rocks, the worms are happy.
posted by benzenedream at 10:43 AM on May 20, 2015 [16 favorites]


The fact it could not be replicated points to a more definitive answer.

The paper was retracted because the original data was faked and no actual survey happened. I don't think anyone has actually tried to run the survey for real.
posted by BungaDunga at 10:46 AM on May 20, 2015


Ira Glass responds. TAL has retracted their story.

The linked post doesn't read as a retraction to me. It seems like just a report about the retraction of the underlying study, and some information about the original reporting that they did for the story.
posted by sparklemotion at 10:50 AM on May 20, 2015


Goddamnit this sucks. I thought the idea seemed kind of hinky when I first heard it. Something about it didn't jibe with my experience, years and years of it, in political canvassing door-to-door, and in working with advocacy groups who would train people in using their experience as LGBT folk to talk to legislators.

But I didn't pay much attention to it because I was working on other issues.

But I know that LGBT advocacy groups DID pay a lot of attention to it, and therefore probably wasted their time and energy trying to implement the "research" findings into strategy.
posted by Cookiebastard at 10:51 AM on May 20, 2015 [1 favorite]


I work in a research compliance office. I don't do misconduct investigations myself, but we talk them over at staff meetings and such. We're pretty much in unanimous agreement with benzenedream - it's not the people who get caught who ruin our sleep. It's wondering what else is going on out there that we don't ever hear about, either because people are much better at faking data (because come on, how would you not put outliers in your fake data?) or because of those nudgers.
posted by Stacey at 10:52 AM on May 20, 2015 [2 favorites]


You have taken out $150,000+ in non-dischargable loans to become a PhD.
I was going to repeat the typical "If you have to pay, they don't really want you there" advice, but a little research suggests that advice only applies to engineering and physical sciences grad students.

The social sciences seem to be even worse in this regard than the humanities, with a third of PhDs having accrued more than $30K in grad school debt alone.
posted by roystgnr at 11:04 AM on May 20, 2015 [2 favorites]


I'm trying to think of something where the price has gone up as much as it has for secondary education.

If you are talking about a price mismatch between the sunk costs of entry and expected returns, then a whole whack of stuff applies. Nearly all entrepreneurs in primary industries (agriculture, forestry, fisheries, mining) face increased capital and operating costs to get started and decreasing margins and end up getting eaten alive by big conglomerates.

Depending on whose sources you use, about half of small businesses are closed after four years. With a lot of them (particularly non-knowledge-based businesses) go a lot of savings, houses, credit histories, bankruptcies, marriages, etc. The return on those things isn't minimum wage - it can often be negative to one's net worth over time, particularly if you give up a salary to do it.

We don't see those things as quite the same as investments in education and the pitiful job market that Ph.D's face, but the cost to get in v. return of nearly everything that isn't working for a mega corporation, extraction of oil/shale/etc. (in that the cost to get in as an employee is very low), software engineering, or some specialized health careers is pretty dismal for everyone right now.
posted by buoys in the hood at 11:12 AM on May 20, 2015 [2 favorites]


Matt Oneiros: Not that Green should be held responsible for LaCour's malfeasance but it's surprising to me that someone can "co-author" a paper without having seen or participated in the data analysis.

It's not even safe to assume all of the co-authors have even read a paper. There's political inclusions, team members who aren't really involved in that particular project, authorities in the field tossed in to make the paper look better, etc.
posted by Mitrovarr at 11:37 AM on May 20, 2015 [2 favorites]


I was going to repeat the typical "If you have to pay, they don't really want you there" advice, but a little research suggests that advice only applies to engineering and physical sciences grad students.

It's mostly true in PhD programs in political science, though MA students usually pay.

It's not even safe to assume all of the co-authors have even read a paper.

This would not normally be true in political science; we don't have big labs where everyone's a coauthor. Most publications are one or two authors. In a circumstance like this, where this is LaCour's dissertation research and Green is on the committee, people would reasonably expect him to have made actual contributions to the article in addition to being familiar with the research as an adviser.
posted by ROU_Xenophobe at 11:44 AM on May 20, 2015


Also, please don't thread-sit your own thread.

Also, please don't try to backseat-mod a thread as a way of silencing people who disagree with you.
posted by en forme de poire at 11:58 AM on May 20, 2015 [7 favorites]


[Just as a general thing since it's sort of chaining along in here at this point: folks all around please consider contacting the mods about tossing in a mod directive in a thread if you feel like it needs one, instead of creating a situation where non-mods argue with other users and each other about mod-type commentary.]
posted by cortex (staff) at 12:04 PM on May 20, 2015


So it turns out authorship on an academic paper is sometimes something rather different than I thought!
posted by Matt Oneiros at 12:15 PM on May 20, 2015


The story on TAL gave me a lot of hope that the key to people thinking I am a human being with rights was just talking to them. Being personal and vulnerable and human. I'm really sad that seems not to be the case.

Actually, that's the worst part of this fakery - the conclusion may actually wind up being true, even if the data is faked. But no one is going to do this again because of the taint on it.
posted by corb at 12:41 PM on May 20, 2015 [3 favorites]


I do agree with ROU_Xenophobe that the motivations of people who commit brazen fabrications like this, frustratingly, seem to be more opaque than just being driven purely by stress and pressure. If you take a look at this article about Diederik Stapel (priming), career pressure doesn't even appear to enter into his description of deciding to fabricate data for the first time -- rather, it's frustration that the "real" data doesn't fit his own hypothesis:
The experiment — and others like it — didn’t give Stapel the desired results, he said. He had the choice of abandoning the work or redoing the experiment. But he had already spent a lot of time on the research and was convinced his hypothesis was valid. “I said — you know what, I am going to create the data set,” he told me.
Presented there, it seems more like a type of intellectual narcissism -- presuming to "know better" than the actual data -- as opposed to feeling pressure from a supervisor or funding agency to get a particular result, as in the canonical "sharpie a mouse in the elevator" story.

In the case of Eric Poehlman (hormone replacement therapy), he claims initially to have been motivated by the responsibility of continuing to staff his research group, but also admits that what he really found fulfilling was the feeling of being a big shot:
Having all those people to pay, of course, was not just a burden; it was proof of his success. Poehlman wasn’t after the grant money for his own material benefit. He was seduced by a different kind of status. “Certainly there is this point of having a grant because it raises your esteem and raises your standing vis-à-vis your colleagues,” he told the court.
I don't doubt that stress and competitiveness play a part in fabrication and falsification, and again, I don't know whether people have looked at this really systematically (I'd love to read more about it if so) -- but I suspect that those stressors may be more likely to produce fraud (as opposed to, e.g., burn-out) when combined with a particular tendency towards grandiosity.
posted by en forme de poire at 12:43 PM on May 20, 2015 [4 favorites]


I know, personally, more than a handful of people who are doing follow-up studies. It's not tainted.
posted by MisantropicPainforest at 12:43 PM on May 20, 2015 [5 favorites]


It's not even safe to assume all of the co-authors have even read a paper.

The only place I appear in the psych literature is as co-author. I didn't write anything for the paper, and I am still not sure that the premise of the paper is entirely coherent. I was just kind of frustrated at the whole project and ended up poking them with a lot of questions and I guess they found the process of having to answer the questions useful.
posted by Jpfed at 12:48 PM on May 20, 2015 [2 favorites]


There's political inclusions, team members who aren't really involved in that particular project, authorities in the field tossed in to make the paper look better, etc.

Which is, in and of itself, scientific fraud.
Conversely, research misconduct is not limited to NOT listing authorship, but also includes the conferring authorship on those that have not made substantial contributions to the research.[18][19] This is done by senior researchers who muscle their way onto the papers of inexperienced junior researchers[20] as well as others that stack authorship in an effort to guarantee publication. This is much harder to prove due to a lack of consistency in defining "authorship" or "substantial contribution".[21][22][23]
posted by Mental Wimp at 1:16 PM on May 20, 2015 [2 favorites]


Hurray for forensic data analysis and de-discovery!

Way back when (mid to late '70s) I took a clinical trials course and the lecturer described detecting fraudulent data confabulation at one center out of the twelve in the trial. Each center was frequently and thoroughly audited, but nothing in the monitoring reports indicated a problem. However, the biostatistician responsible for the study, who was the lecturer, frequently analyzed the accumulating data to monitor and assure quality. Being a good statistician, he not only looked at univariate summaries by center (and other subgroupings), but also looked at relationships between variables by center. He noticed that there were characteristic associations between values at all but one center, where every variable seemed to be uncorrelated. As the events were unwound, it turned out that this center had randomly generated their data and to cover their tracks had gone to the trouble of filling out forms using different pens, and even showing strikeout corrections occasionally with dates. This was discovered when the data coordinating center asked to see the names and addresses of the subjects and found that they didn't exist. Fakers are almost never clever enough to get by a competent statistician.
posted by Mental Wimp at 1:31 PM on May 20, 2015 [3 favorites]


Fakers are almost never clever enough to get by a competent statistician.

I'm not sure I'd go that far (what happens when your faker is a competent statistician?), but that's an awesome story, and careful data analysis definitely has the power to uncover at least some cases of blatant fraud. I've mentioned the Anil Potti case before on MeFi. Unfortunately, as grouse mentioned here, Baggerly & Coombes' damning analysis on its own wasn't enough to completely halt clinical trials based on Potti's fraudulent work, until Potti was found to have lied about being a Rhodes Scholar (!!) and the whole thing fell apart.
posted by en forme de poire at 1:38 PM on May 20, 2015 [1 favorite]


Mad scientists are always more fun in comic books.
posted by Galaxor Nebulon at 2:13 PM on May 20, 2015 [1 favorite]


In the case of Eric Poehlman (hormone replacement therapy),...

That's a remarkable story. He changed the data to conform to what everyone already believed. The irony is that the actual data refuted what everyone believed, and had he pursued it, it would have made him even more successful and famous. It's sad how some warping of personality can lead to self-destructive behavior.
posted by Mental Wimp at 2:27 PM on May 20, 2015 [5 favorites]


I'm sure the numerically competent fraudsters have already figured out that you should make sure your fake data follows Benford's Law.
posted by benzenedream at 2:29 PM on May 20, 2015 [2 favorites]
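
A rough sketch of why that works (my own toy example, with invented data, not anything from the thread): quantities spanning several orders of magnitude tend to follow Benford's first-digit distribution, while numbers faked "uniformly" spread their leading digits evenly and stand out immediately.

```python
import math
import random

def benford_expected(d):
    # Benford's Law: P(first significant digit = d) = log10(1 + 1/d)
    return math.log10(1 + 1 / d)

def first_digit(x):
    # First significant digit via scientific notation, e.g. 0.0123 -> 1
    return int(f"{abs(x):e}"[0])

def digit_freqs(values):
    counts = [0] * 10
    for v in values:
        counts[first_digit(v)] += 1
    return [c / len(values) for c in counts]

random.seed(0)
N = 50_000
# Log-uniform data spans five orders of magnitude and follows Benford.
natural = [10 ** random.uniform(0, 5) for _ in range(N)]
# Naively faked data: uniform on [1, 10), so every first digit is equally likely.
faked = [random.uniform(1, 10) for _ in range(N)]

nat_f = digit_freqs(natural)
fake_f = digit_freqs(faked)
print(f"P(d=1): Benford {benford_expected(1):.3f}, "
      f"natural {nat_f[1]:.3f}, faked {fake_f[1]:.3f}")
```

A chi-squared test against the Benford proportions would formalize the comparison; uniform leading digits are a red flag, not proof of fraud (plenty of honest data, like adult heights, also violates Benford).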


You are 24 years old. You have taken out $150,000+ in non-dischargable loans to become a PhD.

I was not aware of this…? In the American system, the top programs typically grant Ph.D. candidates a fellowship, generally meaning full tuition plus enough to subsist on during your toils. It's only master's programs that students are asked to pay for; the difference lies in the (sociopolitical) nature of the two degrees.
posted by polymodus at 2:36 PM on May 20, 2015


I was not aware of this…?

LaCour didn't take out $150k to attend UCLA. I don't know where that misinformation came from.
posted by MisantropicPainforest at 2:55 PM on May 20, 2015 [3 favorites]


"In the American system, the top programs typically grant Ph.D. candidates a fellowship, generally meaning full tuition plus enough to subsist on during your toils."


I agree that it's unlikely this guy took out a large amount of loans, but true fully-funded fellowships are far from the common standard in top US PhD programs, at least outside the hard sciences and engineering. It's much more likely that you get teaching and research assistantship jobs, which may or may not be guaranteed for the expected term of your program. (A friend of mine completed her PhD, early not late, in a top-ranked arts program at the same institution as this guy, and beyond the first couple of years she had to apply competitively for a limited pool of assistantship jobs or one-year fellowships; funding wasn't guaranteed even though she was a star student.) Sometimes these assistantship jobs are called "fellowships" anyway, but that's just misleading marketing.
posted by Bwithh at 3:19 PM on May 20, 2015


Mental Wimp, totally! One of the most tragic and ironic parts of that story.
posted by en forme de poire at 3:28 PM on May 20, 2015


I don't know where that misinformation came from.

That data was invented to support the commenter's hypothesis.
posted by ryanrs at 3:56 PM on May 20, 2015 [7 favorites]


That data was invented nudged to support the commenter's hypothesis.
posted by maxwelton at 4:05 PM on May 20, 2015 [1 favorite]


I am surprised a Ph.D. student could come up with the funds to pay the subjects for this study. Based on the supplementary materials for the Science paper, there were "9,507 survey panelists who completed a baseline survey":
In order to encourage participation in the baseline survey, respondents were paid $10 upon initial enrollment. In an effort to impanel multiple voters per household, individuals were offered $2 (per referral) to refer their friends and family to participate in the survey panel. In order to encourage participation in follow up surveys, respondents were paid $5 per follow-up survey.
My interpretation of how subjects were compensated is that 9,507 people were paid $10 each, so that's $95,070 just for the people who completed the baseline survey. Now, there were 7 survey waves, for which people were paid $5 each. The number of respondents for each of the 7 waves were respectively 11948, 10597, 10764, 10843, 8339, 9013, and 8088. So that's 69,592 payments of $5 each, or $347,960.
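A quick sketch of that arithmetic, using the counts quoted above (a sanity check on the comment's figures, not data from the paper itself):

```python
# Subject-payment totals, per the interpretation in the comment above:
# $10 per baseline completion, $5 per follow-up wave completion.
baseline_n = 9507
baseline_pay = 10
wave_ns = [11948, 10597, 10764, 10843, 8339, 9013, 8088]
followup_pay = 5

baseline_total = baseline_n * baseline_pay      # 9,507 x $10  = $95,070
followup_total = sum(wave_ns) * followup_pay    # 69,592 x $5  = $347,960
print(baseline_total, sum(wave_ns), followup_total)
```

That puts subject payments alone above $440,000 before any survey-firm fees, which is the scale of budget being questioned here.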

Plus the survey firm that apparently handled subject recruitment and administered the surveys would also have to have been paid.

Unless my interpretation of how subjects were paid is incorrect, we're talking a budget for the study that even established professors would not find easy to fund through major grants. I would think this would have generated some concern from his dissertation committee as to how he planned to fund the study, and then how he managed to fund it.
posted by needled at 5:03 PM on May 20, 2015 [7 favorites]


Needled, one of the links above (can't remember which) said that it turned out he hadn't even spent the grant he had received to pay people. So it sounds like he did get grant funding. And $95,000 is not even a major grant, just a bit higher than most small grants. Most major grants are multiple hundreds of thousands.

Probably you end up needing an independent panel like an IRB to balance the needs of privacy and transparency. Not sure how far publications or institutions have gone with that idea so far.

That's not what the IRB is or does, though—the IRB is to keep the university from getting sued, nothing more, and certainly not about research methods, design or replication.

I would disagree with this. At least it must vary from university to university. In my experience IRBs are absolutely concerned with privacy (probably because of the concern about getting sued, of course), but they frequently ask questions about the bits of the research methods or design that might cause privacy issues. A common way to push back against unreasonable IRB issues relating to this is to argue that your data need to be transparent (due to requirements of journals, funding bodies, or just academic integrity), so I don't find it implausible that universities could extend the mandate of IRBs to care about balancing these two things right from the start. Especially if they frame it as protecting the university against fraud.
posted by lollusc at 6:25 PM on May 20, 2015 [3 favorites]


I am surprised a Ph.D. student could come up with the funds to pay the subjects for this study

This was my shock. PIs, advisors, and senior collaborators might not know the ins and outs of the research methods, but goddamn do they know which grant is paying for what.
posted by supercres at 6:26 PM on May 20, 2015 [7 favorites]


On CBC's As It Happens tonight Donald Green addressed the funding aspect. Apparently he lied about some big external funder.

I know co-authoring doesn't mean co-researching and co-writing, but putting your name to something you didn't probe enough to realize it was completely made-up? I don't think it's terrible that Green is getting a little blowback on this. I mean, if it had been true, he'd be sharing the glory though his participation would not have been any more significant.
posted by looli at 6:26 PM on May 20, 2015 [1 favorite]


Oh wait, I see the rest of the calculation adds up to 400k+, sorry, yes, that is unrealistically high for a phd student study. Weird. There must be some mistake in terms of how his payment of participants is reported.

As for the comments above about phd student loans, I think it is common even for fully funded students to go into debt, because "full funding" often equates to a stipend that barely covers minimum-wage type expenses, and assumes you are living in a shared house, eating nothing but ramen. Many grad students, however, have moved past that stage of their lives, and may even have family to support. So they end up in quite significant debt.
posted by lollusc at 6:29 PM on May 20, 2015 [1 favorite]


"This makes me so sad. The story on TAL gave me a lot of hope that the key to people thinking I am a human being with rights was just talking to them. Being personal and vulnerable and human. I'm really sad that seems not to be the case."

So, one of my friends sent me the Buzzfeed story; I don't know how I missed it here.

I was one of the canvassers who worked on the face-to-face and over the phone data collection for the follow-up, and over the course of a couple years I had probably a good 10,000 conversations all using this model.

What I can say is that it's based on some very solid premises and a big body of anecdotal evidence that strongly implies that it's the best model for the goal of changing people's minds. It's rooted in a lot of psychology of persuasion stuff that's used across politics, and really the biggest problem with using this model is that it's hugely expensive compared to things like phones or direct mail or digital outreach. If I recall correctly, the breakdown was that the equivalent of one conversation — about 20 minutes, usually — cost over $50 in organizational time, including training, transit, staff time, etc. For phones, it was something like $10, and other modes were even cheaper. So it's a good hunch, but part of why this was exciting was that finally a lot of non-profits came through with the cash to actually fund a good study, and it seemed like it worked. I was pretty pleased to see the results even though from looking at his data, ours had been excluded. In a weird way, that made me more confident in his results.

(Our data, and the data from our coalition, was just amazingly, embarrassingly terribly collected, relying on an elaborate, poorly-designed flowchart questionnaire that interviewers were supposed to fill out as they were talking, including full sentences of what the interviewee's exact phrases were regarding marriage. Doing this on a clipboard while also carrying on a personal conversation is incredibly difficult for trained journalists, let alone volunteers and canvassers. I know that people involved gave their best effort, but still there were so many fuzzy judgment calls on every single entry, and so many where I know the data was just flat out guessed at the end of a shift, that I felt it was a big ethical concern for my former employer and felt really frustrated when no one else took it seriously at all — for a lot of people in the coalition, it was something that justified fundraising, not a project that fundraising supported. Especially in how that data got reported back to funding agencies, I was worried it was tantamount to fraud, and those concerns were dismissed because LGBTQ orgs were desperate for funding.)

I can say that this does knock the wind out of a lot of nifty stats — and ugh, I was just in an interview for a new job on Monday and mentioned something that I did feel confident about, that we'd done message testing and refinement as part of that study, and that definitely improved results on our end — but my belief is still that the underlying model is generally sound with some serious caveats (training really matters; emphasizing the subject's relationships seems to work better than grounding things in canvasser relationships) but is just prohibitively expensive to test in any real way.

(Ironically, at one of the non-profits I'm consulting for right now, a lot of their premises for programs they've been running are also based on data — in this case, things like teacher retention at LAUSD and student performance — that are also wildly outside of what is possible to collect. I always feel like the jerk in the room when I'm like, "How, exactly, would we measure that? What data sources would we use? How would we collect that?" and everyone else is just assuming that because it's aimed at helping students it probably works.)
posted by klangklangston at 6:38 PM on May 20, 2015 [24 favorites]


What I find amazing is that since the followup canvassing by the hundreds of volunteers organized by the Leadership LAB at the Los Angeles LGBT Center wasn't faked, LaCour had to have somehow come up with a fake list of actual people for them to interview who he claimed to have participated in his online survey. Even Green commented that given the amount of effort involved in generating the fake survey data and other associated work product, LaCour could have gotten a real online survey done.
posted by RichardP at 7:09 PM on May 20, 2015 [2 favorites]


So I get that this is entirely the grad student's doing, but for everyone worrying about smaller-scale problems in academia -- the nudges, the blind eyes, etc., which are much more common in turning a poor result into a positive one -- why is the co-author getting off scot-free in all this?

Denouncing it quickly and thoroughly doesn't mean you are free of culpability, and putting your name on a paper means you vouch for it, at the very least. It means you have seen the data and the calculations, and participated in the analysis. Maybe in something with dozens of compartmentalized authors each one is fairly separate from the others, but with two authors it's very different. And when one of those is already famous, the fact that he immediately turns on the other and claims to know nothing about it is, though probably true, not a little bit problematic. The fact that he truly just accepted all of this sight-unseen and slapped his name on the paper -- if that is his assertion -- seems like the sort of endemic problem that many people claim is the more serious issue for academia. As others have said: did the mentor claim any credit for the study? If so, then he should be taking some of the flak for the outcome, because it is precisely those plaudits that incentivized him to ignore the data provenance.
posted by chortly at 10:03 PM on May 20, 2015 [3 favorites]


"What I find amazing is that since the followup canvassing by the hundreds of volunteers organized by the Leadership LAB at the Los Angeles LGBT Center wasn't faked, LaCour had to have somehow come up with a fake list of actual people for them to interview who he claimed to have participated in his online survey. Even Green commented that given the amount of effort involved in generating the fake survey data and other associated work product, LaCour could have gotten a real online survey done."

Well, since we're beyond the time when it would matter, my understanding was that we were getting our lists from VAN, which was dubious because we were getting c3 money to do it. I don't know where the Center was getting theirs. So if he was faking post facto, he'd probably just get VAN access through one of the c4/PAC folks and invent the data from that.
posted by klangklangston at 10:30 PM on May 20, 2015


It means you have seen the data and the calculations, and participated in the analysis.

In fairness, though, I think it's likely that he did see a version of the data and the calculations and participate in the analysis -- it's just that the data themselves were totally made up. Generally people assume their collaborators are acting in good faith, and review and guidance that would be totally adequate in that scenario can fail hard if someone is actively trying to deceive you (this is related to why peer review is usually not sufficient to detect fraud). They were also at different institutions, which would have made direct oversight even more difficult (though I agree the mysterious "external funder" thing is a strange red flag). Anyway, I can see arguing that maybe greater precautions need to be taken so that data fabrication can be detected earlier, and authorship shenanigans are a real problem, but without further evidence to the contrary I think the senior author is more likely to be a victim of fraud than an accomplice.
posted by en forme de poire at 10:56 PM on May 20, 2015 [7 favorites]


roystgnr: "I was going to repeat the typical "If you have to pay, they don't really want you there" advice, but a little research suggests that advice only applies to engineering and physical sciences grad students."
It is always worth repeating when the question comes up and doesn't just apply to engineering and the physical sciences, almost especially because of how much worse the stats have gotten.

Things are not generally so bad in the hard sciences because even mediocre departments need to compete for a limited supply of the actually decent much less really good graduate students, and there are multiple products that good STEM students can produce that people are willing to pay for. However, there are still indeed plenty of specific professors within otherwise nice departments, specific departments within otherwise nice fields, and indeed entire fields that are impossible for a graduate student to negotiate anything not deeply shitty, either because what they produce in that context is so de-valued and/or replaceable that they have no leverage, or because the employers are so deeply unhealthy as to be incapable of acting in their own best interest by shaping up in such a way as to attract valuable students. This is why prospective students should NEVER EVER do an unpaid post-graduate academic (non-professional) degree, much less pay for one with your own money.

Both in the sciences and elsewhere, one of the big transitions from undergraduate life to graduate school is one that no one really warns you about, where as an undergrad your success is the end goal of most everyone around you with power over you, while as a graduate student you are almost always simply a means to some other end. It is a tricky new dynamic that you suddenly need to negotiate the moment you start interviewing, where for the department you will be a means of cheaply supporting professors who bring in cash or a means of cheaply instructing students who bring in cash. While for professors you could be a means of establishing pecking order in the department by supervising your teaching, a means of cheaply producing research with tools that are committed to sticking around for a while, a means of expanding their research community, or generally all of the above; what you aren't is the customer like you were in undergrad, you are the product being sold. This is a very different dynamic and you have to act like it to protect your interests, because you can't rely on anyone else to do it for you.

To that end, any letter that you get from an institution offering you a chance at a post-graduate academic degree but not enough funding for both tuition and a plausibly livable stipend is not an acceptance letter, it is an advertisement, and the product will be shitty. An advanced academic degree that you pay for will, in addition to driving you into debt that the degree will not help you pay off, make you an exploited stooge, and just like everywhere else, no one respects an exploited stooge in academia. An adviser who is desperate enough to take their failure to thrive and failure to fund their work out of the asses of their graduate students is an adviser who cannot be expected to give a sufficient shit about you to be worth your while; and a department that is craven enough to do the same also does not give a sufficient shit about you to be reasonably expected to further your interests. Similarly, any academic field without sufficient funding to do something as fucking basic as paying its graduate students a livable wage for their labor in the form of either teaching or research is not a field worth joining for anyone but the independently wealthy and hobby-minded. Not only is an advanced academic degree without funding a miserable existence, it will also inevitably not result in the reward of a career that academia as an institution is designed to provide; it will give you an academic hobby. Not all academic degrees are created equal, and an adviser/department/field that cannot get their shit together enough to pay you will be an adviser/department/field that cannot be taken seriously by the people you would want to pay you in a career. That is an adviser/department/field that cannot be reasonably expected to train you in an economically viable skill set, much less help you prepare for a career more successful than their own.

Also, before some doe-eyed undergrad stops by to extol the virtues of sacrificing for what you believe in, joining an academic field under exploitative conditions will only ever hurt it in profound and generationally deep ways. Inevitably, the most important thing you as a voluntarily exploited graduate student would accomplish for the study of whatever would be to push it further towards being dominated exclusively by those with more money than sense rather than those with genuine merit. Whether one has more money or less sense, the sacrifices that should be made for academic fields are ones that must be made by those with the ability to make meaningful and beneficial ones, like universities, taxpayers, funding agencies and the independently wealthy - not vulnerable students. As a prospective student you only really have the power inherent in what you are willing to consent to, and that power is considerable. It helps no one for you to use it to enable the exploitation of the vulnerable.

TL;DR: NEVER EVER do an unpaid post-graduate academic (non-professional) degree, much less pay for one with your own money. You will only hurt yourself and everyone around you.
posted by Blasdelb at 1:38 AM on May 21, 2015 [21 favorites]


Denouncing it quickly and thoroughly doesn't mean you are free of culpability,

What should happen to Don Green?
posted by MisantropicPainforest at 4:18 AM on May 21, 2015


In fairness, though, I think it's likely that he did see a version of the data and the calculations and participate in the analysis -- it's just that the data themselves were totally made up.

Good point. I've worked on studies as an interviewer and study coordinator where none of the authors ever saw the raw data, only the numbers after they were entered into the database. There was a lot of trust given to the research support staff.
posted by jb at 4:20 AM on May 21, 2015 [1 favorite]


TL;DR: NEVER EVER do an unpaid post-graduate academic (non-professional) degree, much less pay for one with your own money. You will only hurt yourself and everyone around you.

And that goes triple for the humanities - because there are no industry/government fallbacks. It's hard enough to succeed with good financial support; trying to do it on loans is crazy.
posted by jb at 4:25 AM on May 21, 2015 [1 favorite]


klangklangston: Well, since we're beyond the time when it would matter, my understanding was that we were getting our lists from VAN, which was dubious because we were getting c3 money to do it.

c3 groups use the VAN all the time. There's no dubiousness there. They are usually using a version with different information in it than c4 groups, though.
posted by lunasol at 4:30 AM on May 21, 2015


What should happen to Don Green?

It's vanishingly unlikely that he'll see any consequence at all as a coauthor unless LaCour asserts that Green was in on it.

However, I expect that Vavreck as chair, the rest of the committee including Green, and the UCLA department will take something of an informal reputational hit for what appears at present to be mind-bogglingly lax oversight of LaCour's dissertation. It just shouldn't be possible to fake several hundred thousand dollars of external funding, if nothing else.
posted by ROU_Xenophobe at 7:43 AM on May 21, 2015 [1 favorite]


TL;DR: NEVER EVER do an unpaid post-graduate academic (non-professional) degree, much less pay for one with your own money.

The most important exception ([lo pan] there are always exceptions, are there not? [/lo pan]) is that there are many career paths that reward an MA in something, and paying in-state state-U prices for that MA can be entirely reasonable. Though at that point the academic degree is acting more or less as a professional degree.
posted by ROU_Xenophobe at 7:49 AM on May 21, 2015 [2 favorites]


Also, I know more than a handful of people who didn't do that great as an undergrad, but did well enough to get into a decent MA program, did well there, paid for it, and are now at top-10 PhD programs. It happens quite frequently, especially if you aren't from the upper crust and went to some middling state school for undergrad and didn't really know what a PhD was. I don't see the necessity of making the 'don't pay' advice categorical.
posted by MisantropicPainforest at 7:59 AM on May 21, 2015


Yup, and being an old I taught classes to a few people doing MAs with just that logic who are now tenured at Flagship States.

But I expect that group of people is much smaller than the collection of teachers and other government employees who can professionally benefit from an MA, if only because that pool has to be a proper subset of the pool of people seriously interested in being academics.
posted by ROU_Xenophobe at 9:19 AM on May 21, 2015


From the CBC interview with Green linked above:

"Green is expecting the study retraction to have a negative effect on his own reputation as an academic researcher.

"It's certainly going to be with me until I reach my grave," he says. "I think this will always be something that people will say about my research for good or ill. My hope is that something positive can come out of this, mainly that I can use this experience to think about, and perhaps help others think about, ways of preventing this sort of thing from happening again. What kinds of procedures can we put in place to prevent the huge waste of resources that occurred as a result of this fabrication?"

He also thinks that if LaCour is unable to produce data to support his research, he will face severe career consequences.

"I expect that he will be subject to an academic investigation. He has not yet received his Ph.D; I think that it's unlikely that he will receive it. I think he will probably not, in the end, take the job that he was offered at Princeton. I'm guessing that as the process unfolds, he's likely to have that offer rescinded."


I wonder how long Princeton will wait before rescinding the offer. I also wonder whether they will do so publicly (unless doing that would violate laws of some kind).
posted by a fiendish thingy at 9:22 AM on May 21, 2015 [2 favorites]


Also, I know more than a handful of people who didn't do that great as an undergrad, but did well enough to get into a decent MA program, did well there, paid for it, and are now at top-10 PhD programs. It happens quite frequently, especially if you aren't from the upper crust and went to some middling state school for undergrad and didn't really know what a PhD was. I don't see the necessity of making the 'don't pay' advice categorical.


Yeah. The "don't pay" dictum (and its close relative, the "don't do a PhD outside Oxbridge or the Ivy League" proviso), like so much else in Anglo-American academia (and elite culture in general) is inherently Calvinistic: a secular version of the Puritan concept of election. "Good things will happen to you if you're one of the Elect! If you don't get a good thing (a free ride scholarship, say) it's a Sign and you're not one of the Elect. It simply wasn't destined to be!" As well intentioned as a lot of this advice is, it simply reinforces existing class boundaries. Because those signs of election don't just manifest out of the ether. They reflect patterns of social class, family connections, and network advantages available only to the upper and upper middle classes. Taking the "don't pay" dictum to its logical conclusion would essentially cleanse academia of anyone outside this tiny, overwhelmingly homogeneous social elite. Which is, perhaps, the point.
posted by Sonny Jim at 10:07 AM on May 21, 2015 [4 favorites]


The "don't pay" dictum (and its close relative, the "don't do a PhD outside Oxbridge or the Ivy League" proviso)

As one of the many many people who were paid to dissertate at a state school, my reaction to this is "what?"
posted by yeolcoatl at 10:59 AM on May 21, 2015 [2 favorites]


It's still a good principle, it's just that there are exceptions.

It's not some sinister ploy to police class boundaries or keep the hoi polloi out of the academy. It's the simple truth that in many disciplines, a grad student at a bottom-quartile PhD program who has to work a lot to make ends meet is going to be radically outcompeted by students from top-quartile/top-decile departments who have more impressive letters, more time and energy to devote to All Things Academic because they're on a fellowship or TA-ship, more resources to do things with, more money to go to conferences, and all sorts of things that end up with them having much shinier projects. My sense is that in some disciplines (English, history?) this extends outside of the research-university world to hires at BA-granting departments and clearly instructionally-focused departments.

Or, if you'd rather, in some disciplines the production of new PhDs so vastly outstrips the availability of tenure-track jobs that spending money to get a PhD from a lower-ranked department is unwise.

FWIW, political science is not really like that, except for political philosophy.
posted by ROU_Xenophobe at 11:12 AM on May 21, 2015 [3 favorites]


"c3 groups use the VAN all the time. There's no dubiousness there. They are usually using a version with different information in it than c4 groups, though."

Really? I've only ever seen it used as a c4/PAC tool — my understanding was that targeting outreach based on voting records for a project that was likely to have electoral influence (since that was part of the point of doing the persuasion) would have to be c4; in any event, our lists were pulled based on Prop. 8 turnout and precinct returns, and while the nominal coding of the project was "educational" to keep it mostly within c3 auspices, I tended to think of that as an unethical fig leaf based on funding requirements that some foundations gave us. It was effective work (or so we thought; who knows?) and in the service of a worthwhile aim, but some aspects seemed like ends-justify-means thinking.
posted by klangklangston at 12:57 PM on May 21, 2015


I'm seeing some claims online that LaCour claimed to have $800,000 in grant funding. The mind boggles.
posted by needled at 5:24 PM on May 21, 2015


Yeah. The "don't pay" dictum (and its close relative, the "don't do a PhD outside Oxbridge or the Ivy League" proviso), like so much else in Anglo-American academia (and elite culture in general) is inherently Calvinistic.

Or maybe it's just practical advice?

If someone asked me whether or not it'd make sense to attend University of Massachusetts School of Law at full, non-resident tuition, I'd look at the costs and the statistical outcomes. For costs, tuition for three years is going to be about $150,000, not including the interest that's going to compound during your attendance. For outcomes, less than half the class passes the bar, and less than a third get employed into a job that requires a JD.

Am I being an elitist, chauvinist pig who wants to keep the poor oppressed by advising this student to not attend? You tell me.
posted by Dalby at 6:27 PM on May 21, 2015 [1 favorite]


As well intentioned as a lot of this advice is, it simply reinforces existing class boundaries. Because those signs of election don't just manifest out of the ether. They reflect patterns of social class, family connections, and network advantages available only to the upper and upper middle classes

Far from being elitist, it's poorer students who really can't afford to pay for grad school. I could barely afford my residence deposit; if I hadn't had a full-ride for grad studies, I couldn't have gone.

Really wealthy people are the ones who can actually afford to pay $$$ for an MA that doesn't lead to an in demand field.

I didn't get a good job from my grad studies. It sucks, but at least I left grad school with zero debt.
posted by jb at 9:27 PM on May 21, 2015


Really wealthy people are the ones who can actually afford to pay $$$ for an MA that doesn't lead to an in demand field.

I'd be willing to put real money down betting that on average wealthy people pay less for graduate degrees, because of social capital and knowing how to apply for fellowships and other aid or having jobs that provide tuition reimbursement; because wealth goes along with high scores on standardized tests which determine admissions and aid; and because wealthier students apply to higher ranked schools that have more money for aid. Whereas a schoolteacher who needs a masters in order to move up the pay scale is likely going to self-fund it from the local state system, for example, rather than apply to multiple Ivy League schools.

It can pencil out to self-fund a practical masters in a lot of situations; the advice about not going to graduate school without funding, however, is rock-solid when it comes to PhD programs. Academic careers don't pay enough to make it worth taking on huge loans and the academic track is risky. Most importantly, a department providing funding is a signal of seriousness and an investment on their part in the student's success.
posted by Dip Flash at 5:41 AM on May 22, 2015


LaCour has updated his statement:
I will supply a definitive response on or before May 29, 2015. I appreciate your patience, as I gather evidence and relevant information, since I only became aware of the allegations about my work on the evening of May 19, 2015, when the not peer-reviewed comments in "Irregularities in LaCour (2014)," were posted publicly online.

I must note, however, that despite what many have printed, Science has not published a retraction of my article with Professor Green. I sent a statement to Science Editor McNutt this evening, providing information as to why I stand by the findings in LaCour & Green (2014). I've requested that if Science Editor McNutt publishes Professor Green's retraction request, she publish my statement with it.
> I'm seeing some claims online that LaCour claimed to have $800,000 in grant funding. The mind boggles.

In his most recent CV [his website; UCLA website; Archive], he claimed to be PI(!) on 11(!!) grants totaling $793,000(!!!). I'd say that I don't know why Princeton wasn't skeptical of what seems like an obviously embellished CV (he would have gotten those grants before his first publication; who would fund a PI with no history?), but at this point I'm not even sure the Princeton job offer was real....

In a sense, though, it's almost better if the grants are fake than if he embezzled real ones.
posted by Westringia F. at 7:10 AM on May 22, 2015 [2 favorites]


I picked one of the grants on his CV at random, for the William and Flora Hewlett Foundation, and searched their grants database. No information about him being awarded anything per the claims on his CV.
posted by entropone at 7:28 AM on May 22, 2015


> but at this point I'm not even sure the Princeton job offer was real...

I was wrong to sow [more] doubt. The job offer was real.
posted by Westringia F. at 8:04 AM on May 22, 2015


But the job offer is probably no more, if there were falsifications in the CV, as seems to be the case with the grants.
posted by needled at 8:06 AM on May 22, 2015


And that LaCour won't be receiving a PhD at all anymore.
posted by MisantropicPainforest at 8:09 AM on May 22, 2015 [1 favorite]


Oh, definitely. I was just hoping that it was a delusion rather than him setting off a cascade of mis-hires by fraudulently getting the Princeton slot.
posted by Westringia F. at 8:10 AM on May 22, 2015 [1 favorite]


I checked the birth register for Denton, Texas for 1986 and there's no record of a Michael James LaCour having been born there.
posted by Flashman at 8:16 AM on May 22, 2015


Is this a joke?
posted by MisantropicPainforest at 8:18 AM on May 22, 2015


Maybe Metafilter has mysteriously merged with Political Science Rumors ...
posted by needled at 8:21 AM on May 22, 2015


Maybe leave the digging to the academics committee? Why are we looking for his birth records?
posted by Think_Long at 8:29 AM on May 22, 2015


What makes field organizers effective? Being like the people they want to persuade.

In a slightly different realm, we've discovered the same thing in recruiting and interviewing for health studies research projects. The subtlety I didn't recognize at first is that it's less important that they be of the same race/ethnicity/religion and more important that they be from the same socio-economic stratum. For example, in a study of inner-city public grade-school kids, Somali immigrants, who tend to be educated and disproportionately from the professional class, made great recruiters/interviewers for other Somali families, but did very badly among the disadvantaged African-American families. On the other hand, a white non-Hispanic interviewer and a white Hispanic interviewer who both came from economically disadvantaged backgrounds did very well with that latter group (as did the recruiter/interviewers from that group). We learned this the hard way, but eventually got everyone lined up properly and were able to get about 60% participation from a probability sample of families.
posted by Mental Wimp at 1:04 PM on May 22, 2015 [5 favorites]


I'm not sure I'd go that far (what happens when your faker is a competent statistician?)

We only use our power for good. Heh.
posted by Mental Wimp at 1:12 PM on May 22, 2015 [2 favorites]


Maybe Metafilter has mysteriously merged with Political Science Rumors ...

Damn you, needled. There's like 20 hours of reading in there.
posted by mr_roboto at 1:30 PM on May 22, 2015


Believe me, you do not want to spend 20 hours on PSR. Wretched hive of scum and villainy, etc.
posted by ROU_Xenophobe at 1:41 PM on May 22, 2015


> Wretched hive of scum and villainy, etc.

In the red corner, Economics Job Market Rumors! And in the blue corner, Political Science Rumors!

Let's get ready to rumble!
posted by needled at 2:27 PM on May 22, 2015 [1 favorite]


NYT reports on the debacle, including quotes from Science editor-in-chief Dr. Marcia McNutt:
“Given the negative publicity that has now surrounded this paper and the concerns that have been raised about its irreproducibility, I think it would be in Michael LaCour’s best interest to agree to a retraction of the paper as swiftly as possible,” she said in an interview on Friday. “Right now he’s going to have such a black cloud over his head that it’s going to haunt him for the rest of his days.”

...

In a letter that he sent through his lawyer, Dr. McNutt said, Mr. LaCour said he had instead allowed participants the chance to win an iPad, saying “that was incentive enough.” Dr. McNutt said the supposed payments had convinced the reviewers that the response rate was as high as the study reported.

Dr. Green asked Mr. LaCour for the raw data after the study came under fire. Mr. LaCour said in the letter to Dr. McNutt that he erased the raw data months ago, “to protect those who answered the survey,” Dr. McNutt said.
posted by Westringia F. at 9:24 AM on May 26, 2015 [2 favorites]


Wow. The chance to win an iPad is completely different from the survey payment procedure as reported in the published Science paper. It just boggles me that this new explanation of survey participation incentives is considered any kind of defense or justification for the Science paper.
posted by needled at 10:13 AM on May 26, 2015 [3 favorites]


And LaCour made up the funding sources on his CV.
posted by needled at 11:49 AM on May 26, 2015 [2 favorites]


Amazing how far LaCour is willing to take his lie. He's not just a desperate grad student who cheated and got caught, he's a full blown sociopath.
posted by Mental Wimp at 12:10 PM on May 26, 2015 [2 favorites]


It's not so surprising, is it? What a tangled web we weave, and so forth? Surely whatever desperation motivated his initial deception is orders of magnitude worse now; I'm not surprised we're hearing unbelievable excuses. But good grief, when the editor-in-chief of Science is literally telling the New York Times that it's time to cut your losses... one does wish he'd listen.

And of course this excuse still does nothing to explain the gross statistical anomalies BKA identified. The response rate was far from the only -- or even the most damning -- issue; there's still nothing to explain why his cleaned, LA-based data looked more like a subset of the national CCAP dataset than even the California CCAP data, nothing to explain why the "heaping" patterns on the feeling thermometer blur over time in a surprisingly Gaussian way and pile up at the ends, nothing to make his irreproducible results look more believable.

It's tragic, really.
posted by Westringia F. at 5:39 PM on May 26, 2015 [1 favorite]
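[For readers without a stats background, the "heaping" anomaly Broockman, Kalla, and Aronow flagged is easy to illustrate. The following is a hedged Python sketch on entirely simulated data; the sample sizes, noise level, and snapping probability are invented for illustration and are not taken from the actual study. Genuine feeling-thermometer responses pile up on round numbers, while a follow-up wave faked by adding Gaussian noise to wave 1 loses that heaping and piles up at the clipped endpoints instead.]

```python
import numpy as np

rng = np.random.default_rng(0)

# Genuine feeling-thermometer responses "heap" on round numbers:
# simulate by snapping most answers to the nearest lower multiple of 5.
raw = rng.integers(0, 101, size=5000)
heaped = np.where(rng.random(5000) < 0.7, (raw // 5) * 5, raw)

# A faked follow-up wave built by perturbing wave 1 with Gaussian noise
# washes out the heaping and piles up at the clipped endpoints (0, 100).
faked = np.clip(np.round(heaped + rng.normal(0, 8, size=5000)), 0, 100)

def share_on_multiples_of_5(x):
    """Fraction of responses landing exactly on a multiple of 5."""
    return float(np.mean(np.asarray(x) % 5 == 0))

print(share_on_multiples_of_5(heaped))  # high: genuine heaping
print(share_on_multiples_of_5(faked))   # much lower: heaping washed out
```

[This is the shape of the check, not the actual BKA analysis: real heaped survey data shows a spike on multiples of 5 and 10 that no plausible re-interview process would smooth into a Gaussian blur.]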


It's not so surprising, is it?

It is. He's clearly been busted, but acts as though, if he just spins the right bullshit, everything will come out fine.
posted by Mental Wimp at 7:22 PM on May 26, 2015


Very detailed story on how David Broockman uncovered the data fraud, including how he was originally discouraged by more senior academics from pursuing it further.
posted by needled at 9:56 AM on May 29, 2015 [9 favorites]


I'm eagerly waiting to see if LaCour will respond today, as per his statement.
posted by entropone at 1:41 PM on May 29, 2015


I've been actually worried he'll kill himself! He's painted himself into a corner. I hope he leaves town and sees a really gifted therapist.
posted by latkes at 2:13 PM on May 29, 2015


His response is now up on his personal webpage. Both it and his CV are hosted on Dropbox, presumably so that Google caches can't be stored as he changes the content over time. His response is a mess. He acknowledges making up the grants, but says that one of them he was offered but turned down because the study was already completed. He plays down fabricating his funding, and therefore his methods, as though lying about your methods in a Science article is no big deal. He talks a little about how he allegedly raffled off iPads instead of paying people, and provides, of all things, credit card receipts showing he purchased iPads.

His only possibly reasonable contention is that he couldn't show anyone the raw data as that would be a violation of the IRB. I think it's unlikely that the IRB would prevent him from showing his co-author the data, but it is true that he couldn't show it to Broockman. He insists he couldn't show Green the raw data, but he also says that Green signed off on seeing the raw data as part of the Science publication. In any case, he says destroying the raw data was a requirement of the IRB, which is likely. He also claims that he did use the survey company listed in the document, but he used a unique registration name and email address for that study which is different from the account everyone was looking at. He makes no mention of the fictional employee he claimed to work with.

The bulk of the document, however, is an attack on Broockman, accusing him of lying about his timeline and of poor statistical methods. He obviously really loves R, because there is a lot of R code. That part reflects pretty poorly on LaCour, because he spends way too much time being personal and not nearly enough time responding to the substantive critique by Broockman and Kalla.
posted by hydropsyche at 5:07 AM on May 30, 2015 [1 favorite]


In any case, he says destroying the raw data was a requirement of the IRB, which is likely.

Nope, and stuff in his own documents shows that it isn't. The UCLA IRB requires either (1) collecting raw data anonymously or (2) destroying identifiers, not the entire data set.
posted by damayanti at 6:15 AM on May 30, 2015 [3 favorites]


Yup. As an IRB member, when we tell someone to destroy identifiers, that is distinct from "destroy the data itself" and doing so would be a violation of the IRB protocol. If the data were so sensitive that that would not be good enough, the next step is generally "apply for a Certificate of Confidentiality" or "don't collect identifying data in the first place," not "delete all the data." I can't speak to the stats, but his IRB handwaving is super fishy.
posted by Stacey at 7:43 AM on May 30, 2015 [4 favorites]


Following coverage of this this morning, it is becoming clear that LaCour is reading criticisms of his document and editing it in real time. Which is going to make this increasingly hard to talk about.
posted by hydropsyche at 8:23 AM on May 30, 2015 [1 favorite]


His response is now up on his personal webpage.

Well, he's certainly helping to make my earlier statement prophetic.
posted by Mental Wimp at 10:32 AM on May 30, 2015


He also claims that he did use the survey company listed in the document, but he used a unique registration name and email address for that study which is different from the account everyone was looking at. He makes no mention of the fictional employee he claimed to work with.

Now the NY Times is reporting that he told them it was a different survey firm altogether (as yet unnamed.)
posted by Jahaza at 1:35 PM on May 30, 2015


Well, I read the document 6 hours ago, so that may have changed by now. Not using the survey company that he reported in the paper he used, or that he told his coauthor he used, or that he told Broockman that he used, is just another minor thing that people are blowing out of proportion, no doubt.
posted by hydropsyche at 1:37 PM on May 30, 2015


Yeah, it's painful to watch him try to dig himself out of this hole. Jesse Singal has good reporting on it: in this brief bit about LaCour faking a teaching award you actually see his fraud unfold in real time.
posted by entropone at 4:37 PM on May 30, 2015 [1 favorite]


This is weirdly engrossing. Is the email from Green as damning as some people seem to think it is? (This record was offered up in LaCour's response, but the original Dropbox link is now broken.)
posted by threeants at 6:04 PM on May 30, 2015


"Now the NY Times is reporting that he told them it was a different survey firm altogether (as yet unnamed.)"

"I only dealt with the principals at Blair, Cook and Glass!"
posted by klangklangston at 6:48 PM on May 30, 2015


This is weirdly engrossing. Is the email from Green as damning as some people seem to think it is? (This record was offered up in LaCour's response, but the original Dropbox link is now broken.)

One would assume it is not, or LaCour would have left it up.
posted by hydropsyche at 4:43 AM on May 31, 2015


This is weirdly engrossing. Is the email from Green as damning as some people seem to think it is? (This record was offered up in LaCour's response, but the original Dropbox link is now broken.)

One would assume it is not, or LaCour would have left it up.


Yes, this seems like misdirection. I wonder how many posters in the PSR thread are LaCour?
posted by grobstein at 6:01 AM on May 31, 2015 [1 favorite]


Email here? But I don't really understand where the unethical part is?
posted by maryr at 5:44 PM on May 31, 2015


Reading the Irregularities in LaCour and Green together with LaCour's Response to Irregularities in LaCour and Green (2014) has been a pretty amazing thing for me with my passable fluency in R. It's like watching a devastatingly beautiful Tolkien-esque wizard battle, only I speak the language these brilliant Maiar are muttering curses in and can see the ephemeral and subtle but deeply cutting wounds. While honestly it's very clear that the only case LaCour can make in his R rant is that his fraud was more sophisticated than Broockman, Kalla, and Aronow portray, he does a shockingly effective job in a lot of it. The asshole is clearly intelligent, motivated, and deeply, deeply bitter about having been so effectively caught.

Last week I went to a seminar on Academic Integrity hosted by my university for young researchers after a number of bad experiences; it had a workshop that might as well have been entitled "WHY YOU MUST ALWAYS RESPECT AND FEAR THE ATTENTIONS OF A COMPETENT AND MOTIVATED STATISTICIAN," and it was very convincing. Just like here: if anyone is presenting enough data to be both exciting and statistically convincing, they are necessarily also presenting enough data to make any fraud entirely clear through some means they will not have thought of.
posted by Blasdelb at 5:24 AM on June 1, 2015 [5 favorites]


The most dismaying part of this is that had LaCour put the amount of work he is currently putting into trying to salvage himself from his fraud into a proper study instead, it would probably have produced a great set of papers and helped move the field along, to the extent that he would be as renowned for doing something good as he will be infamous for doing something bad. It's so sad to see someone with so much drive and intelligence throw it away like this. We need every good mind pulling at the oars.
posted by Mental Wimp at 8:29 AM on June 1, 2015 [2 favorites]


That said, one of my big takeaways from this whole mess is just how incredibly, if unfortunately, justified is the terrified caution in uncovering it all that Broockman talks about here.

With how almost convincing the response actually is, if it weren't for the glaring irregularities in dumb obvious shit, it really would have been incredibly foolish to attempt to take this crazy and deranged but intelligent asshole down as a humble grad student. A competent and motivated statistician would indeed be able to see through it, but there aren't so many of them and they tend to be quite busy. An investigation on purely statistical grounds would have found him out in the end, but would one have ever gotten started?
posted by Blasdelb at 9:56 AM on June 1, 2015 [3 favorites]


A competent and motivated statistician would indeed be able to see through it, but there aren't so many of them and they tend to be quite busy.

I think you may underestimate the level of statistical training many of the current generation of political scientists have, including Don Green, who is a methodologist above all else. The training is certainly sufficient to see through some of LaCour's bullshit; it's just that skepticism about statistical claims rarely causes one to say, "maybe he made it all up?" Even Andrew Gelman, who is both a statistician and a political scientist, looked at the data, expressed his skepticism about the effect size, but never detected a fraud.
posted by MisantropicPainforest at 10:53 AM on June 1, 2015 [2 favorites]


http://nymag.com/scienceofus/2015/06/lacour-probably-fabricated-an-integrity-document.html

"LaCour sent EGAP a document he said was proof that the first experiment had been pre-registered — according to Humphreys, it was a PDF that LaCour claimed EGAP’s website had automatically produced after he successfully pre-registered the first experiment.

The problem is that EGAP has no such automatic document-generating system."
posted by MisantropicPainforest at 11:48 AM on June 1, 2015 [3 favorites]


Another write-up of LaCour case in the Chronicle of Higher Education, this one includes interviews with UCLA faculty and students:
The survey, according to Mr. LaCour, was created using Qualtrics, an online survey-design tool. At first when questioned, Mr. LaCour said that he had used the login of a former UCLA graduate student, Colleen Carpinella, to conduct the study and that he needed her password. So Mr. LaCour contacted Ms. Carpinella, who graduated from UCLA last year and is now a postdoctoral scholar at the University of Hawaii-Manoa. She provided that password.

Mr. LaCour also sent an email to Tim Groeling, chairman of communication studies at UCLA, informing him that he was conducting a survey and needed his Qualtrics password. (The communication department at UCLA has a license to use Qualtrics; the political-science department does not.) This wasn’t an odd request. They had co-taught a course, and Mr. Groeling serves on Mr. LaCour’s dissertation committee. Mr. Groeling, who hadn’t yet learned of the accusations, sent along his login information.

Mr. LaCour then created survey panels on both accounts, according to the sources connected to UCLA. It seemed to be a last-minute attempt to make it look as if the surveys in question had actually been carried out. Mr. LaCour posted Qualtrics screenshots online to bolster his case. However, he did not include a screenshot that would have revealed when or whether the surveys had been sent.
posted by needled at 9:51 AM on June 2, 2015 [1 favorite]


Even Andrew Gelman, who is both a statistician and a political scientist, looked at the data, expressed his skepticism about the effect size, but never detected a fraud.

Fraud detection requires access to the original data, not summary statistics.
posted by Mental Wimp at 10:38 AM on June 2, 2015


Not necessarily. No one has seen the original data, and a fraud was still detected.
posted by MisantropicPainforest at 10:49 AM on June 2, 2015


How can fraud detection require access to the original data if the fraud is (or is alleged to be) that there aren't any original data? All the evidence (at least that which is publicly available) points to "he never collected any:" made-up people at a survey firm; no relationship with the survey firm he claimed to work with; data are identical to another dataset.
posted by entropone at 7:36 AM on June 3, 2015 [1 favorite]


I apologize. That was poorly stated. I guess I meant that individual data that were fed into the analysis need to be used, not that "original" data (whatever I thought I meant by that). Merely looking at summary results and finding them unlikely is not enough to detect fraud. The fraudster is not going to produce any data displays that will reveal the fraud. Findings that are unusual or extreme or counter-intuitive may raise suspicions, but I've seen too many real data sets that have things in them that make you go "Huh?" but are genuine. However, unless a fraudulent dataset of the original observations is generated by an evil wizard statistician, it would be extremely difficult to pass it by another competent statistician who was analyzing it carefully.
posted by Mental Wimp at 8:21 AM on June 3, 2015 [1 favorite]
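[A minimal sketch of one such respondent-level check, in Python with entirely simulated data; no real CCAP or LaCour data appears here, and the distributions, sample sizes, and jitter level are invented for illustration. A "survey" fabricated by lightly jittering an existing public dataset tracks that dataset's empirical distribution far more closely than an honestly independent sample ever would, which a two-sample Kolmogorov-Smirnov-style distance makes visible.]

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical public benchmark survey (a stand-in for something CCAP-like).
public = rng.normal(50, 20, size=2000)

# A fabricated "new" survey made by lightly jittering the public data,
# versus an honestly independent sample from the same population.
fabricated = public + rng.normal(0, 0.5, size=2000)
independent = rng.normal(50, 20, size=2000)

def ks_stat(a, b):
    """Two-sample Kolmogorov-Smirnov distance between empirical CDFs."""
    a, b = np.sort(a), np.sort(b)
    grid = np.concatenate([a, b])
    cdf_a = np.searchsorted(a, grid, side="right") / len(a)
    cdf_b = np.searchsorted(b, grid, side="right") / len(b)
    return float(np.max(np.abs(cdf_a - cdf_b)))

# The copy hugs the public distribution suspiciously closely, well below
# the gap ordinary sampling variation produces between honest samples.
print(ks_stat(public, fabricated))
print(ks_stat(public, independent))
```

[The point of the sketch is the "too good to be true" direction of the test: an independent sample is expected to differ from a benchmark by sampling noise, so a near-perfect match is itself evidence against the data being original.]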


MisantropicPainforest: I think you may underestimate the level of statistical training many of the current generation of political scientists have...

I thought Blasdelb was specifically not doing that when they took note of how few qualified statisticians there are to go around. I don't know about their department or school, but at my wife's, the stat analysis is usually done by statisticians working for the college who have a deep knowledge of the math, but not of the subject matter. And yes, they are quite busy and there aren't very many of them.
posted by lodurr at 9:47 AM on June 3, 2015


Maybe. But in political science, statisticians are usually not brought in to analyze the data, since political scientists have more than enough training to do that. It's not a matter of knowledge; it's a matter of scrutiny and trust.
posted by MisantropicPainforest at 12:27 PM on June 3, 2015


BTW, I wouldn't dispute your observation about the statistical skills of political scientists. I've had reason over the past several years to be surprised at the lack of statistical training of a variety of social scientists.
posted by lodurr at 12:37 PM on June 3, 2015 [1 favorite]


"The fraudster is not going to produce any data displays that will reveal the fraud. Findings that are unusual or extreme or counter-intuitive may raise suspicions, but I've seen too many real data sets that have things in them that make you go "Huh?" but are genuine. However, unless a fraudulent dataset of the original observations is generated by an evil wizard statistician, it would be extremely difficult to pass it by another competent statistician who was analyzing it carefully."

Well, unless they're startlingly inept, which characterizes at least a significant plurality of fraudsters and a majority of the exposed ones.

I'm just responding because it's a pattern I've noticed among academics discussing things like this: they reason from a position of their own good faith and competence, which leads them to discount hypotheses like "yeah, but what if the fraudster was incredibly brazen and stupid?" when arguing things like whether access to original data is necessary to detect a fraud. It would almost certainly be necessary if that academic turned to the dark side, and likewise necessary to detect the presumed actual fraudsters that likely exist uncaught now, but it's based on the assumption of competence within the field, and incompetence is often the very thing that exposes the fraud.

It's similar, but not perfectly analogous, to the gulf between the idea of detectives solving crimes through investigation and intelligence and the reality that most crimes are only solved because so many criminals are so amazingly stupid.
posted by klangklangston at 12:49 PM on June 3, 2015 [1 favorite]


Well, unless they're startlingly inept, which characterizes at least a significant plurality of fraudsters and a majority of the exposed ones.

Well this raises its own set of questions, like: what sample of the underlying population of frauds are we seeing? What is the relationship between the base rate of frauds and the frauds we uncover?

If -- as seems to be the case -- we find out about the frauds we do because they are both inept and extremely ambitious, should we conclude that there's a whole distribution of frauds, mostly less inept and / or less ambitious?

Kinda related.
posted by grobstein at 8:43 AM on June 5, 2015 [2 favorites]



