Don't Even Think About Lying
January 5, 2006 3:01 AM   Subscribe

Don't Even Think About Lying fMRI is poised to transform the security industry, the judicial system, and our fundamental notions of privacy. I'm in a lab at Columbia University, where scientists are using the technology to analyze the cognitive differences between truth and lies. By mapping the neural circuits behind deception, researchers are turning fMRI into a new kind of lie detector that's more probing and accurate than the polygraph, the standard lie-detection tool employed by law enforcement and intelligence agencies for nearly a century.
posted by robbyrobs (62 comments total)

I think this technology is destined for failure.

Or maybe I'm lying.
posted by LondonYank at 3:09 AM on January 5, 2006


"Chance believes the virtues of what he calls "a network to detect malevolence" outweigh the impact on personal liberties. "

It's like all our dreams come true!

If they do make this happen I hope that each detector is fitted behind a giant screen suspended from the ceiling with huge malevolent eyes staring down upon those who pass.
posted by Meccabilly at 3:20 AM on January 5, 2006


I was going to say something about the use of first-person in the FPP, because without quotes or italics it came across as a self-link. Turns out it's not a self-link at all, but yet another reason I wish I had digaman's job.

I'm due — well, overdue, really — for an MRI. I wonder if I'll be upgraded to fMRI this time.
posted by emelenjr at 3:28 AM on January 5, 2006


Just this morning there was an article in the (London) Metro about a new system for lie detection based on using a laser to map minuscule changes in body temperature, muscle tension etc. to provide a more accurate and unnoticeable way to see if someone is lying. The difference from fMRI being that this can be set up without the subject knowing they're effectively "taking" a lie detector test.

Can't for the life of me find the online version of this at the moment.
posted by slimepuppy at 3:37 AM on January 5, 2006


Just this morning there was an article in the (London) Metro about a new system for lie detection


Could it be this?

"He also pointed to trials of "intelligence vision systems" - enhanced CCTV which can automatically spot suspicious behaviour such as someone leaving a package.

But most experts at the conference felt that wide use of such technology on the transport system was extremely unlikely in the foreseeable future. "
posted by Meccabilly at 3:45 AM on January 5, 2006


Similar, but I don't think it was that.
I know the Metro's quite rubbish and slow on the uptake, but that bbc article is from November...
posted by slimepuppy at 3:55 AM on January 5, 2006


This?
posted by Meccabilly at 4:02 AM on January 5, 2006


We could be here a while.

Can't find it on the dailymail or evening standard website, much less the appalling metro website.

Ah, well...
Frickin' laserbeams.
posted by slimepuppy at 4:17 AM on January 5, 2006


The only drawback to this is that in order to tell if you're lying or not, they actually have to extract your brain from your skull. Unfortunately, subjects are not too chatty afterwards... (which really only indicates their guilt. And so, in the end, the new machine works flawlessly.)
posted by crunchland at 5:34 AM on January 5, 2006


There's a pretty fascinating (and flawed) sf novel called The Truth Machine, which talks about the way a 100% accurate device for identifying lies would totally transform the world. The machine is developed by private corporations after US Congress offers an enormous financial incentive to do so.

Machines like this will only prove truly useful - for courts, etc. - if they are 100% accurate. And that's a pretty insane standard to measure up to.

Later this year, two startups will launch commercial fMRI lie-detection services, marketed initially to individuals who believe they've been unjustly charged with a crime.
This really, really bothers me. Exploiting the most desperate people in society. :(
posted by Marquis at 5:35 AM on January 5, 2006


I have a tool more accurate than the polygraph already. It's called a "quarter" and I flip it.

There's a pretty fascinating (and flawed) sf novel called The Truth Machine

Perfect choice of words - it's both terrific and horrible, and I'm amazed someone else has read it.
posted by Optimus Chyme at 6:10 AM on January 5, 2006


robbyrobs, thanks so much for the FPP to my article.

Machines like this will only prove truly useful - for courts, etc. - if they are 100% accurate. And that's a pretty insane standard to measure up to.

Well, in an ideal world, yes. But the history of the polygraph suggests something quite different. The polygraph is nowhere near 100 percent accurate, and its effectiveness is highly dependent on the interrogation skills of the examiner. While polygraph evidence has been excluded from most US courts, the device is still used by intelligence agencies like the CIA, the NSA, etc., to vet their own employees, and the polygraph is being used more than ever in places like Iraq and Guantanamo Bay to interrogate detainees and confirm the loyalties of Iraqi officers, etc. -- and reliance on it has been blamed for a number of significant intelligence failures. But, many would say, when a device might uncover data that would prevent a future terrorist attack, shouldn't it be used?

The truth is that even a highly inaccurate technology for lie detection can still find thriving markets in a vast gray area outside of the US courts. A dubious lie-detection technology called voice-stress analysis -- the accuracy of which was called "dismal... consistently less than chance... you could have gotten better results by flipping a coin" by researchers funded by the Defense Department -- is also being used to interrogate detainees.

In the lie-detection business as it has unfolded over the last century, 100 percent accuracy is not required.

One of the many disturbing truths that come up once you start looking at these issues.
posted by digaman at 6:11 AM on January 5, 2006


Reason ran an article about fMRI and The Truth Machine (by James Halperin) back in Nov 2001; at the end, it raises an interesting question about fMRIs, the Fifth Amendment and self-incrimination.
posted by mediareport at 6:15 AM on January 5, 2006


Btw, would any of you who've read Halperin's book care to offer a mini-review? A friend recommended it a while ago and I'm curious about both its fascinating and flawed elements.
posted by mediareport at 6:18 AM on January 5, 2006


If this technology is ever made near-perfect, it's the ultimate tyranny. Especially in this world, where so many of us commit "serious" crimes every day.
posted by I Love Tacos at 6:20 AM on January 5, 2006


There's a pretty fascinating (and flawed) sf novel called The Truth Machine

I thought I was the only one to have read this... weird.
posted by Jezztek at 6:28 AM on January 5, 2006


Lie detection is only the beginning. Studies have shown that brain imaging can be used to detect anything from racism to alcoholism. Imagine your local police department vetting cadets based on how they respond to the racism test.
posted by Hobbacocka at 6:38 AM on January 5, 2006


Well, it's not like this could actually get to the lies that really keep society going, the ones we tell ourselves.

He hits me because he loves me. I'm doing everything I can. Just one more, then I'll stop. My lifestyle is consistent with my beliefs.

PS Hobbacocka, I read your comment to mean that the police would only take candidates with a certain degree of racism (otherwise, they just wouldn't fit in) but I realize that it could have been intended otherwise.
posted by allen.spaulding at 6:51 AM on January 5, 2006


Accurate lie detector tests would mean that only people in power would get to lie. Imagine the difficulty of getting Bush or Cheney to submit to one performed in public under "impartial expert" monitoring and you'll see what I mean.
posted by davy at 6:52 AM on January 5, 2006


Hobbacocka : "Imagine your local police department vetting cadets based on how they respond to the racism test."

I had a writeup about that at Plastic.
posted by Gyan at 6:53 AM on January 5, 2006


There is no sanctuary!
posted by rolypolyman at 6:54 AM on January 5, 2006


This won't work. People will figure out new ways to lie that don't involve the 'deception centers' or whatever. Or more likely, they'll run those parts of their brain all the time, by imagining themselves lying as they tell the truth.

In fact, I bet this thing is even easier to beat than a regular lie detector.
posted by delmoi at 6:55 AM on January 5, 2006


People obsessed over eye contact are always lying.
posted by HTuttle at 6:56 AM on January 5, 2006


[this is good]
posted by Rothko at 6:58 AM on January 5, 2006


Wow, I'm amazed fMRIs have gotten so cheap that people are considering this to be a financially wise service. Digaman, do you know how many false positives there were in these tests? That would be the thing that would really scare me. Sure, it would be nice if the machine caught 99% of liars, but it would be useless if that machine thought 5 percent of people telling the truth were lying.

Hell, even now, several years after my 21st birthday, I get a guilty feeling when bouncers ask for my ID at bars.
posted by afu at 7:01 AM on January 5, 2006 [1 favorite]
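
[The base-rate worry afu raises can be made concrete with Bayes' rule. A minimal sketch, using entirely hypothetical numbers: a test that catches 90% of liars but falsely flags 5% of truth-tellers, applied to a population where only 1 in 100 subjects is actually lying.]

```python
# Why raw "accuracy" misleads: with a low base rate of liars, even a
# highly accurate test flags mostly truth-tellers. Numbers are illustrative.

def positive_predictive_value(sensitivity, specificity, base_rate):
    """P(actually lying | test says lying), via Bayes' rule."""
    true_pos = sensitivity * base_rate              # liars correctly flagged
    false_pos = (1 - specificity) * (1 - base_rate)  # truth-tellers falsely flagged
    return true_pos / (true_pos + false_pos)

# 90% of liars caught, 5% of truth-tellers falsely flagged,
# 1 in 100 subjects actually lying:
ppv = positive_predictive_value(0.90, 0.95, 0.01)
print(f"{ppv:.1%}")  # about 15%: most people the test flags are telling the truth
```

[In other words, under these assumed numbers, roughly five out of six people the machine calls liars would be telling the truth, which is exactly the scenario afu describes.]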


I just heard of a study on how brain imaging can reveal how the brain responds to branding. Introducing: neuromarketing.

Or how about detecting sexual orientation (the effort to repress sexual arousal leaves its own unique signature). The military might be interested in that. Wall Street would surely be interested in detecting whether prospective employees have a gambling streak (easily detectable), just as college admissions offices might be interested in new, more exact ways of measuring intelligence. And what about medicine? Consider: a new, brain-based taxonomy of mental illness -- taking the guesswork out of psychiatry!

Like I said, lie detection is just the beginning.
posted by Hobbacocka at 7:03 AM on January 5, 2006


Langleben developed a hypothesis that in order to formulate a lie, the brain first had to stop itself from telling the truth, then generate the deception - a process that could be mapped with a scanner. Functional imaging makes cognitive operations visible by using a powerful magnetic field to track fluctuations in blood flow to groups of neurons as they fire. It reveals the pathways that thoughts have taken through the brain, like footprints in wet sand.

So is this saying the thing has a positive test for lying and for truth telling? That could be a relief for us Joseph K. types who exhibit the signs of lying when telling the truth because we expect we won't be believed.
posted by PinkStainlessTail at 7:11 AM on January 5, 2006


Hobba, thanks for bringing up those other uses of fMRI, which I wish I had had the space to discuss in my article. All of those other factors could definitely be brought into play in, say, a child custody trial, or a parole hearing as a measure of "future dangerousness," or...

As to where this could all go someday, see this marvelously unsettling site, which is an attempt by designer Luther Thie and a colleague to show what could happen when a "trusted class" of people is created by technology like this.

Acclair’s special relationship with Government offers a golden opportunity for its members to be approved trusted citizens. The extensive enrollment procedure collects all private, travel, governmental and medical information. This information is aggregated and algorithmically associated with security and neuromarketing stimuli for use in the Brain Fingerprinting (BFP) tests. All past deeds should be declared at this moment in order to be cleared of them and to assure they will not be surprisingly revealed in future BFP sessions (which could cause unforeseen delays and unwanted appointments with law enforcement officials). In this way, Acclair Amnesty signifies a “clean slate” and a new trusted status, an entry into the Acclaired Class.
posted by digaman at 7:17 AM on January 5, 2006


Consider: a new, brain-based taxonomy of mental illness -- taking the guesswork out of psychiatry!

I'd welcome that, actually. I figure I'm probably nuts, but I don't know which flavor.
posted by alumshubby at 7:22 AM on January 5, 2006


digaman: Hobba, thanks for bringing up those other uses of fMRI, which I wish I had had the space to discuss in my article.

In truth, I was lying in that same scanner at Joy's lab not so long ago myself, researching an article on the psychiatric angle for New York magazine. Scooped me, damn ya.
posted by Hobbacocka at 7:29 AM on January 5, 2006


Heh. Well, tell Adam Moss I say hi. We're old friends. :)
posted by digaman at 7:30 AM on January 5, 2006


And definitely talk to Langleben. He's got much more on the psychiatric angle than I had space for in my piece.
posted by digaman at 7:31 AM on January 5, 2006


Consider: a new, brain-based taxonomy of mental illness -- taking the guesswork out of psychiatry!

Uh, what would be bad about that?
posted by delmoi at 7:37 AM on January 5, 2006


That it won't work?
posted by digaman at 7:39 AM on January 5, 2006


Idiots with expensive toys.
posted by c13 at 7:40 AM on January 5, 2006


you're thinking of a . . . brick wall
posted by realcountrymusic at 7:43 AM on January 5, 2006


This just sparks my memory that New Scientist over the last year (unfortunately no public online archive) has been running a series of articles raising questions about the accuracy and reliability of forensic tests like fingerprints and gun shot residue. With both tests there is significant possibility of error, which the forensic investigators using the tests in court won't quantify.
posted by KirkJobSluder at 7:45 AM on January 5, 2006


Nice article, even with the Big Brother overtones. Britton Chance is a hell of a guy, huh?
posted by mbd1mbd1 at 7:53 AM on January 5, 2006


Britton Chance is a hell of a guy, huh?

He sure is. 92 years old, and when I asked him if he still goes sailing every weekend, he said, "Of course, what the hell!"
posted by digaman at 7:56 AM on January 5, 2006


While polygraph evidence has been excluded from most US courts, the device is still used by intelligence agencies like the CIA, the NSA, etc., to vet their own employees, and the polygraph is being used more than ever in places like Iraq and Guantanamo Bay to interrogate detainees and confirm the loyalties of Iraqi officers, etc. -- and reliance on it has been blamed for a number of significant intelligence failures. But, many would say, when a device might uncover data that would prevent a future terrorist attack, shouldn't it be used?

I guess, if you like relying on potentially/probably inaccurate information. Learning how to beat a polygraph is a pretty trivial task, and just about anyone can do it. Its use as an information-gathering tool is frankly insane.

The truth is that even a highly inaccurate technology for lie detection can still find thriving markets in a vast gray area outside of the US courts. A dubious lie-detection technology called voice-stress analysis -- the accuracy of which was called "dismal... consistently less than chance... you could have gotten better results by flipping a coin" by researchers funded by the Defense Department -- is also being used to interrogate detainees.

Which goes to show that the people running law enforcement agencies are often out of their goddamn minds. Remember when the CIA researched remote viewing? Yeah, thanks for wasting resources, guys.

In the lie-detection business as it has unfolded over the last century, 100 percent accuracy is not required.

As a business, no, it's not. Hell, you could probably rig up a machine to output random data and someone would buy it. As a tool of law enforcement, I'd like to see it get to at least 90% accuracy, which may never happen. And even then I don't think it should be admissible in court.

Mediareport: thanks for the Reason link. I'm still shocked that it's apparently widely-read.
posted by Optimus Chyme at 8:03 AM on January 5, 2006


As a tool of law enforcement, I'd like to see it get to at least 90% accuracy, which may never happen.

It already has with fMRI lie detection, which has shown 90 percent accuracy and more, but in lab tests, which are obviously not the real world of crime and terrorism. One of the questions I hoped to raise by writing this article was that difference.
posted by digaman at 8:07 AM on January 5, 2006


Remember when the CIA researched remote viewing? Yeah, thanks for wasting resources, guys.

Well, hundreds of hospitals and clinics are not currently using "remote viewing" to map the brains of cancer patients before surgery. fMRI has a bit more credibility, at least in that context, than remote viewing!
posted by digaman at 8:12 AM on January 5, 2006


With [forensic] tests there is significant possibility of error, which the forensic investigators using the tests in court won't quantify.

For very good reason. If you put any indication of uncertainty in your results in front of a court, the opposition lawyer has a field day with it. It's utterly bad science, but that's how legal truths (which have to be black and white) pervert scientific ones (which are always probabilistic).

And don't try to tell me otherwise: court cases are lost when a lab reports good QC and supplies an uncertainty estimate. Nowadays, labs are directed to supply prosecutors with only the raw instrument data and final results: no statistical interpretation, because that just gives the opposition wiggle room. It demeans and debases the science, but that's what gets the convictions.
posted by bonehead at 8:23 AM on January 5, 2006


"Functional imaging makes cognitive operations visible by using a powerful magnetic field to track fluctuations in blood flow to groups of neurons as they fire."

I'd like to see some conclusive proof that this is safe, before it starts being used on prisoners or job applicants. Not only that it does no "harm" according to the researchers' standards but that it causes no lasting changes at all.
posted by jam_pony at 8:26 AM on January 5, 2006


I'd like to see some conclusive proof that this is safe, before it starts being used on prisoners or job applicants. Not only that it does no "harm" according to the researchers' standards but that it causes no lasting changes at all.

Well, MRI -- which uses the same magnetic fields as fMRI (in fact, it uses the same machines) -- has been in wide use for 20 years or so. The primary danger of it is that metallic objects in the body of the patient, or in the scanning room, can be impelled toward the Humongous Magnet, which has caused several deaths. But, as far as medical procedures go, fMRI seems safer than, say, X-rays, which are routine.
posted by digaman at 8:35 AM on January 5, 2006


digaman : "But, as far as medical procedures go, fMRI seems safer than, say, X-rays, which are routine."

What about psychological changes? Have there been long-term rigorous studies on the effects of MRIs?
posted by Gyan at 8:42 AM on January 5, 2006


Gyan, there have been many studies on the effects of MRI, and they have concluded that it is a safe technology as it is used. Utrecht University researchers just concluded a study of workers who help manufacture MRI scanners, and thus are exposed to very high levels of magnetic field for much longer than patients would be, and they concluded, "This study suggests that any effects on cognitive functions are acute and transient and disappear rapidly after exposure has ended."

Does that mean that MRI is 100 percent safe? Not at all. But it seems safe within a reasonable range of expectation.
posted by digaman at 8:55 AM on January 5, 2006


I imagine this is the facial laser scanning lie-detection technique mentioned above.
posted by Rumple at 9:17 AM on January 5, 2006


digaman, I'd like to see the full paper, if possible. I'm particularly interested in what "cognitive performance" means. My concern is more about long-term personality changes rather than basic cognitive ability.
posted by Gyan at 9:24 AM on January 5, 2006


Gyan, alas, I don't have it. Researching the basic safety of MRI was not my focus, since it's so well established in the medical world. But I think your questions are important and valid.
posted by digaman at 9:30 AM on January 5, 2006


It already has with fMRI lie detection, which has shown 90 percent accuracy and more, but in lab tests, which are obviously not the real world of crime and terrorism. One of the questions I hoped to raise by writing this article was that difference.
posted by digaman at 8:07 AM PST on January 5


Yeah, that's what I'm more concerned with. I should have been more clear. Nice article, though; anything that helps get the polygraph out of the system is an improvement to me.
posted by Optimus Chyme at 9:30 AM on January 5, 2006


Rumple, yes, yes it is.
Thanks.
posted by slimepuppy at 9:36 AM on January 5, 2006


A polygraph is not used as much for accurate readings as it is for a tactic.

"We know you're lying! We have the lie detector tests right here! If you wanna make it easier on yourself, you'd better start talking NOW."

If this MRI stuff works as well as it sounds like it does, however, it could certainly change the landscape in very sinister ways.

It's like all those sci-fi shows set in the future, with zero privacy and retinal scanners and all that. People tend to forget that those futures always suck, though, and the movie ends up being all about trying to get the hell out of there...
posted by First Post at 9:45 AM on January 5, 2006


The Acclair link I posted earlier is very successful science fiction, showing how such a totalitarian future will be... bright, shiny, and sexy!
posted by digaman at 10:03 AM on January 5, 2006


Does this mean we can stop torturing now?
posted by elwoodwiles at 10:09 AM on January 5, 2006


I... am going to have to call "not quite bullshit" on this one. fMRI is constantly hailed by people who don't actually use it as a sort of super mind-reading thing.

Well, the truth is that it is an amazing, useful, but quite limited tool - limited by current technology (i.e. the resolution and speed of the imaging), limited by current knowledge (i.e. we are far from understanding even close to fully what goes on in the brain to create certain human states like deception), and limited by the nature of the thing (i.e. it measures only the intensity of activity in an area, and there is more to the mind than that).

When you hear about someone saying "these guys are mapping the neural circuits for lying/love/emotions/criminal intent/miscellaneous," that is usually a mark of exaggeration. The knowledge gained by such a study would be localized to the subjects of the study since brains differ so vastly in their "compositions." Maybe in 50 years we will have the power to extrapolate results reliably to other people from a study on a few subjects' brains but in the meantime no practice like the "ultimate lie detector" is even close to reality.
posted by BlackLeotardFront at 12:12 PM on January 5, 2006


The knowledge gained by such a study would be localized to the subjects of the study since brains differ so vastly in their "compositions."

Yes, that's why it's good to have different teams of researchers in different pleaces using different experimental tasks attempt to replicate each other's results. There has been a tremendous amount of overlap in the results of studies of fMRI and deception in terms of the brain areas activated, which is at least a promising sign that there's something to this. That's why I wrote the article. Unlike many fMRI "breakthrough" stories you'll read, this one was not inspired by a single study, but by many more studies than I even mentioned in the piece. But I appreciate your cynicism about such stories, and share it.
posted by digaman at 1:34 PM on January 5, 2006


*places
posted by digaman at 1:35 PM on January 5, 2006


Very interesting post. I ran across this concept in one of my classes last semester, and think it's just fascinating what we are discovering about the brain through various scanning techniques.

I'd heard that MRIs are relatively safe, so I was surprised to learn that there are laws in Europe which will restrict their use. The article I read suggested this was based on certain side effects experienced at high dosage and theoretical risks.
posted by moira at 2:00 PM on January 5, 2006


As someone who used to do fMRI research and now does law, I found the article interesting and better than most on the topic. However, I have to share BlackLeotardFront's skepticism about how unlikely the scenarios in the article are, at least any time soon...
as fascinating and sexy as fMRI research is, my experience with it was that it is still in its infancy, relatively speaking, and it will be a long time and thousands of experiments before it comes close to solving the problems the general public is led to believe it will solve. The reasons have been mentioned: it is a *huge* financial investment... I had to laugh at the quote predicting 20 centers in every major city in the country, fully staffed with the equipment, cognitive neuroscientists, MRI techs and interrogators... so who's paying for that, exactly? The speaker acknowledges the expense, then justifies it in some way by asking, "but what's the cost of a six month jury trial?" I can tell you, having some experience with both: a lot less than 20 fMRI centers in every major city in the country...

I also think the predictions are taking a very narrow view of the legal system. First of all, assuming the resources are in place, I find it unlikely that there is going to be a strong judicial trend toward forcing an unwilling suspected liar to have an fMRI. Barring an extra-judicial process (which is more fathomable lately but still unlikely), this limits the technology to the willing. Even assuming there was legal authority to force someone to undergo the scan, there is precious little anyone can do to prevent that person from moving their head/jaw/etc. around during the scan -- as everyone with fMRI research experience knows, absolute stillness is critical to producing reliable results.

I think what it comes down to, in my opinion, is that maybe it's not fMRI's potential utility that is overblown, it is the emphasis on the perceived importance of detecting deception. In the case of a suspected terrorist, I think that if government agents truly believe someone is a terrorist, the last thing they're going to care about doing is whisking that person off to an fMRI center to let scientists detect if he's lying or not...
in the case of a criminal defendant, it's the same idea -- fMRI results are going to be just another piece of evidence to be accorded however much weight a juror feels is appropriate (I don't think anyone is seriously suggesting that the fMRI results would replace the jury system) -- and in that case, more often than not, if a juror believes a defendant is guilty, they're going to find a way to write off the fMRI evidence. Finally, while the scientists might like to believe that fMRI results are going to be an equivalent of DNA in terms of exonerating the wrongly convicted, I think that is a really misguided way of thinking about this. The concept of lying and truth-telling as measured by brain activity is just much more nuanced than the concept of lying and truth-telling as measured by the presence or absence of DNA at a crime scene.
posted by Harvey Birdman at 4:53 PM on January 5, 2006


All excellent points, Harvey.

I do want to point out that Britton Chance's technology, described in the "Cortex Cop" sidebar, is much cheaper than fMRI. But yes. It'll be interesting to see how the two companies that are planning to offer this service play out.
posted by digaman at 5:49 PM on January 5, 2006


"Later this year, two startups will launch commercial fMRI lie-detection services, marketed initially to individuals who believe they've been unjustly charged with a crime."
This really, really bothers me. Exploiting the most desperate people in society. :(
- Marquis

Yeah. And only the wealthy ones will be able to benefit from it to back up their story. The poor ones will have to hope someone believes them.
posted by raedyn at 10:52 AM on January 25, 2006




This thread has been archived and is closed to new comments