

An itinerant scholar for Bayes’ rule
December 20, 2013 4:04 PM

Dennis Lindley, one of the most influential statisticians of the 20th century, passed away on December 14 at age 90. Lindley was a strong advocate for Bayesian statistics long before it was widely popular. What is Bayesian statistics, and why was Dennis Lindley important?

Bayesian statistics is a branch of statistics in which beliefs are treated according to the laws of probability. Bayes' theorem describes how to change your mind in the face of uncertainty. In the mid-twentieth century, Bayesian statistics was rejected in favor of frequentist inference, due to its reliance on "subjective" prior information.
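Bayes' theorem itself is just arithmetic. As a minimal sketch with made-up numbers, here is the classic diagnostic-test update: a positive result from a fairly accurate test for a rare condition still leaves the posterior probability surprisingly low, because the prior is so small.

```python
# Bayes' theorem as belief updating: P(H | E) = P(E | H) * P(H) / P(E).
# All numbers below are hypothetical, chosen only to illustrate the update.
prior = 0.01            # P(disease): the condition is rare
sensitivity = 0.95      # P(positive | disease)
false_pos = 0.05        # P(positive | no disease)

# Total probability of seeing a positive test, from either hypothesis.
evidence = sensitivity * prior + false_pos * (1 - prior)

# Posterior belief in the disease after observing a positive test.
posterior = sensitivity * prior / evidence
print(f"P(disease | positive) = {posterior:.3f}")  # ~0.16, despite the 95% sensitivity
```

The prior pulls hard against the evidence here, which is exactly the "subjectivity" mid-century critics objected to, and exactly the behavior Bayesians defend as correct.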

In her book "The Theory That Would Not Die", author Sharon Bertsch McGrayne describes Lindley's role this way:
In an era when many sneered at Bayes, it took courage to create Europe’s leading Bayesian department [at Cambridge]. Often the only Bayesian at meetings of the Royal Statistical Society and certainly the only combative one, Lindley defended Bayes’ rule like a fearless terrier or a devil’s advocate. In return, he was tolerated almost as comic relief. "Bayesian statistics is not a branch of statistics," he argued. "It is a way of looking at the whole of statistics."

Lindley became known as a modern-age revolutionary. He fought to get Bayesians appointed, professorship by professorship, until the United Kingdom had a core of ten Bayesian departments. Eventually, Britain became more sympathetic to the method than the United States, where Neyman maintained Berkeley as an anti-Bayesian bunker. Still, the process left scars: despite Lindley’s landmark contributions he was never named a Fellow of the Royal Society. In 1977, at the age of 54, Lindley forsook the administrative chores he hated and retired early. He celebrated his freedom by growing a beard and becoming what he called “an itinerant scholar” for Bayes’ rule.

Thanks to Lindley in Britain and Savage in the United States, Bayesian theory came of age in the 1960s. The philosophical rationale for using Bayesian methods had been largely settled. It was becoming the only mathematics of uncertainty with an explicit, powerful, and secure foundation in logic.
([NYT review] - [Talk by the author about Bayesian statistics])

In the 1980s, Bayesian statistics enjoyed a revolution due in large part to the power of computers [PDF]. Use of Bayesian statistics has increased tremendously since then, and now is an accepted (perhaps even the dominant) paradigm in statistics. Nate Silver's work is based on Bayesian principles. Lindley's decades of tireless advocacy have been vindicated.

Perhaps Lindley's most famous contribution to statistics was his 1957 paper "A Statistical Paradox" [PDF] [JSTOR - Wikipedia], in which he shows a key difference between Bayesian and classical inference. He also wrote more recently on his philosophy of statistics [PDF] [JSTOR].
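The paradox is easy to reproduce numerically. Below is a minimal sketch using the birth-ratio figures commonly quoted in accounts of the paper (the exact counts are taken from secondary sources and should be treated as illustrative): with a very large sample, a classical test rejects the null hypothesis at the 5% level, while a Bayesian analysis assigns that same null a posterior probability of about 0.95.

```python
# A numeric sketch of Lindley's paradox. H0: a birth is a boy with
# probability exactly 1/2; H1: the probability is unknown.
import math

n, x = 98451, 49581          # births and boys (figures from secondary accounts)
theta0 = 0.5                 # the sharp null hypothesis

# Frequentist side: two-sided p-value via the normal approximation.
sd = math.sqrt(n * theta0 * (1 - theta0))
z = (x - n * theta0) / sd
p_value = math.erfc(abs(z) / math.sqrt(2))

# Bayesian side: P(H0) = 1/2, and under H1 theta ~ Uniform(0, 1).
# The marginal likelihood under H1 integrates to 1/(n+1); under H0 it is
# C(n, x) * theta0^n.  Work in logs to avoid overflow.
log_choose = math.lgamma(n + 1) - math.lgamma(x + 1) - math.lgamma(n - x + 1)
log_m0 = log_choose + n * math.log(theta0)
log_m1 = -math.log(n + 1)
bf01 = math.exp(log_m0 - log_m1)        # Bayes factor in favour of H0
posterior_h0 = bf01 / (1 + bf01)

print(f"p-value      = {p_value:.4f}")       # ~0.02: "significant" at the 5% level
print(f"P(H0 | data) = {posterior_h0:.2f}")  # ~0.95: H0 strongly favoured
```

The two answers disagree because the p-value conditions on the sharp null alone, while the Bayes factor compares the null against the diffuse alternative, which spends its prior mass on values of theta the data rule out.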

Statistician Tony O'Hagan interviewed Lindley [with text] earlier this year for the Royal Statistical Society's Bayes 250 conference, celebrating the 250th anniversary of Bayes' "An Essay towards Solving a Problem in the Doctrine of Chances" [PDF].

Statisticians Andrew Gelman and Christian Robert pay their respects.

[previously]
posted by Philosopher Dirtbike (13 comments total) 53 users marked this as a favorite

 
.
posted by un petit cadeau at 4:12 PM on December 20, 2013 [1 favorite]


.
posted by Proofs and Refutations at 4:18 PM on December 20, 2013 [1 favorite]


. ±3σ
posted by ZenMasterThis at 4:18 PM on December 20, 2013 [14 favorites]


.
posted by Tell Me No Lies at 4:27 PM on December 20, 2013 [1 favorite]


.
posted by humanfont at 4:51 PM on December 20, 2013


. ±3σ

discussion involving Lindley here
posted by Philosopher Dirtbike at 5:02 PM on December 20, 2013


A fitting tribute.
posted by just_ducky at 5:10 PM on December 20, 2013 [3 favorites]


.
posted by Jonathan Livengood at 5:20 PM on December 20, 2013


π(.|x) ∝ P(x|.) π(.)
posted by mixing at 5:27 PM on December 20, 2013 [6 favorites]


Bayes touches so many parts of modern computing, especially artificial intelligence. I can write Bayes' theorem in my sleep.

A little more than four decades after "A statistical paradox" and two decades after Mr. Lindley's retirement, I built my first Bayesian Network and used it to train my computer to play board games. It took a while to generate a sufficient amount of training data, but it was like magic when it would beat human competitors.

Thank you, Mr. Lindley, for making sure that this valuable tool was at our fingertips when we finally had the capacity to make it do amazing things.
posted by Alison at 9:05 PM on December 20, 2013 [2 favorites]


B(.)⁻¹ ∏ xᵢ^(.ᵢ−1)
posted by en forme de poire at 9:59 PM on December 20, 2013 [2 favorites]


In the mid-twentieth century, Bayesian statistics was rejected in favor of frequentist inference, due to its reliance on "subjective" prior information.

Not by everyone. Certainly not by Jeffreys who in 1939 derived the unique non-informative prior: the Jeffreys prior.

According to Jaynes, the "Bayesian revolution" in statistics was all but over by 1976. (He also provides an interesting history on pages 7–12.)
posted by esprit de l'escalier at 8:04 AM on December 21, 2013


.
posted by lunasol at 3:16 PM on December 21, 2013




This thread has been archived and is closed to new comments