Black Swans and The Fourth Quadrant
September 16, 2008 2:56 PM

THE FOURTH QUADRANT: A MAP OF THE LIMITS OF STATISTICS by Nassim Nicholas Taleb. "In the following Edge original essay, Taleb continues his examination of Black Swans, the highly improbable and unpredictable events that have massive impact. He claims that those who are putting society at risk are "no true statisticians", merely people using statistics either without understanding them, or in a self-serving manner."
posted by vronsky (40 comments total) 40 users marked this as a favorite
 
Once I saw the 2x2 matrix I knew he had to be right. The 2x2 matrix cannot lie.
posted by GuyZero at 3:11 PM on September 16, 2008 [2 favorites]


Obviously the answer lies somewhere in the fifth quadrant, on the third half of the 5th dimension.
posted by blue_beetle at 3:26 PM on September 16, 2008


I knew it. I knew we were in The Matrix. Where's my blue pill?
posted by wendell at 3:27 PM on September 16, 2008


Does this mean that, even though the chance the Large Hadron Collider will destroy the world is currently infinitesimal, in the long-long run, it's inevitable?
posted by wendell at 3:29 PM on September 16, 2008


The LHC has a strict no-black-swans visitors policy.
posted by GuyZero at 3:36 PM on September 16, 2008 [6 favorites]


A very interesting and enlightening article, thank you.
posted by aeschenkarnos at 4:04 PM on September 16, 2008


So much snark for a Distinguished Professor of Risk Engineering at New York University’s Polytechnic Institute.

Thanks for this article. Guyzero, look in the appendix if you want stuff you can't understand.
posted by anthill at 4:17 PM on September 16, 2008


No true statisticians, except for Nassim. And the time cube guy, of course.
posted by delmoi at 4:24 PM on September 16, 2008 [1 favorite]


Yeah, I'm not understanding all the snark. I thought it was an interesting and timely essay (if a bit over my head), and The Black Swan was a very well received book when it came out, so I was hoping some of the mefi math brains would chime in. Maybe it was my title? Oh well.
posted by vronsky at 4:28 PM on September 16, 2008


Look at this model I made showing why it is wrong to trust models.
posted by Pyry at 4:28 PM on September 16, 2008 [4 favorites]


I know some people at A Major Insurance Firm who do nothing but enterprise risk -- the kind of singular events that wipe out tens of billions of dollars and could send the largest firms out of business. People know about this and take it seriously.
posted by a robot made out of meat at 4:31 PM on September 16, 2008 [1 favorite]


This seems so obviously true on one level; I look forward to reading his book. Thanks for the link.
posted by one_bean at 4:32 PM on September 16, 2008


Guyzero, look in the appendix if you want stuff you can't understand.

I'm not sure if that's an insult or merely cryptic but I'll play it safe and assume it's an insult. In retort, let me state that what I do not understand could fill a very large book, say, with several pages for each proton in the universe.

My comment was less snark about the author - the article is indeed fine - and more a comment about the vastly-overused 2x2 matrix. I tend to associate them with poorly thought-out industry analysts and consultants and their ilk who pass off their bland observations as insight. Let me state for the record that the linked article hardly falls into the bland observation category.
posted by GuyZero at 4:46 PM on September 16, 2008


The Black Swan was one of the most interesting and challenging books I read this year.
posted by wobh at 4:59 PM on September 16, 2008


I'm not going to pretend to understand it.

But I am going to pretend I wrote it.
posted by Lacking Subtlety at 5:01 PM on September 16, 2008




I've got this in the "broken swan is right twice a year" category. If you predict a lot of catastrophic failures, some of them will come true. That doesn't make you good at predicting catastrophic failures.
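
Here's a toy simulation of the broken-clock effect (Python, numbers invented - nothing to do with Taleb's actual math):

    import numpy as np

    rng = np.random.default_rng(0)
    n_pundits, n_years = 1000, 20
    crash_years = np.zeros(n_years, dtype=bool)
    crash_years[[4, 13]] = True                     # two real crashes, years picked arbitrarily
    # each pundit flips a coin every year: "crash" or "no crash"
    predictions = rng.random((n_pundits, n_years)) < 0.5
    hits = (predictions & crash_years).sum(axis=1)  # real crashes each pundit "called"
    print((hits == 2).mean())                       # ~25% of coin-flippers "called" both crashes

With a thousand pundits making doomsday calls, hundreds will look prescient by chance alone - we just never tally their false alarms.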
posted by grobstein at 5:19 PM on September 16, 2008 [1 favorite]


So much snark for a Distinguished Professor of Risk Engineering at New York University’s Polytechnic Institute.

Perhaps, but it's not very clearly written - or at least it's written for a finance crowd, with bombastic verbiage. Much of it seems to boil down to this: at the edge of a statistical distribution (the "tail"), it is not only possible to get false positives, but accepting those false positives as true can lead to very bad effects (including "black swans").

It's not enough just to say your statistical tests are significant, but I'm not sure how groundbreaking that idea really is, to be honest. I'm fairly sure most statisticians understand that idea.
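
To make the false-positive point concrete, a throwaway sketch (Python/SciPy, toy numbers of my own - not his math): run a standard significance test on pure noise over and over, and a steady 5% of the results come back "significant".

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    n_tests = 1000
    false_positives = 0
    for _ in range(n_tests):
        a = rng.normal(size=30)        # both samples from the SAME distribution
        b = rng.normal(size=30)
        _, p = stats.ttest_ind(a, b)
        false_positives += p < 0.05
    print(false_positives / n_tests)   # ~0.05: "significant" findings from nothing at all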
posted by Blazecock Pileon at 5:29 PM on September 16, 2008


My black swan's broken window has a tipping point that blinks correctly twice a day.
posted by GuyZero at 5:30 PM on September 16, 2008 [6 favorites]


“merely people using statistics either without understanding them, or in a self-serving manner.”

This is half bullshit. 90% of smart people know that. Right, all you smart folks?

“Yet the system was getting riskier and riskier as we were turkey-style sitting on more and more barrels of dynamite...”

Er, meleagris and trinitrotoluene metaphors aside (would anyone put dynamite in a barrel? If so, why? I mean, you could do that with ANFO and cast boosters, but... I digress), this:

“If small probability events carry large impacts, and (at the same time) these small probability events are more difficult to compute from past data itself, then: our empirical knowledge about the potential contribution—or role—of rare events (probability * consequence) is inversely proportional to their impact.”

is interesting. It seems almost Gödelian, in that there's an inherent unpredictability in any given model.
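To see that inverse relation in action, here's a toy sketch I knocked together (Python; the probabilities and sample size are made up): estimate how often an event happens from a fixed amount of history, and watch the estimate degrade as the event gets rarer.

    import numpy as np

    rng = np.random.default_rng(2)
    n = 10_000                                  # observations of history available
    for p in (0.1, 0.01, 0.001, 0.0001):
        estimates = rng.binomial(n, p, size=2000) / n
        rel_err = np.abs(estimates - p).mean() / p
        print(f"p={p}: typical relative error ~ {rel_err:.0%}")
    # the rarer (and usually higher-impact) the event, the worse our estimate of it
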
I'd agree with the confirmation-bias point, but I'd add - in the vein of his positive, "how to" advice - that humans are very imitative learners.
So that bit of it can be a hazard beyond just valuing the "I don't know" realm.
(And boy, do people hate to say "I don't know".)

Speaking of which - most of this is over my head, especially some of the heavier details. But I’ve worked with this kind of thinking, and I’ve always loved statistics, so I can get the basic gist. (Or y’know, completely blow the basic gist, but I still like this stuff).
posted by Smedleyman at 5:40 PM on September 16, 2008


GuyZero, no insult intended - at first, the article gave me the same impression of pop-sci, metaphor-drizzling, but by the end I was nodding, and the appendix was proof that I probably missed a bunch of smart stuff that went over my head.

Maybe my enjoyment was a bit biased: I study complex systems with human components, and a lot of the work that we do goes into preventing very rare, catastrophic events through adding redundancy (and cost). His ranting about the eternal temptation to let statistical models cover your ass and optimize, optimize, optimize struck a chord with me.

Besides, I love that word kurtosis!
posted by anthill at 5:53 PM on September 16, 2008


Grobstein: Ironically, I think he would completely agree.

Who is anyone to predict anything in the 4th quadrant?
posted by anthill at 5:59 PM on September 16, 2008


Rereading my post, I can see how people thought this was a time cube guy. My bad. I first heard about him when Charlie Rose did an hour with him last year, after the success of The Black Swan. Here is his bio - "Taleb is currently a researcher at London Business School. He is the Dean's Professor in the Sciences of Uncertainty at the University of Massachusetts at Amherst, Fellow in Mathematics in Finance and Adjunct Professor of Mathematics at the Courant Institute of Mathematical Sciences of New York University (since 1999), research fellow at the Wharton School Financial Institutions Center, and Chairman of Empirica LLC.

Taleb held senior trading positions with trading houses in New York and London and operated as a floor trader before founding Empirica LLC. His degrees include an MBA from the Wharton School and a Ph.D. from the University of Paris. He is the author of Dynamic Hedging, Fooled by Randomness, and The Black Swan."
posted by vronsky at 6:10 PM on September 16, 2008


His book The Black Swan is very good, and I really like his ideas about risk and the inability to model exponential distributions using normal-distribution statistics. If the distribution is non-normal, then normal statistics simply don't apply.
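For example (a quick Python sketch with arbitrary parameters): fit a normal distribution to fat-tailed data and it will tell you the extremes you are sitting on basically never happen.

    import numpy as np

    rng = np.random.default_rng(3)
    data = rng.pareto(3.0, size=100_000) + 1    # fat-tailed sample (Pareto, alpha = 3)
    mu, sigma = data.mean(), data.std()
    observed = (data > mu + 5 * sigma).mean()
    print(f"frequency beyond 5 sigma: {observed:.3%}")
    # the sample puts ~0.5% of its mass out there; a fitted normal predicts ~0.00003%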
posted by Rubbstone at 6:11 PM on September 16, 2008


Taleb is a very smart man. Any snark is best withheld from his ideas and spent on the way he presents them, as he does come across as kind of an asshole. Thanks for this, I probably would have missed it otherwise.
posted by adamdschneider at 6:14 PM on September 16, 2008 [1 favorite]


I study complex systems with human components, and a lot of the work that we do goes into preventing very rare, catastrophic events through adding redundancy (and cost)

If you have not already read it, you will enjoy The Human Factor by Kim Vicente. It is not about financial-market failures but engineering and systems failures, which is a bit more tractable, to me at least - although I will grant that Taleb's black swans are a heck of a lot bigger than the occasional plane crash.
posted by GuyZero at 6:18 PM on September 16, 2008 [1 favorite]


Wait, you're in Toronto... I think this may have come full circle. If you're one of Vicente's grad students you can skip reading the book. :)
posted by GuyZero at 6:19 PM on September 16, 2008 [1 favorite]


Taleb gets a bad rap because he's arrogant, but what he's saying is fundamentally correct:

(1) Risks and liabilities cannot be quantified with anything like the precision the models claim, and
(2) Finance/economics is much more complex than we realize. We are so far from understanding it within a rigorous scientific framework that, like pre-modern medicine, we run the risk of doing more harm than good in our insistence on doing something rather than nothing.

The problem is assuming that the mathematics of other disciplines will necessarily hold up. This seems to be countered by the fact that the mathematics gets ever more complex and fails ever more spectacularly. Instead of pointing out that it will fail, I'm curious as to why it works as long as it does.

If stochastic calculus, obscure probability distributions, and fancy footwork are so wrong when applied to disciplines where they were never meant to be applied, why do they do such a good job of fooling us? Is there some sort of deeper epistemological truth about mathematical logic? I would say yes, there is, and that's where the real fascination lies. These aren't just nerds pounding away at a black box, spewing out equations they learned at MIT. These equations work, and do so spectacularly. In fact, there's no reason these models wouldn't continue to work, and no reason they failed exactly when they did; I have a feeling there are a whole lot of models out there that were simply never falsified. And if you can never falsify everything, then how are you sure about anything? If you only have one coin, how do you know it is not magic when it lands heads up every time you flip it? There are a lot of situations where you only have one coin and you can't apply rigorous testing to it. Can we ever be certain of anything about it, beyond its aesthetic properties?
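
(A Bayesian would at least put a number on the coin question - toy sketch, with a prior I pulled out of thin air:)

    # P(trick coin | k heads in a row), assuming a 1-in-1000 prior
    # that the coin is two-headed -- all numbers invented
    prior = 0.001
    for k in (5, 10, 20):
        p_heads_if_fair = 0.5 ** k
        posterior = prior / (prior + (1 - prior) * p_heads_if_fair)
        print(f"{k} heads: P(trick coin) = {posterior:.3f}")
    # 5 heads: ~0.03; 10 heads: ~0.51; 20 heads: ~0.999

But the prior is exactly what you don't have for a one-of-a-kind coin, which I suspect is Taleb's point.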

I'm tired and that may be just bullshit, but does anyone believe that when the next round of wunderkinds comes around and can do spectacular magic with numbers that actually works, Wall Street will tell them no and quote that the greatest thing they know is that they know nothing? No, they'll take the money. And to everyone it will look like this time is different - until we find out it is not, and we slap our heads and go "Oh, Taleb! You're right, it is so obvious looking back." Greed will always trump skepticism, if only because it gets you laid.
posted by geoff. at 6:23 PM on September 16, 2008 [4 favorites]


HEY CAN WE PANIC YET
posted by spiderwire at 6:26 PM on September 16, 2008


I adored all of this that I was able to comprehend.
posted by Navelgazer at 7:13 PM on September 16, 2008


this is excellent, thank you so much for posting about it.
posted by noway at 7:27 PM on September 16, 2008


This is an excellent article; MetaFilter's snark is definitely scoring a false positive here. I suggest you actually RTFA before trying for favourites.
posted by mek at 7:27 PM on September 16, 2008 [1 favorite]


I've done a bunch of backtesting of automatic trading strategies and they all bear out one of his key points, which is that highly optimized strategies fail spectacularly on out-of-sample data. A black swan event is by its very nature out-of-sample. Whereas a mediocre, unoptimized strategy which churns along producing middling returns across the board tends to recover robustly from the black swan.
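
Here's a stripped-down version of what I mean (Python; the "market" here is literally random noise, so any in-sample edge is pure overfitting):

    import numpy as np

    rng = np.random.default_rng(4)
    returns = rng.normal(0, 0.01, size=2000)          # daily returns: pure noise
    in_sample, out_sample = returns[:1000], returns[1000:]

    def strategy_pnl(rets, lookback):
        # long the next day whenever the trailing mean return is positive
        signal = np.array([rets[max(0, i - lookback):i].mean() > 0
                           for i in range(1, len(rets))])
        return rets[1:][signal].sum()

    # pick whichever lookback happened to do best on the past...
    best = max(range(2, 100), key=lambda lb: strategy_pnl(in_sample, lb))
    print("in-sample pnl :", strategy_pnl(in_sample, best))   # looks great
    print("out-sample pnl:", strategy_pnl(out_sample, best))  # edge evaporates

The optimizer will always find something that "worked", and it almost never survives contact with new data.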
posted by unSane at 7:33 PM on September 16, 2008


The concept is pretty simple: if you don't know where the edge is, and you decide to dance next to it, you're going to fall off... and most models without the right kind and quantity of inputs don't have enough accuracy to tell you where the edge is. A whole bunch of math ensues, but it's just proving the point about models with too many assumptions and not enough data, and the magnitude of smackdown it can lay on you as it entices you closer and closer to where it thinks the edge is.

I like the part about the call for papers on "Operating in a low-predictability environment." Most of the submissions were from cranks who thought they had a better way of generating models, missing the point entirely. The right answer would be a mathematical study of military science: dig in, buckle down, go after the sure things with vigor, but be ready to back out at the first sign of trouble - and always expect trouble. Financial investors could do with fewer quants and more people who've read and understood Clausewitz... or better yet, quants who've read and understood Clausewitz and this guy.

In the end, tho, I wonder if we're looking at the wrong models? Perhaps the model to blame is this one: get away with as much as you can for as long as you can and retire rich, laughing at those who must clean up after you.
posted by Slap*Happy at 7:36 PM on September 16, 2008 [2 favorites]


get away with as much as you can for as long as you can and retire rich, laughing at those who must clean up after you.

Geez. Now we've got a fifth "Bush Doctrine" for Krauthammer's list.

Seriously though, it seems as though I'm not the only one here who loved the article but is a little shaky on understanding all of the distinctions. Is there anyone here with a stronger background who wouldn't mind giving some clearer examples of M0 vs. M1+ and M2+, or Mediocristan vs. Extremistan? Or what, for instance, might be considered in the "quite robust" third quadrant of M0/Extremistan?
posted by Navelgazer at 7:54 PM on September 16, 2008


I feel Juvenal deserves a mention for originating the black swan metaphor ("rara avis in terris nigroque simillima cycno" - a bird as rare upon the earth as a black swan).
posted by Phanx at 1:44 AM on September 17, 2008 [1 favorite]


Taleb, looking at the cataclysmic situation facing financial institutions today, points out that "the banking system, betting against Black Swans, has lost over 1 Trillion dollars (so far), more than was ever made in the history of banking".

Taleb's trademark use of Black Swan is tiresome.

The banking system was not betting against improbable, unpredictable events - c'mon man - it was betting against the house with the full knowledge that the house would loan it however much cash it needed to keep on betting. The house encouraged bad behaviour on the part of its gamblers and then underwrote - or relieved them of - their losses.

This is not a "black swan" event - we've seen these swans of color before. The house burned itself in what is an entirely probable, if not predictable outcome. Even the timing - so close to the election - must have been close to the statistical mean.
posted by three blind mice at 3:01 AM on September 17, 2008


Excellent piece -- but it must be said, his actual prose is extremely rough. He needs a good editor to give it a once-over. I think it would improve the piece's comprehensibility exactly 53%.
posted by luckywanderboy at 3:29 AM on September 17, 2008


A classic example of the "black swan" scenario and hubristic "turkey" reasoning is provided by the Galveston Hurricane of 1900: the chief of the weather bureau extrapolated from known data and concluded that the city was safe, and that the very notion that it might not be was "an absurd delusion". See Isaac's Storm (Google vid).
posted by dinsdale at 7:41 AM on September 17, 2008


This essay is in line with others I've seen from edge.org: brilliant, thought-provoking, and desperately in need of a few rounds of editing to become more readable.

There are absolutely times when the correct interpretation of a rigorous statistical analysis is "these data give me no information about the hypothesis I was hoping to test." But that's a hard statement to sell to your boss.
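
For instance (a quick sketch in Python/SciPy, with numbers I made up): a small study of a small-but-real effect will come back "not significant" the vast majority of the time, so a null result there tells you almost nothing.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(5)
    trials, detections = 2000, 0
    for _ in range(trials):
        control = rng.normal(0.0, 1.0, size=10)
        treated = rng.normal(0.3, 1.0, size=10)   # a genuine 0.3-sigma effect
        _, p = stats.ttest_ind(control, treated)
        detections += p < 0.05
    print(f"power ~ {detections / trials:.0%}")   # roughly 10%: the data are nearly silent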
posted by fantabulous timewaster at 10:52 AM on September 18, 2008



