Cheat Sheet for the Brain
January 24, 2017 1:58 PM   Subscribe

The Cognitive Bias Cheat Sheet. Derived from the wiki article, the cheat sheet consolidates and classifies 175 biases into four groups. "Every cognitive bias is there for a reason — primarily to save our brains time or energy. If you look at them by the problem they’re trying to solve, it becomes a lot easier to understand why they exist, how they’re useful, and the trade-offs (and resulting mental errors) that they introduce. [The] four problems that biases help us address: Information overload, lack of meaning, the need to act fast, and how to know what needs to be remembered for later."

The result is presented as a beautiful infographic available as a poster here.

Bonus: Cass Sunstein, co-author of Nudge, gives his latest talk on Nudges (and healthcare) at the recent Canada 2020 summit.
posted by storybored (20 comments total) 99 users marked this as a favorite
 
Thank you. I have been seeking something like this. I really like his taxonomic approach.
posted by Thella at 2:27 PM on January 24, 2017


Funny, I kinda know the guy behind this and saw he posted it a few weeks ago.
posted by black8 at 2:35 PM on January 24, 2017


Awesome!! I gave several internet research loving teens I know this "logical fallacies" poster for Christmas. Looks like the perfect follow up.
posted by chapps at 2:38 PM on January 24, 2017 [3 favorites]


one of my cognitive biases is that i have irrational hatred of textual information presented radially so i won't be buying that poster
posted by murphy slaw at 2:40 PM on January 24, 2017 [7 favorites]


Warning: May actually make you fitter, happier, and more productive.
posted by OverlappingElvis at 2:49 PM on January 24, 2017




there should be a better name than "cognitive bias", which sounds pejorative
posted by thelonius at 3:18 PM on January 24, 2017


This was interesting. But I got derailed by Subadditivity because I couldn't quite figure out how to pronounce it and I got a little flustered trying to work it out ... and then my brain went - ohhhhhh, just say the word - Sub Sub Sub -and without warning the beat of Sussudio was pounding in my brain and now I have a headache.
o.O
posted by pjsky at 3:49 PM on January 24, 2017 [3 favorites]


Some people will call cognitive biases heuristics, which is less pejorative and captures the fact that there's a reason for their existence.
posted by lookoutbelow at 3:53 PM on January 24, 2017 [7 favorites]


Pony request: color code or tag each post by the cognitive bias category that it falls under.
posted by sammyo at 4:03 PM on January 24, 2017


Well we can't expect posters to correctly categorize cognitive categories consistently, now can we? So perhaps a project to use ML?
posted by sammyo at 4:09 PM on January 24, 2017


Can conscientious code cleanly categorize cognitive conundrums?
posted by sammyo at 4:13 PM on January 24, 2017 [2 favorites]
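sammyo's pony request could, in principle, start much simpler than ML: a toy keyword matcher that tags a comment with one of the cheat sheet's four problem categories. This is only an illustrative sketch — the keyword lists and category labels below are hypothetical stand-ins, not the cheat sheet's actual taxonomy:

```python
# Toy sketch of tagging a comment by cognitive-bias category.
# Keyword lists are illustrative guesses, not a real taxonomy.
CATEGORIES = {
    "information overload": ["notice", "primed", "repeated", "attention"],
    "lack of meaning": ["pattern", "story", "stereotype", "project"],
    "need to act fast": ["overconfidence", "simple", "certain", "now"],
    "what to remember": ["memory", "recall", "generalize", "discard"],
}

def tag_comment(text: str) -> str:
    """Return the category whose keywords appear most often in the text."""
    words = text.lower().split()
    scores = {
        cat: sum(words.count(kw) for kw in kws)
        for cat, kws in CATEGORIES.items()
    }
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "uncategorized"
```

A real version would replace the hand-picked keywords with a trained text classifier, but even this sketch shows why consistent tagging is hard: most comments match several categories, or none.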


Our brains encounter situations where a response must be made with limited time and incomplete information. By necessity, they evolved shortcuts or heuristics that worked well enough to keep them from getting killed. However, there are plenty of situations (especially as we have more complexity in modern life) in which these shortcuts do not work well.

The same heuristic that is the villain in one situation will be the hero in another situation.

These are not mental weaknesses or biases. Any cognitive system with limited resources relative to the information processing demands will have to use shortcuts.
posted by neutralmojo at 4:50 PM on January 24, 2017 [6 favorites]


Love this post. Thank you, thank you, author Benson and storybored! I've tried to read and wrap my head around the information in the Wikipedia article several times, and it just. won't. sink. in.

Now I'm off to print out the cheat sheet, wrap it around my brain, and fasten it with a rubber band.
posted by BlueHorse at 5:14 PM on January 24, 2017 [1 favorite]


one of my cognitive biases is that i have irrational hatred of textual information presented radially

funny, mine is against software engineers with thin-to-no backgrounds in psychology and philosophy who believe they can come to a solid understanding of human cognition by reading (sorry, "referencing") and organizing Wikipedia articles
posted by RogerB at 7:18 PM on January 24, 2017 [2 favorites]


It's interesting to think about what life would be like without each of these "biases." Almost all of them seem not only generally useful, but almost logically necessary for any intelligent entity operating in the real world.

Category 1 basically says we tend to weight more heavily the things we already know than the things we don't, or notice contrasts; 2 says we find patterns with sparse data or (as in 1) project what we know onto the world; 3 says we prefer simplicity and reduce complex probability distributions to more certain and simple distributions; and 4 says we simplify (as in 3) the data when digesting and remembering it.

All of these seem like totally reasonable and useful heuristics, useful even to the point of being more often right than wrong and to the point where I doubt whether one could generally characterize the conditions under which they are not applicable. Yeah, in some cases heuristics, as with everything, go wrong, but most of these seem like idiosyncratic cases carefully designed to go wrong, like optical illusions, rather than general flaws in reasoning. Yes, some people are (a lot) more wrong than others, but I doubt it is due to specific rampant biases so much as a general tendency to not error check via multiple empirical pathways (eg).

And it matters because these sorts of models of thought and reasoning suggest a very different approach to being more right than traditional education and critical thinking: instead, we have these dozens or hundreds of "hardwired" errors that each need to be patrolled and corrected through some sort of self-help-run-amok practice of memorization and self-policing. Kahneman and Tversky were great for bringing traditional psychological empirics into the study of microeconomics and rationality, but the entire program seems to have metastasized into the worst of psychology, with its endless series of amusing quirks and errors that portray the human brain as a congeries of preprogrammed modules rather than a more general-purpose heuristic thinking device with the inevitable and useful tendencies to simplify and project that are not just necessary for acting fast in a dangerous world, but in fact for doing anything useful at all.

To put it another way, which of these "biases" couldn't just as well describe Feynman at his most clever, or Google's Go-winning algorithms? In fact, even calling them "heuristics" seems to miss the point, since it implies some alternative, a machine learning algorithm that somehow grid searches the entire probability space that it somehow already knows. The implied alternative to "bias" seems like science fiction in the worst sense (speaking as an SF fan): a kind of magic computation that's not just practically impossible, but logically impossible. Weight all possible evidence equally with no regard for what you believe to already be right; consider all the data equally and never find false patterns; remember everything without regard to importance or what you already know; favor complexity over simplicity and keep indefinitely complex probability distributions always in mind; etc. These are the sorts of things I could imagine a (bad) SF writer saying a Vulcan always does, but the reason it's bad SF is not that we would never want to be such a person, but because it is logically impossible to create such an entity.

Again, that isn't to say we are homo economicus, perfectly reasoning things, or that there are no "hardwired" modules (whatever that means). And there's certainly something to be gained by consciously thinking about whether you have made type 1 or type 2 errors, or have over- or under-simplified, or have overlooked something, etc. But in the bigger picture, these supposed "biases" are mostly just the necessary conditions of necessary thinking practices, and trying to categorize and patch them up in this way is a bit like trying to build a better styrofoam boat by filling in all the holes in the styrofoam.
posted by chortly at 7:25 PM on January 24, 2017 [2 favorites]


...or that there are no "hardwired" modules (whatever that means)

Yeah, "hardwired" has become kind of a tell for me - the person awarding themselves knowledge of "hardwired" mental processes, in a speech or article, often has next to no actual credentials in neurology or other relevant cognitive sciences. Marketers and Ted talk types love "hardwired". It drizzles a pseudo-scientific glaze over any agenda, and simultaneously appeals to a lazy metaphor of the mind as being like a digital computer. Basically only people who have spent a hell of a lot of time in actual lab coats are allowed to tell me what is "hardwired" in human behaviour and the mind.
posted by thelonius at 8:38 PM on January 24, 2017 [3 favorites]


Kahneman and Tversky were great for bringing traditional psychological empirics into the study of microeconomics and rationality, but the entire program seems to have metastasized into the worst of psychology

It's doubly bad since this cheat sheet a) likely oversimplifies the conceptual relationships, issues, and taxonomies, and b) definitely is not peer-reviewed work. It's pop psychology versus psychology the body of scientific knowledge, or whatever.

We notice things that are already primed in memory or repeated often. This is the simple rule that our brains are more likely to notice things that are related to stuff that’s recently been loaded in memory.

This for example is so vague, and not even correct. If something is repeated often, you get desensitized. Human memory has no such "load" operation. "Priming" is a technical term in psychology and has a specific meaning. My confusion only increased as I tried to make heads or tails of it.

I really like the idea of doing something with a huge list of cognitive biases, but I feel like this delivery increases misunderstanding of the subject. It's a great ad, though.
posted by polymodus at 4:24 AM on January 25, 2017 [3 favorites]


I think cognitive biases are not nearly as helpful as they are made out to be. Sure, the brain needs ways to deal with information and ambiguity as quickly and efficiently as possible, but on the other hand, it becomes an easy way out, an excuse, a bad habit. It's not like much encouragement is needed for lazy thinking. Looking through the list of cognitive biases, it's hard not to see how many of them played/are playing a leading role in the US election debacle and the current insane political and media climate.

The Mind Hacks book by Tom Stafford does a much better job of showing actual ways the brain is "hardwired" for making best guesses and using (generally) optimum methods to both acquire and understand information and then present it as a seamless whole.
posted by blue shadows at 12:17 AM on January 26, 2017


Interesting! I always felt like this is how I learned... I can never get something without truly understanding it. But when I do understand something, it's in a "holistic" kind of way, which usually means I will get a huge level up no matter what it is. Same thing happened when I learned how to do street dance styles - I was overwhelmed by everything to know, and slowly became a very competitive dancer who understands it on a deep level.
posted by xplosiv at 7:16 PM on January 30, 2017




This thread has been archived and is closed to new comments