You Can’t Always Trust Your Own Thoughts, And This Terrifying Chart Shows Why
Here’s the newest guide for learning about cognitive biases and debugging your brain.
The human brain has been sculpted by millions of years of evolution, so you'd think it would be fairly well debugged by now.
Yet it remains riddled with mental glitches.
Just take a look at this terrifying atlas of all the mental traps that plague our everyday thinking.
Buster Benson, a product manager at Slack, drew on the massive body of information compiled by thousands of Wikipedians over the years to create an easy-to-navigate list of cognitive biases. His post on Medium got a lot of attention from readers, one of whom was John Manoogian, an internet entrepreneur and engineer. Manoogian, struck by the sheer amount of information in the list, decided to turn it into a visual map.
“You look at this overwhelming array of cognitive biases and distortions, and realize how there are so many things that come between us and objective reality,” Manoogian told The Huffington Post. “One of the most overwhelming things to me that came out of this project is humility.”
Manoogian has set up a print-on-demand system where people can order a poster version of the atlas. Since the poster's debut two weeks ago, the response has been great, he said.
“We’ve gotten orders from Australia, Israel, all over Europe. A few people asked for translated versions,” Manoogian said. “And a lot of people who are ordering the poster are professors and instructors working in psychology and cognitive science fields.”
Let's refresh our memories about what cognitive biases are. Humans tend to think of themselves as rational thinkers and sound decision-makers, but the evidence often shows otherwise. Our brain is programmed to take shortcuts, and as a result it produces systematic patterns of illogical thinking and behavior. These flaws are called cognitive biases, and as multiple studies have shown, they can override even well-established rules and lead to disasters ― from the loss of lives on Mount Everest expeditions to the global financial crisis of 2008.
Some of these biases show up constantly in everyday life. The bandwagon effect, for example, makes people adhere to an idea or vote for someone merely because many others already support it. The bias blind spot causes us to see logical flaws in others' thinking more readily than in our own. Confirmation bias makes us more attuned to evidence that supports our existing views, a problem plaguing many political and social debates.
And there are at least 175 more types of bugs in the ways we think. The ostrich effect may compel you to ignore this concerning piece of information by “burying” your head in the sand like an ostrich and closing this page.
For those who are still here, though, the good news is that Benson has done us all a huge favor by sorting these biases into 20 buckets, making them far easier to find and study depending on the kind of problem you're dealing with.
“I’ve taken some time over the last four weeks (I’m on paternity leave) to try to more deeply absorb and understand this list, and to try to come up with a simpler, clearer organizing structure to hang these biases off of,” Benson wrote in his blog post. “Reading deeply about various biases has given my brain something to chew on while I bounce little Louie to sleep.”
After breaking down the biases into buckets based on their similarities, Benson decided to categorize them on an even higher level.
“I made several different attempts … and eventually landed on grouping them by the general mental problem that they were attempting to address,” Benson wrote. “If you look at them by the problem they’re trying to solve, it becomes a lot easier to understand why they exist, how they’re useful, and the trade-offs (and resulting mental errors) that they introduce.”
Benson ended up with four overarching types of problems: too much information, not enough meaning, the need to act fast, and deciding what's worth remembering for later.
Luckily, despite the hardwired shortcuts in our brains, we do have the ability to become aware of our cognitive biases, which is the first step toward learning to correct them. Benson's reorganized list and Manoogian's visual version may serve as a study guide.
Still, that’s something that may take years to learn.
“A common reaction is how can I know all these, but I think it would be difficult to be cognizant of all these things at once,” Manoogian said. “What it does for me is that it puts an additional level of thoughtfulness when doing things. You realize that your brain is adapted to solve all these basic level survival problems and now that we are trying to do higher order things, the brain has its own ideas.”