58 Cognitive Biases That Are Screwing Up Everything You Do

We like to think we're rational human beings.

In fact, we are prone to hundreds of documented biases that cause us to think and act irrationally. Even believing we're rational while readily spotting irrationality in others is itself a bias, known as the bias blind spot.

The study of how often humans make irrational decisions helped the psychologist Daniel Kahneman win the Nobel Prize in Economics, and it helped open up the rapidly expanding field of behavioral economics. Similar insights are also reshaping everything from marketing to criminology.

Hoping to clue you — and ourselves — into the biases that frame our decisions, we've collected a long list of the most notable ones.

This is an update of a previously published article, with additional contributions by Drake Baer and Gus Lubin.

The affect heuristic describes how humans sometimes make decisions based on emotion.

The psychologist Paul Slovic coined this term to describe the way people let their emotions color their beliefs about the world. For example, your political affiliation often determines which arguments you find persuasive.

Our emotions also affect the way we perceive the risks and benefits of different activities. For example, people tend to dread developing cancer, so they judge activities associated with cancer as much more dangerous than those linked to less dreaded forms of death, illness, and injury, such as accidents.

Anchoring bias means people rely too heavily on the first piece of information they hear when making decisions.

People are over-reliant on the first piece of information they hear.

In a salary negotiation, for instance, whoever makes the first offer establishes a range of reasonable possibilities in each person's mind. Any counteroffer will naturally react to or be anchored by that opening offer.

"Most people come with the very strong belief they should never make an opening offer," said Leigh Thompson, a professor at Northwestern University's Kellogg School of Management. "Our research and lots of corroborating research shows that's completely backwards. The guy or gal who makes a first offer is better off."

The availability heuristic describes a mental shortcut in which people make decisions based on information that's easier to remember.

In one experiment, a professor asked students to list either two or 10 ways to improve his class. Students who had to come up with 10 ways gave the class much higher ratings, likely because they had a harder time thinking of things that were wrong with it.

This phenomenon could easily apply to job interviews. If you have a hard time recalling what a candidate did wrong during an interview, you'll likely rate that candidate higher than if you can recall those missteps easily.

The bandwagon effect describes when people do something simply because others are also doing it.

The probability of one person adopting a belief increases based on the number of people who hold that belief. This is a powerful form of groupthink — and it's a reason meetings are often so unproductive.

The bias blind spot describes how individuals can see bias in others but struggle to see their own.

Failing to recognize your cognitive biases is a bias in itself.

Notably, Princeton psychologist Emily Pronin has found that "individuals see the existence and operation of cognitive and motivational biases much more in others than in themselves."

Choice-supportive bias describes the tendency to have positive attitudes about the things or ideas we choose, even when they are flawed.

When you choose something, you tend to feel positive about it, even if the choice has flaws. You think that your dog is awesome — even if it bites people every once in a while — and that other dogs are stupid, since they're not yours.

The clustering illusion occurs when we see trends in random events that happen to fall close together.

This is the tendency to see patterns in random events. It is central to various gambling fallacies, like the idea that red is more or less likely to turn up on a roulette wheel after a string of reds.
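
To see why the streak doesn't matter, here is a minimal Python sketch (not from the article; the European-wheel odds and the function name are illustrative assumptions). It simulates independent spins and compares the overall chance of red with the chance of red immediately after five reds in a row:

```python
import random

def simulate(n_spins=1_000_000, streak_len=5, seed=0):
    """Estimate P(red) and P(red | previous streak_len spins were red)."""
    rng = random.Random(seed)
    p_red = 18 / 37  # assumed European wheel: 18 red pockets out of 37

    reds_after_streak = 0
    spins_after_streak = 0
    current_streak = 0  # length of the current run of reds

    for _ in range(n_spins):
        is_red = rng.random() < p_red
        if current_streak >= streak_len:
            # This spin follows a run of streak_len reds; record its outcome.
            spins_after_streak += 1
            reds_after_streak += is_red
        current_streak = current_streak + 1 if is_red else 0

    print(f"Baseline P(red): {p_red:.4f}")
    if spins_after_streak:
        print(f"P(red | {streak_len} reds in a row): "
              f"{reds_after_streak / spins_after_streak:.4f} "
              f"over {spins_after_streak} samples")

if __name__ == "__main__":
    simulate()
```

Both printed probabilities converge to roughly 0.486, because each spin is independent of the ones before it; the "pattern" in a run of reds carries no information about the next spin.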

Confirmation bias describes the tendency to only listen to information that confirms our preconceptions.

We tend to listen only to the information that confirms our preconceptions. Once you've formed an initial opinion about someone, it's hard to change your mind.

For example, researchers had participants watch a video of a student taking an academic test. Some participants were told that the student came from a high socioeconomic background; others were told the student came from a low socioeconomic background. Those in the first condition rated the student's performance as above grade level, while those in the second rated it as below grade level.

If you know some information about a job candidate's background, you might be inclined to use that information to make false judgments about his or her ability.
