Cognitive biases are built-in, systematic patterns of thinking that we all have in common. They don’t only influence the way we see the world, but also sway most decisions we make every day – and, according to some estimates, we make a whopping 35,000 of them in a single day.
In some situations, we are able to identify these tendencies in our thought processes and avoid the pitfalls they make us vulnerable to. Most of the time, however, they work in such elusive ways that they end up flying under our radar – especially if we are not familiar with them.
If we consider how many decisions we make each day and each year, it seems likely that being able to get more out of even a fraction of them would have a positive effect on one’s life. As such, taking some time to learn about cognitive biases and the ways they affect us can prove to be a worthwhile investment for anyone.
In this article, we describe 12 of the most common of these tendencies.
Confirmation Bias
The confirmation bias is often seen as the mother of all biases, and with good reason.
It describes the tendency to cherry-pick, recall, and interpret information so that it confirms our existing ideas, views, and values. That is, we are naturally inclined to ignore and dismiss information that doesn’t support the beliefs we already have. This effect is especially pronounced when the issue at hand is emotionally sensitive to us.
For example, if a scientist firmly believes that a particular theory is true, they may unknowingly favor new information in a way that reinforces their prior conclusions, while ignoring or rejecting evidence that is not consistent with the theory.
Why are we disposed to this type of biased thinking? A likely explanation is that it’s a more efficient way to process information. Evaluating new information in a thorough and impartial manner takes up a great deal of energy and would require us to frequently alter, or even completely change, our existing views about ourselves and the world around us.
As the investor Warren Buffett noted:
“What the human being is best at doing is interpreting all new information so that their prior conclusions remain intact.”
Dunning-Kruger Effect
The Dunning-Kruger effect is another widely known and commonly occurring bias. It refers to the predisposition of people with low ability or knowledge in a particular area to overestimate their capability in that area.
The reason why “incompetent” people tend to overestimate their skills is that they lack the self-awareness needed to recognize their own lack of ability.
This concept is named after two researchers, David Dunning and Justin Kruger, who identified the phenomenon as a cognitive bias in their 1999 paper. Their studies showed, for example, how participants who got the lowest scores in tests of grammar, logic, and sense of humor, consistently rated their own performances considerably higher than what the results revealed.
“Those with limited knowledge in a domain suffer a dual burden: Not only do they reach mistaken conclusions and make regrettable errors, but their incompetence robs them of the ability to realize it.”
— David Dunning and Justin Kruger, Unskilled and Unaware of It: How Difficulties in Recognizing One’s Own Incompetence Lead to Inflated Self-Assessments (1999)
Fundamental Attribution Error
This cognitive bias occurs when we overemphasize someone’s personal characteristics and ignore the influence of situational factors when judging their actions.
For instance, when you see someone driving recklessly on the road, or someone cuts you off in the parking lot, a typical reaction would be to assume their behavior is caused by a flaw in their personality and judge them accordingly (“what a selfish moron!”). We don’t tend to take into account the external reasons that might have caused their actions (for example, a medical emergency).
Curiously, we blame other people’s characteristics for their misbehavior, but when evaluating our own behavior, the converse is usually true; we look for situational factors to explain our own misbehavior (see self-serving bias below).
The fundamental attribution error seems to influence our judgment even when we are consciously aware that someone’s actions were not of their own choosing. In an experiment conducted in 1967, subjects were asked to read texts that were either for or against Fidel Castro. They were told that the viewpoints of the texts had been assigned in advance, so there was no reason to believe they reflected the actual positions of the writers. The results showed that, on average, the subjects still believed that the contents of the texts were in accordance with the opinions of the authors.
Self-Serving Bias
The self-serving bias describes how we tend to attribute positive outcomes to our own abilities, but blame external factors when things turn out wrong.
For example, if someone gets a good result on an exam, they naturally assume it’s due to some positive aspects of their character or because they had worked hard for the exam. In contrast, most of us can probably recall a case where we didn’t do so well on something and ended up laying the blame on our situation or other people.
The self-serving bias can have harmful consequences by not allowing one to recognize their shortcomings and make improvements where needed. However, it also has an effect that is useful to us: it helps protect our self-esteem. When we attribute negative events to factors outside of us we don’t have to feel bad about ourselves, but, on the other hand, we get a confidence boost when we take credit for our successes.
Bandwagon Effect
The bandwagon effect refers to the phenomenon whereby people do something simply based on the fact that a lot of other people are doing the same thing; whenever we come across a behavior, product, or belief that is popular, we feel encouraged to adopt or buy it too.
The name of this bias comes from the phrase “jump on the bandwagon”. Supposedly, politicians once used to ride a bandwagon through the streets in order to gain attention and win votes. People who supported a particular candidate would literally hop on their bandwagon, hence the phrase.
The bandwagon effect explains different fashion trends, fad diets, and fad products. It also explains why, for example, signs indicating that the majority of public transport users pay their fares are an effective strategy for reducing fare evasion.
One of the major reasons for this type of inclination is that we naturally want to avoid being excluded from the group. When we do what the rest of the people are doing, we gain social acceptance and approval.
Availability Heuristic
The availability heuristic is a cognitive bias whereby we rely on the information that is most readily available in our memory. In other words, we use the examples that we can most easily recall to make decisions and assess situations.
It explains why we tend to overestimate the likelihood of highly dramatic events, such as a plane crash. Such disasters are easily stored in our memories and often jump to the forefront of our thoughts, causing us to assume that they are more typical than they really are.
In a study examining the availability heuristic, the subjects were asked to estimate whether there are more words in the English language that start with the letter “k” or words in which “k” is the third letter. The results showed that most subjects believed words starting with “k” to be more common, which is incorrect; there are roughly three times more words with “k” as the third letter. The researchers concluded that it was easier for the subjects to think of words starting with “k”, causing them to overestimate how common they are compared to the second option. In other words, they made the judgment based on information that was more readily available to them, thus illustrating the availability heuristic.
Like other cognitive biases, this tendency has evolved to be a part of our cognition for a reason: it’s a mental shortcut that helps us determine the risks of a decision quickly and effortlessly. Nevertheless, it can – and often does – make us rely on insufficient information and worsen our decision-making.
Anchoring Bias
Anchoring bias is the tendency to overly depend on the initial piece of information (the “anchor”) that is provided to us.
For example, it often plays a considerable role at the negotiation table where the first offer works as the anchor. Once the anchor is set, or the first offer made, we are biased towards interpreting information around it. Thus, the initial offer ends up having a huge impact on the final price.
We deploy anchoring frequently and in a wide variety of decision-making situations; whether we need to decide which product to buy, assess the level of our own happiness, mark the grades of our students, or estimate the value of a car, we typically do it by using a certain value or piece of information as an “anchor”. In some cases, this strategy can harm us by causing us to pay too much for something; in others, however, it is necessary for producing a reasonably accurate estimate quickly.
In one experiment, participating college students were asked two questions: “How happy are you?” and “How often are you dating?”. As it turned out, the order in which the questions were asked had a huge influence on how happy the students reportedly felt: when the question “How often are you dating?” was asked first, the participants used their response as an anchor to answer the second question; they reported feeling either happier or less happy depending on how often they said they were currently dating.
Outcome Bias
Suppose that you decide to invest your money in the stock market. You don’t know much about investing, so you choose a few stocks based on what your gut tells you. As it turns out, you double the size of your investment in the coming weeks. Does this mean that investing based on your gut instinct was a good decision?
If you said yes, you fell victim to the outcome bias – the tendency to evaluate past decisions by their ultimate outcomes. This reasoning is irrational because it doesn’t account for what information was available at the time the decision was being made, and what factors or events led to that result.
An infamous real-life example is the Japanese attack on the U.S. naval base at Pearl Harbor. After the attack had occurred, people blamed U.S. Army intelligence for failing to see the seemingly obvious attack coming and for not evacuating the base. In reality, the event wasn’t as obvious or as easy to predict as it may have appeared after it had already occurred; before the attack, there was plenty of contradictory evidence, some pointing towards it and some against it.
How can we overcome the outcome bias? Judge decisions solely based on how and why the decision was made before the result is known, and remember that a poor decision can lead to an excellent outcome while one that is carefully considered may well yield a negative result.
Survivorship Bias
Survivorship bias is a common error in which people pay attention only to the successes while overlooking all the failures.
For example, if someone decides to drop out of school to start their own company because they’ve read about a couple of dropouts who became billionaires, they have fallen prey to this bias.
Wherever we look, we tend to see only the people who became successful: professional athletes, popular musicians, and successful entrepreneurs. What we rarely see are the ones who didn’t make it; for every success story, there are thousands of people who tried but never made it past the selection process. Consequently, we get an inaccurate understanding of the reality of success and overestimate the odds of making it to the top.
One of the most well-known instances of survivorship bias occurred during the Second World War: a research group at Columbia University was tasked with helping the U.S. Army figure out how to better protect its military airplanes. After studying the damaged airplanes, the researchers advised the army to reinforce the areas where the bullet holes were most concentrated. However, this approach was quickly abandoned after a statistician named Abraham Wald made a sobering realization: the only airplanes they were studying were the ones that had made it back. Planes that never returned had sustained damage in the most vulnerable areas – such as the engine – and were completely missing from the pool of evidence.
Hindsight Bias
If you have ever thought to yourself that you knew with certainty how a particular event was going to turn out after it had happened, there is a chance that your thinking was influenced by this mental shortcut.
The hindsight bias, also referred to as the “knew-it-all-along phenomenon”, describes how people tend to perceive past events as more predictable than they really were. As such, it leads us to believe that we knew how something was going to turn out even though we really didn’t before the events took place.
This one is similar to the outcome bias that was discussed earlier, but the difference is that the hindsight bias affects the way we view history due to a distortion in our memory.
In one experiment illustrating this, participants were asked to predict whether the U.S. Supreme Court nominee Clarence Thomas would be confirmed. Before the vote took place, 58% of the subjects said that Thomas was going to be confirmed, which turned out to be correct. Yet, when the researchers conducted a new survey after the outcome was known, a considerably higher proportion (78%) of the participants said they had predicted a confirmation.
Halo Effect
The halo effect occurs when a positive impression of someone or something affects our overall opinions of them.
This effect was first mentioned in 1920 in a paper by Edward Thorndike. In his studies, he found that people’s rating of one characteristic – whether positive or negative – influenced their rating of all other characteristics.
Also known as the “physical attractiveness stereotype”, this bias is prevalent in the world of social interactions. We form our overall judgment of others based on our first impression of them; if we meet someone new who gives us a positive impression – whether they are successful, attractive, or funny – we are likely to perceive them also as kind, smart, and well-behaved. In fact, several studies have shown that good-looking individuals are typically rated as more intelligent and tend to do better not only in their professional lives but also in educational settings.
Loss Aversion
Loss aversion is particularly prominent in the field of economics. It describes how we naturally fear losses more than we value gains. Simply put, losing $100 feels more painful to us – in fact about twice as painful, according to Daniel Kahneman and Amos Tversky – than receiving the same amount of money feels good.
Consequently, loss aversion frequently influences our individual decision-making, especially regarding financial decisions. Even though it helps us avoid decisions that could end in financial losses, it also prevents us from taking risks with the potential for reasonable returns.
This concept is also important to understand for people in marketing and advertising. According to loss aversion, showing your potential customers how your product will help them avoid something painful is likely to be a better approach than showing that they would gain something desirable. Furthermore, it explains why people react more strongly to an increase in prices than to a decrease.
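For readers who want to see where the “twice as painful” figure comes from: the asymmetry is commonly modeled with the value function from Kahneman and Tversky’s prospect theory, in which losses are weighted more steeply than equivalent gains. The parameter values below are the estimates reported in Tversky and Kahneman’s 1992 paper; the exact numbers vary across studies.

```latex
v(x) =
\begin{cases}
x^{\alpha} & \text{if } x \ge 0 \quad \text{(gains)} \\
-\lambda \, (-x)^{\alpha} & \text{if } x < 0 \quad \text{(losses)}
\end{cases}
\qquad \alpha \approx 0.88, \quad \lambda \approx 2.25
```

With these estimates, a $100 gain is valued at roughly 57.5 units while a $100 loss is felt as roughly −129 units, so the loss looms about 2.25 times larger than the equivalent gain – in line with the rule of thumb above.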
Sources
- Cognitive Bias – ScienceDirect
- Judgment under Uncertainty: Heuristics and Biases (PDF)
- Confirmation Bias – ScienceDaily
- Hindsight Bias and Developing Theories of Mind – NCBI
- The Dunning-Kruger Effect Shows Why Some People Think They’re Great Even When Their Work Is Terrible – Forbes
- Availability: A Heuristic for Judging Frequency and Probability (PDF)