We all like to see ourselves as rational beings, capable of thinking independently and reaching the most logical conclusions whenever needed. Unfortunately, the reality is quite different: our thinking is deeply influenced by a variety of mental flaws, known as cognitive biases.
Cognitive biases are built-in, systematic errors in thinking that virtually all human beings have in common. They influence the way we see the world, how we behave in social settings, what beliefs we adopt, and how we make decisions; more often than not, they do all of this without our awareness.
In this article, we’ll explain in more detail how biases affect us, show several examples of them, and discuss what you can do to combat them.
What Is a Cognitive Bias?
Cognitive biases — a term that was first introduced in the early 1970s by psychologists Amos Tversky and Daniel Kahneman — are innate patterns of thinking that lead to irrational behavior and decisions. They can stem either from our brain’s automatic functioning or various cognitive limitations that we are unable to overcome despite our deliberate attempts to think rationally.
They heavily influence how we perceive and interpret information in our environment and frequently sway our decision-making. For instance, here are some of the ways they affect our thinking:
- We process and recall information so that it confirms our existing ideas, views, and values.
- We overestimate our capabilities in areas where we are, in fact, not skilled.
- We rely on information that is most available in our memory.
- We tend to perceive past events as more predictable than they really were.
- We fear a loss more than we value a gain of equal size.
- We tend to judge decisions only by their outcomes.
Why Do We Have Biases?
Cognitive biases are largely products of evolution and a part of our cognition for a reason: they simplify information processing and allow us to make quick, survival-optimized judgments.
As such, they save us a great deal of energy by reducing the deliberate effort needed to analyze situations; if we always had to consider every possibility and viewpoint when assessing new information, we would have to frequently alter, or even completely change, our existing views about ourselves and the world around us.
However, they are described as psychological errors because, in today’s world, they produce thinking that harms our ability to thrive and succeed, even though they may have been effective in helping our ancestors survive on a danger-ridden savannah.
A large number of cognitive biases are part of our brain’s intuitive and automatic functioning, or what is known as system 1 thinking. It is essential to our survival because it helps us identify possible threats and quickly steer away from them.
For instance, the reason you instinctively jump out of harm’s way when you see something approaching out of the corner of your eye is system 1 at work.
Problems arise with these types of biases when we rely too much on them in our decision-making. Since we are inclined to trust our intuition — especially if we sense that it has served us well in the past — we naturally tend to prefer it over logic and factual evidence.
In a complex world, however, the most effective or correct solutions are often counterintuitive and require us to deploy system 2, the rational but slow type of thinking. A good example is the following puzzle:
- A bat and a ball together cost $1.10. The bat costs $1 more than the ball. How much does the ball cost?
The first guess that quickly comes to most people’s minds is 10 cents; this is the fast, automatic system at work. But when they spend more time and effort thinking about the puzzle, they usually reach the correct answer, which is 5 cents.
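Working through the puzzle with system 2 is just a bit of algebra: if the ball costs x, the bat costs x + 1.00, so x + (x + 1.00) = 1.10 and x = 0.05. Here is a minimal sketch of that calculation in Python (the function name and parameters are ours, purely for illustration):

```python
def ball_price(total=1.10, difference=1.00):
    """Solve: ball + bat = total, where bat = ball + difference.

    Substituting gives ball + (ball + difference) = total,
    so ball = (total - difference) / 2.
    """
    return round((total - difference) / 2, 2)

print(ball_price())  # 0.05 — not the intuitive 0.10
```

Note that the intuitive answer of 10 cents fails the check: a 10-cent ball would make the bat $1.10, for a total of $1.20.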
How to Combat Biases?
In discussions about biases, the question that naturally surfaces is, “How can we stop biases from messing up our thinking?”
Firstly, it is important to note that they cannot be completely eliminated; as mentioned earlier, they are largely products of evolution and a part of our brain’s cognitive system.
That being said, there are things we can do to reduce their impact. Multiple studies on the subject have shown that debiasing training can, albeit with highly varying effectiveness, mitigate biases.
For example, one study on biases such as confirmation bias, anchoring, and social projection found that a single training session, in the form of either a computer game or an instructional video, can have a debiasing effect that reduces them by up to 30% and lasts up to three months.
On the other hand, another study on optimism bias and personal risk perceptions found that training had no consistent effects in fighting off the tendency.
The general conclusion we can draw from the debiasing studies so far is that training can mitigate some biases in some cases.
Despite these mixed findings, keep in mind that debiasing can still be a worthwhile effort for individuals. Considering that we make, according to some estimates, as many as 35,000 decisions each day, being able to make at least some of them better, ideally the more important ones, would likely have a favorable effect on one’s life and work.
Know Your Biases

When it comes to debiasing, the most important step is to develop a good understanding of these cognitive errors. In fact, knowing when particular biases occur and being able to identify them in your own thinking may be enough to drastically reduce their effects.
For example, imagine a situation where most of your co-workers, having been bitten by the investing bug, are putting money into the stock market. Each day, they talk mostly about investing and stocks. Naturally, you start to feel a strong urge to buy stocks too.
However, you are familiar with a cognitive bias called the bandwagon effect whereby we adopt behaviors and beliefs that a lot of other people have. You know that the urge to follow your fellow workers is due to this tendency, and you also know that it may lead you to a bad outcome.
As a result, rather than simply following your instincts, you decide to think rationally and weigh the positives and negatives of the situation: if you do what the others are doing, you ensure social inclusion, but you also stand to lose money on a venture you know very little about.
Slow Down Your Decision-Making
In relation to the previous point, delaying your decision-making process is a powerful way to combat biases and improve the quality of your final decisions. The aim is to tap into system 2 thinking over the quick, intuitive system 1.
This requires a conscious choice to refrain from making certain commitments too hastily, and instead leave enough time to think logically through them.
For example, a skilled salesman may make you really desire a certain product, in part by exploiting biases such as social proof and anchoring, and even offer you a one-time-only discount. Still, you refuse to make the purchase because you have a principle of not buying things impulsively; instead, you wait until you are back home to ponder whether you really need the product.
This approach is especially critical with choices that have far-reaching implications, including any of the big life decisions most people have to make. Now, although people would rarely buy a house on the spot, they might take on credit card debt, start (or end) a relationship, or accept a job offer without really considering all the risks and rewards involved. This can cause a myriad of unwanted consequences that could have been avoided by putting more time and effort into the decision-making process.
Use Nudges

Nudging is a concept in behavioral economics that was popularized by Richard Thaler and Cass Sunstein in their book Nudge: Improving Decisions about Health, Wealth, and Happiness (2008).
It refers to small, strategic adjustments in the environment that alter people’s behavior in a predictable way. It’s an easy and low-cost method that can be applied to many domains and purposes:
A nudge is a tweak in a decision-making environment intended to make people more likely to choose a particular action.
For instance, a sign indicating that the majority of public transport users pay their fares is a type of nudge; its purpose is to steer the audience’s cognitive processes toward the optimal action or result, which, in this case, is reducing fare evasion.
In one of the most popular examples, men’s “aim” in public toilets was improved by a surprisingly creative intervention: a fly painted on the bottom of the urinals. This simple nudge reduced spillage by as much as 80% and led to lower clean-up costs.
Furthermore, another well-known study found that registering people automatically as organ donors and then asking if they wish to opt out, rather than asking first if they are willing to opt in, markedly increased the number of registered donors.
As anyone might guess, behavioral nudges are particularly prominent in the field of marketing. Whether you are shopping online or at the local grocery store, your choices are affected by them more frequently than you probably realize.
For example, most grocery stores are specifically designed to make customers take the desired action: colors are chosen based on their psychological effect, the most profitable products are placed at eye level, and certain sections are better lit to make you move in that direction.
Nudges abound in online stores too: displaying positive testimonials on the home page, a “frequently bought together” section on product pages, or the original price next to the new discounted price are all marketing tactics aimed at making you buy.
They are also found to be highly effective in promoting healthier habits.
Studies have shown, for instance, that the fruit intake of school students can be significantly increased by simple food labels, brightly colored advertisements, attractive-looking servings, and strategic placement of the fruit.
Similar tactics can also be applied at home. For example, one helpful nudge would be to place vegetables at eye level on the middle shelf of a fridge, rather than in the bottom drawer. The idea is that by making the items visible and easily accessible, the users are more likely to consume them.
Additionally, research has shown that the size of plates people use contributes to bad eating behavior; since people eat, on average, more than 90% of the food they put on their plates, larger plates mean people consume more calories and, therefore, stand to gain more weight.
As such, a relatively painless way to nudge people to eat less and potentially lose weight would be to switch from the typical-sized plates to slightly smaller ones.
When it comes to dental health, one study found that reminders can have a strong impact on the frequency of dental check-ups: patients who received a reminder to schedule a check-up were twice as likely to sign up as those who didn’t.
Examples of Common Cognitive Biases

Here are six types of cognitive biases that frequently sway our choices.
Confirmation Bias

Confirmation bias describes our tendency to interpret and recall information in a way that confirms our existing opinions and beliefs. In other words, we are naturally inclined to favor evidence that supports our prior conclusions and reject evidence that doesn’t.
For example, if a scientist firmly believes that their theory is correct, they may unknowingly favor new data that accords with it while ignoring any information that would potentially disprove it.
Anchoring Bias

This is a bias whereby we rely too heavily on the initial piece of information (the “anchor”) that is provided to us.
For instance, anchoring bias is a common occurrence during price negotiations where the first offer works as the anchor. Once the anchor is set, or the first offer made, we are biased towards interpreting information around it. Thus, the initial offer ends up having a huge impact on the final price.
Furthermore, it is present in many other situations beyond negotiations: whether we need to decide which product to buy, assess the level of our own happiness, or mark the grades of our students, we typically do it by using a certain value or piece of information as an “anchor”.
The IKEA Effect

The IKEA effect illustrates how we place a higher value on products that we helped to create.
The name comes from the large Swedish furniture manufacturer IKEA, whose products usually require some assembly.
One well-known study goes as follows: two groups of participants received IKEA boxes that either had to be assembled or came fully assembled. Then, the participants were asked to take part in an auction and bid for the boxes. Surprisingly, the individuals who built the boxes themselves were willing to pay 63% more for them than the non-builders, even though the objects looked identical.
Outcome Bias

This is a mental error whereby we judge decisions solely by the outcomes they lead to.
A simple example: say that Andy is facing a difficult choice and, being unable to make up his mind, he decides to flip a coin. As it turns out, he ends up with the better option, saving him from a world of trouble. Based on this, he infers that leaving hard choices to blind luck is a proper way to approach them, even though it clearly isn’t.
When it comes to decisions, what matters most is the process by which you reach them, not the final outcomes; in almost any situation, there are factors that are outside of our control and, therefore, the results will be determined by luck to a certain extent.
Moreover, if you are able to incorporate the unknown factors into your decision-making process, as well as account for the risks they entail, then the decision you reach is likely a good one, regardless of the outcome.
Self-Serving Bias

Self-serving bias refers to our tendency to take personal credit for positive outcomes while blaming external factors when things go wrong.
In a situation where someone, say, does well on a school exam, they naturally assume it’s due to some positive aspects of their character or because they had worked hard for the exam. In contrast, many of us can probably recall a case where we did worse than expected and ended up blaming our situation or other people for the result.
The Dunning-Kruger Effect

The Dunning-Kruger effect describes how people with low ability in a particular area tend to overestimate their capabilities in that area.
In an experiment demonstrating this phenomenon, subjects took a series of tests on grammar, logic, and sense of humor. The results clearly showed that those with the lowest scores consistently overestimated the quality of their own performance.