“Nothing is easier than self-deceit. What a man wishes, he also believes.” – Demosthenes
As the ancient Greeks observed some 2,500 years ago, we have a tendency to cherry-pick information that confirms our existing beliefs or ideas. We call this tendency confirmation bias. It’s a deeply ingrained mental habit that conserves energy and feels comfortable.
Imagine our ancestors hunting. One day an angry animal charges at them and kills a few tribe members. The next time an angry animal appears, that past experience tells them to run: there is no time to consider alternatives.
While most of us no longer have to hunt for food, we still face situations in which we have to process a lot of information and make quick decisions – when driving a car, for example. To handle this, we take the path of least resistance: we look for evidence that confirms our existing beliefs and hypotheses rather than disconfirming evidence that would force us to form new explanations. Evaluating evidence (especially when it is complicated or unclear) requires a great deal of mental energy, after all.
Confirmation bias also protects our self-esteem. Nobody likes to feel bad about themselves, and realizing that a belief we value is false can have exactly that effect. The effect is stronger still for deeply held views or ideologies that form part of our identity – disproving these hurts even more.
This is also why two people with opposing views on a topic can see the same evidence and both come away feeling validated by it.
“The human understanding when it has once adopted an opinion … draws all things else to support and agree with it. And though there be a greater number and weight of instances to be found on the other side, yet these it either neglects or despises, or else by some distinction sets aside or rejects.” – Francis Bacon
How confirmation bias works and affects our thinking
Confirmation bias works in 3 ways:
Biased search for information: looking for evidence that confirms one’s belief, while not looking for disconfirming evidence.
Biased interpretation of information: two individuals can have the same information and interpret it completely differently based on their existing beliefs. Earlier evidence forms the foundation of a belief: the order in which we are exposed to information matters.
This preference for early information explains why (poor) first impressions are difficult to overcome.
Biased memory recall of information: people may remember facts selectively to reinforce their expectations, despite searching for or interpreting these facts in a neutral manner.
To return to our hunting ancestors: since the first angry animal they encountered killed some of their tribe members, they may interpret every angry animal as out to kill them. Over time, this can turn into the belief that this particular animal is dangerous regardless of its emotional state, and needs to be killed before it kills them.
As a consequence of confirmation bias, our hunting ancestors:
- Forgot that the first angry animal charged at them because it was provoked – they had killed its child or cub.
- Misinterpreted subsequent angry animals as aggressive and out to kill them…instead of realizing they were scaring off the hunters to protect themselves, with no intent to kill.
- Stopped searching for information about why the animal killed the first group of hunters or what made it angry…and simply went with “angry animals kill” and “every animal is angry.”
John Vaillant’s The Tiger – in which men hunt a man-eating tiger in Russia’s Far East in the 1990s – illustrates exactly this kind of confirmation bias.
Confirmation bias examples in real life
Confirmation bias appears in all areas of life, at the individual level as well as at the group and societal level.
Investing
Once we’ve found an investment we feel good about, we start to ignore evidence telling us that our strategy will lose money. When our investment goes down, we react with “the market is wrong” or “it will correct itself,” rejecting the notion that our original hypothesis may be wrong.
This effect gets exacerbated if a particular investment makes us feel part of a group. We tie our identity to the group and the investment, and we want to remain consistent with this self-image, to the detriment of rational thinking. A great example is the so-called “maximalists” in, for example, the cryptocurrency space.
Internet
As a result of recommendation algorithms, we’re increasingly likely to end up in a “filter bubble”: technology amplifying and facilitating our cognitive tendency toward confirmation bias.
Platforms provide us more of what we like and what aligns with our beliefs, values and ideologies, isolating us in a bubble, away from contrary views. This exposes us to even more confirming evidence, further reinforcing our beliefs. As a result, we can fall prey to fake news.
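To make the filter-bubble mechanism concrete, here is a minimal, purely illustrative Python sketch. It is not any real platform’s code; the `recommend` function and the item format are invented for this example. It simply ranks content by how well it matches topics the user already engaged with, which is enough to crowd dissenting views out of the feed.

```python
from collections import Counter

def recommend(candidate_items, liked_items, top_n=2):
    """Rank candidates purely by topic overlap with items the user already liked."""
    liked_topics = Counter(
        topic for item in liked_items for topic in item["topics"]
    )

    def score(item):
        # Higher score = more agreement with past engagement.
        return sum(liked_topics[topic] for topic in item["topics"])

    return sorted(candidate_items, key=score, reverse=True)[:top_n]

# Hypothetical user who has only ever liked articles supporting one view.
liked = [
    {"title": "Why X is true", "topics": ["pro-x"]},
    {"title": "More evidence for X", "topics": ["pro-x"]},
]
candidates = [
    {"title": "X confirmed again", "topics": ["pro-x"]},
    {"title": "The case against X", "topics": ["anti-x"]},
    {"title": "X: what its supporters say", "topics": ["pro-x"]},
]

for item in recommend(candidates, liked):
    print(item["title"])
# Only the two pro-X articles make the cut; the dissenting piece never surfaces,
# so the user's existing belief is the only one that gets reinforced.
```

Even this toy ranking loop shows the feedback: every click narrows what is shown next, and the narrower feed produces more of the same clicks.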
Business
If we hold the belief that “hard work equals success,” we’ll likely attribute any success or failure to hard work (or the lack thereof) and ignore other factors that play a role.
For example, if a new restaurant location is underperforming, we’ll consider that the result of our team not working hard enough. Every time we see a colleague taking a break, this idea is reinforced.
This blinds us to an alternative, and perhaps more likely, explanation: the restaurant’s poor location.
Recruitment
As a result of confirmation bias, it’s difficult to overcome first impressions.
Someone who graduated from Harvard will have an easier time convincing a recruiter than someone who graduated from a small community college: the former is assigned more positive labels than the latter before the recruiter has met either of them.
Similarly, we assign traits or values based on our first impression: how the person looks, how the person dresses, how the person presents him- or herself.
How to avoid confirmation bias
Confirmation bias is so deeply ingrained that it cannot be eliminated. The good news: it can be managed.
First comes awareness of this bias. Next are critical thinking and environment:
- Argue both sides. This forces you to seek out disconfirming evidence and remain flexible. “I never allow myself to hold an opinion on anything that I don’t know the other side’s argument better than they do.” – Charlie Munger
- Scientific method. Actively seek disconfirming evidence.
- Occam’s Razor. Is this the simplest explanation?
- Hanlon’s Razor. Can this be explained by stupidity, rather than malice?
- Expose yourself to contradictory sources. Exposure breeds harmony.
- Surround yourself with those who hold the opposite view. Since you’ll gravitate towards the group’s default view to remain consistent with group identity, this pulls against your own bias.
- Avoid extreme views one way or the other. Seek moderation in media sources and friends.
Thought experiments and asking yourself (the right) questions are similarly helpful:
- Which parts did I automatically agree with?
- Which parts did I ignore or skim over without realizing?
- How did I react to the points which I agreed or disagreed with?
- Did this information confirm any ideas I already had? Why?
- What if I thought the opposite of those ideas?
Research also suggests that acting as if you’re giving a friend advice can be useful: examine your belief from a third-person perspective.
“Presented with someone else’s argument, we’re quite adept at spotting the weaknesses. Almost invariably, the positions we’re blind about are our own.” – Elizabeth Kolbert, The New Yorker
Constantly evaluating our worldview or holding different ideas in our head is exhausting, so we prefer to reinforce existing beliefs instead. It’s much easier and comes more naturally, but it makes us vulnerable to faulty judgments and poor decision-making.
Avoiding confirmation bias altogether is unlikely. Managing it, however, is possible. A good starting point is seeking out disconfirming evidence and exposing yourself to a variety of viewpoints.