I often see confirmation bias in action, from sports to investing to politics. Not long ago, for instance, a colleague at a private equity firm was looking through a stack of resumes. He’d met one of the job candidates, and they had hit it off. But instead of weighing the evidence about the candidate’s skills and experience, my colleague simply looked for information that confirmed his belief that this candidate was the right one for the job.
Because the candidate had worked in private equity, for example, my colleague argued that he must have a rich knowledge of financial reporting, even though the candidate had no actual experience in financial reporting. And because the candidate had worked for a consulting firm, my colleague argued that he must know strategy, even though the candidate had not specialized in strategy and did not know the topic well.
Confirmation bias affects decision-making of all kinds, from whom you vote for to what you order off the menu at a restaurant. But what exactly is it? Where does it come from? And what can be done to lessen its influence?
At its simplest, confirmation bias is a way of testing a hypothesis that searches disproportionately for evidence that confirms it rather than for evidence that might disconfirm it.
The classic demonstration comes from the psychologist Peter Wason, who gave participants the sequence 2-4-6 and asked them to work out the rule behind it by proposing sequences of their own and being told whether each one fit; the rule was simply any increasing numbers. So, if participants thought, as is natural, that the pattern was increasing even numbers, they might guess 8-10-12 or 18-20-22. But they would keep getting the rule wrong because their instinct was too specific. The problem was that they were only trying to confirm their initial view. Had they simply made guesses that went against their intuition — 1-2-3 or 6-4-2 — they would quickly have hit upon the rule.
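For readers who like to see the logic spelled out, here is a minimal sketch of the task in code. It assumes only what the example above describes: a hidden rule of “any increasing sequence” and a participant whose hunch is “increasing even numbers.” The function name is illustrative, not something from Wason’s study.

```python
# Minimal sketch of Wason's 2-4-6 task (names are illustrative).
# The hidden rule is simply "the numbers increase."

def fits_hidden_rule(triple):
    a, b, c = triple
    return a < b < c

# Guesses that only try to confirm the "increasing even numbers" hunch:
for guess in [(8, 10, 12), (18, 20, 22)]:
    print(guess, "->", "yes" if fits_hidden_rule(guess) else "no")
# Every one gets a "yes," so the guesser never learns the hunch is too narrow.

# A guess that goes against the hunch:
print((1, 2, 3), "->", "yes" if fits_hidden_rule((1, 2, 3)) else "no")
# This also gets a "yes," which immediately shows that even numbers are
# irrelevant and points toward the real rule: any increasing sequence.
```

The point of the sketch is simply that confirming guesses can never distinguish an overly narrow hypothesis from the true rule; only a guess that could come back “wrong” carries new information.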
But in other contexts, like that of Wason’s experiment or when gathering information online, confirmation bias leads people, knowingly or not, to create conditions in which they expose themselves only to evidence pointing in one direction, even though there is plenty of contradictory evidence out there.
It’s also important to flag that emotions play a role. We’re more likely to rely on mental shortcuts when our emotions are triggered. For example, imagine you’re watching the Super Bowl and your team goes ahead after an amazing touchdown pass to the back of the end zone. But wait. The opposing coach challenges the play, and the referees go to the replay booth. As the replay runs over and over on the TV, you fixate on how firmly the receiver holds the ball as he comes down to the ground. “He’s got it. He’s got it,” you tell anyone within earshot.
Meanwhile, you pay no attention to the fact that his foot might be, ever so slightly, out of bounds. This is your emotional investment — your group identity and pride in your team — kicking your confirmation bias into gear.

In contexts like this, confirmation bias is relatively harmless and even unavoidable. In other contexts, though, the desire for intuitive shortcuts can lead us astray. Confirmation bias is not something that can be completely eliminated. It is the product of an intuitive shortcut in our thinking that has positive uses but often leads to error. The solution is not to completely overhaul or distrust the way you think, but to be aware of the errors our minds are prone to and to take steps to mitigate their negative effects.
Examples of confirmation bias are everywhere, from our politics and news consumption to our personal preferences and social lives. Here are a few to give you a sense of how often the problem crops up:
Take, for example, a study in which psychiatrists and medical students were asked to diagnose a patient from a case description, first choosing a preliminary diagnosis and then requesting whatever additional information they wanted before settling on a final one. Even though almost all the participants chose an initial diagnosis of depression, most were able to conduct a broad enough information search to uncover the right diagnosis. Still, a significant minority of around 20 percent, many of them students and less experienced psychiatrists, showed confirmation bias, asking only for additional information that would confirm the depression diagnosis. Of those, predictably, only 30 percent came to the correct diagnosis.
The study shows both the pull of confirmation bias and the hope of overcoming it, since training and experience seemed to make a difference.
Confirmation bias also shapes how we judge other people, a tendency known as the halo effect. For example, research in the 1920s found that when military commanders were first asked to rank the physical qualities of the soldiers under them, those initial impressions skewed their judgments of the same soldiers’ technical skill and intelligence. Taller or better-looking soldiers were deemed more intelligent and more effective, while smaller or less attractive-seeming ones were regarded as less intelligent and less effective than they objectively were.
The halo effect is thus a subspecies of confirmation bias: we expect all aspects of a person to follow an initial trend. It helps explain why first impressions can be so hard to shake off, even in the face of plenty of later counterevidence.
The COVID-19 pandemic has provided ample evidence of this kind of confirmation bias. For example, those worried about the political fallout of the crisis could downplay it by focusing on single anecdotes — “my 85-year-old grandfather got COVID and was fine” — or on isolated statistics that conform to their preferred narrative.
The sciences offer a good example of how confirmation bias can be resisted. The scientific method, including randomized controlled trials and anonymous peer review, is designed to mitigate the effects of confirmation bias and other related biases. These mechanisms routinely force scientists to question their assumptions, remove their own prejudices from the experimental process, and bring in outside viewpoints to confirm or disconfirm their results.
Obviously, the scientific method can’t — and shouldn’t — be applied to everyday, or even political, judgments, but the example of science can still be instructive. And teaching critical thinking in science courses can be a great way to introduce young people to these biases.
Be explicit about your biases. Simply being honest with yourself about your prior beliefs can go a very long way. When people evaluate new evidence, they always bring prior beliefs with them. There’s nothing wrong with that, but if they take stock of what those beliefs are, they will be better able to recognize when they’re only looking at information that confirms those beliefs, or when they’re interpreting new evidence in skewed ways.
Ask yourself, “What if I’m wrong?” One important method in science is to run tests that would disconfirm a hypothesis. You can do the same in your everyday life by assuming, just for the sake of argument, that you’re wrong. Once you take that perspective, look for the information that would prove you wrong.
You probably won’t change your mind, but you will have more information. You’ll be on much firmer ground for holding your beliefs, you’ll have more confidence in them, and you’ll be better able to argue for them. Perhaps most importantly, you’ll be more understanding of those who think differently and more tolerant of ambiguity.
Bring in an outside view. Anonymous peer review means that the academics reviewing a scientific paper come to the experimental evidence with a fresh set of eyes and do not even know who conducted the research. This insulates the evaluation process as much as possible from any preconceptions the reviewer might have about the paper or its author.
Everyday life doesn’t usually allow us to go to those lengths, but you can ask others for their opinion on a decision you have to make or on a view you find convincing. You can also avoid prejudicing them by not saying what you think first.
This provides yet more information and evidence, free of our own entrenched perspective and potential confirmation biases. Taking steps like this not only helps prevent mistakes; it can open our minds to the world and the perspectives around us and, ultimately, enrich our lives.