Everything You Need to Know About Confirmation Bias

Helen Lee Bouygues

I often see confirmation bias in action, from sports to investing to politics. Not long ago, for instance, a colleague at a private equity firm looked through a pack of resumes. He’d met one of the job candidates, and they hit it off. But instead of weighing the evidence around the individual’s skills and experience, my colleague simply looked for information that confirmed his belief that this candidate was the right one for the job.

My colleague saw that the potential hire had worked in private equity, for example, and so he argued that the person had a rich knowledge of financial reporting, even though the candidate had no actual experience in financial reporting. He also argued that the candidate must know strategy because he had worked for a consulting firm, but the candidate had not specialized in strategy and did not know the topic well.

My colleague was suffering from confirmation bias, one of the most common cognitive biases. As researchers like Daniel Kahneman have detailed, people are likely to zero in on evidence that confirms what they already think, and ignore, or fail to seek out, evidence that might refute their view. Think of confirmation bias as the “know-it-all bias.” Instead of looking at the evidence in a thoughtful manner, we act as if we already “know” the answers. 

Confirmation bias affects decision-making of all kinds, from whom you choose to vote for to what you pick to eat off the menu at a restaurant. But what exactly is it? Where does it come from? And what can be done to lessen its influence?

Confirmation Bias Definition

Confirmation bias at its simplest is a kind of hypothesis testing that searches disproportionately for evidence that confirms the hypothesis over evidence that might disconfirm it.

In the initial experiment that led to the development of the concept, psychologist Peter Wason gave subjects a group of three numbers, 2-4-6, and asked them to identify a rule the sequence was following by testing their own groups of numbers. The rule was extremely simple: three increasing numbers. But he found that subjects would fixate on guessing sequences that confirmed their initial hypothesis. 

So, if they thought, as is natural, that the pattern was increasing even numbers, they might guess 8-10-12 or 18-20-22. But they would keep getting the rule wrong because their instinct was too specific. The problem was that they were only trying to confirm their initial view. If they simply made guesses that went against their intuition — 1-2-3 or 6-4-2 — they would have quickly hit upon the rule.
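The logic of Wason's task can be sketched in a few lines of Python (a hypothetical illustration, not Wason's actual procedure). Positive tests of the narrower "increasing even numbers" hypothesis satisfy the true rule too, so they can never expose the mistake; only a guess that violates the hypothesis can.

```python
# Hidden rule in Wason's 2-4-6 task: any three strictly increasing numbers.
def true_rule(a, b, c):
    return a < b < c

# A subject's overly specific hypothesis: increasing even numbers.
def hypothesis(a, b, c):
    return a < b < c and all(n % 2 == 0 for n in (a, b, c))

# Positive tests: triples chosen because they fit the hypothesis.
positive_tests = [(8, 10, 12), (18, 20, 22)]

# Every positive test passes both the hypothesis and the true rule,
# so it provides no evidence that the hypothesis is wrong.
for t in positive_tests:
    assert hypothesis(*t) and true_rule(*t)

# A disconfirming test immediately reveals the mismatch: 1-2-3 fits
# the true rule but fails the hypothesis.
print(true_rule(1, 2, 3), hypothesis(1, 2, 3))   # True False
```

Running only positive tests, a subject could go on forever collecting "confirmations" without ever learning that the rule is broader than their guess.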

Confirmation bias, simply put, is a preference for seeking or attending to information that is consistent with currently held beliefs or hypotheses.

 

Where Does Confirmation Bias Come From?

Confirmation bias is related to what’s called a “positive test strategy,” a cognitive tool or “heuristic” that can be very useful. The important thing to recognize is that there is nothing inherently wrong with positive testing. In many contexts it works. When a good deal of information points in one direction, it is often productive to simply accept it or look for more evidence in the same vein. 

But, in other contexts, like that of Wason’s experiment or when gathering information online, confirmation bias leads people to, knowingly or not, create conditions where they only expose themselves to evidence pointing one direction, even though there is plenty of contradictory evidence out there.

It’s also important to flag that emotions play a role. We’re more likely to rely on mental shortcuts when our emotions are triggered. For example, imagine you’re watching the Super Bowl and your team goes ahead after an amazing touchdown pass to the back of the end zone. But wait. The opposing coach challenges the play, and the referees go to the replay booth. As the replay plays over and over again on the TV, you fixate on how firmly the receiver is holding the ball as he comes down to the ground. “He’s got it. He’s got it,” you tell anyone within earshot.

Meanwhile, you fail to pay any attention to the fact that it looks like his foot might be, ever so slightly, out of bounds. This is our emotional investment — our group identity and pride in our team — kicking up our confirmation bias. In contexts like this, confirmation bias is relatively harmless and even unavoidable. In other contexts, though, the desire for intuitive shortcuts can lead us astray.

Confirmation bias is not something that can be completely eliminated. It is the product of an intuitive shortcut in our thinking that has positive uses but often leads to error. The solution is not to completely overhaul or distrust the way you think, but to be aware of the errors our minds are prone to and take steps to mitigate their negative effects.

Confirmation Bias Examples

Examples of confirmation bias are everywhere, from our politics and news consumption to our personal preferences and social lives. Here are a few examples to give you a sense of how often the problem of confirmation bias crops up:

Confirmation bias in science and medicine. Even trained scientists aren’t immune to confirmation bias. Once a hypothesis is made, there can be a lot of incentives to have it confirmed. Doctors can likewise be driven to try to confirm an original diagnosis and end up neglecting symptoms that suggest a better diagnosis. There’s a popular saying in medicine that reflects this problem: “When the diagnosis is made, the thinking stops.”

One study, for example, gave psychiatrists and med students a range of initial symptoms that suggested a diagnosis of depression: the hypothetical patient described himself as being unusually sad and appeared to be self-medicating. If the doctors requested the right additional information, however, they would likely come to the accurate diagnosis, dementia. 

Even though almost all the participants chose an initial diagnosis of depression, most were able to conduct a broad enough information search to uncover the right diagnosis. Still, a significant minority of around 20 percent, many of them students and less experienced psychiatrists, showed confirmation bias, asking only for additional information that would confirm the depression diagnosis. Of those, predictably, only 30 percent came to the correct diagnosis.

The study shows both the pull of confirmation bias, and the hope for overcoming it, since training and experience seemed to make a difference. 

Personal judgments and the halo effect. Research has shown that when people see someone positively in one aspect, they’re likely to judge the person positively in other ways too, even without good evidence. This is called the halo effect. The reverse is true too: if people judge someone negatively in one regard, they’re likely to look for and find other ways the person can be regarded as negative too. 

For example, research in the 1920s found that military commanders misjudged the technical skill and intelligence of the soldiers under them when they were first asked to rank the soldiers’ physical qualities. That is, taller or better-looking soldiers were deemed more intelligent and more effective, while smaller or less attractive ones were regarded as less intelligent and less effective than they objectively were.

The halo effect is thus a subspecies of confirmation bias: we expect all aspects of a person to follow an initial trend. It helps explain why first impressions can be so hard to shake off, even in the face of plenty of later counterevidence. 

Confirmation bias in news and politics. Confirmation bias can affect both the quality of news itself and news consumers. Low-quality journalism or outright fake news can be produced, intentionally or unintentionally, by writers who have a fixed view about the world in their heads that they take with them when they go to investigate a story. This leads them to cherry-pick stories and write slanted headlines that conform to their audience’s preferred narrative.

News consumers, meanwhile, may follow only hyper-partisan sources that run stories confirming previously held views. With social media, increasingly partisan cable news stations, and other online outlets, it has become much easier to build a personal “filter bubble” where you only see news and opinion that conforms to previously held beliefs.

The COVID-19 pandemic has provided ample evidence of this kind of confirmation bias. For example, those worried about the political fallout of the crisis could downplay it by focusing on single anecdotes — “my 85-year-old grandfather got COVID and was fine” — or isolated statistics that conform to their preferred narrative.

Under conditions like this, news consumers aren’t making informed judgments about politics or societal issues; they are just driving themselves toward emotionally held dogmatic beliefs: a kind of “confirmation bias on steroids.” Some studies have even suggested that these partisans aren’t even using the parts of their brains associated with reasoning when they consume political content.

How to Resist Confirmation Bias

The sciences offer a good example of how confirmation bias can be resisted. The scientific method, including randomized controlled trials and anonymous peer review, is designed to mitigate the effects of confirmation bias and related biases. These mechanisms routinely force scientists to question their assumptions, remove their own prejudices from the experimental process, and bring in outside viewpoints to confirm or disconfirm their results.

Obviously, the scientific method can’t — and shouldn’t — be applied wholesale to everyday, or even political, judgments, but the example of science can still offer useful pointers. And teaching critical thinking in science courses can be a great way to introduce young people to these biases.

Be explicit about your biases. Simply being honest with yourself about your prior beliefs can go a very long way. When people evaluate new evidence they always have prior beliefs going in. There’s nothing wrong with that, but if they take stock of what those beliefs are, they will be better able to recognize when they’re only looking at information that confirms those beliefs, or when they’re interpreting new evidence in skewed ways.

Ask yourself, “What if I’m wrong?” One important method in science is to run tests that would disconfirm a hypothesis. You can do the same in your everyday life by simply assuming, for the sake of argument, that you’re wrong. Once you take that perspective, look for the information that would prove you wrong.

You probably won’t change your mind, but you will have more information. You’ll be on much firmer ground for holding your beliefs, you’ll have more confidence in them, and you’ll be better able to argue for them. Perhaps most importantly, you’ll be more understanding of those who think differently and more tolerant of ambiguity.

Bring in an outside view. Anonymous peer review means that the academics reviewing scientific papers come to the experimental evidence with a fresh set of eyes and are not even aware who conducted the research. This removes the evaluation process as much as possible from any preconceptions the reviewer might have about the paper or its author. 

Everyday life doesn’t usually allow us to go to those lengths, but you can ask others for their opinion on a decision you have to make or a view you feel convinced by. You can also take steps not to prejudice them by avoiding saying what you think first. 

This provides us with yet more information and evidence free of our own entrenched perspective and potential confirmation biases. Taking steps like this not only prevents us from making mistakes; it can help open our minds to the world and perspectives around us, and, ultimately, enrich our lives.

Helen Lee Bouygues is the president of the Reboot Foundation
