In an uncertain world, people love to be right because it helps us make sense of things. Indeed, some psychologists think the need to be right is akin to a basic drive.
One of the ways we strive to be right is by looking for evidence that confirms what we already believe, sometimes with depressing or comic results:
- A woman hires a worker who turns out to be incompetent. She doesn’t notice that everyone else is doing his work for him because she is so impressed that he shows up every day, right on time.
- A sports fan who believes his team is the best seems to remember only the matches they won, and none of the embarrassing defeats to inferior opponents.
- A man who loves the country life, but has to move to the city for a new job, ignores the flight path he lives under and the noisy neighbours from hell, and tells you how much he enjoys the farmers’ market and tending his window box.
We do it automatically, usually without realising it. We do it partly because it’s easier to see where new pieces fit into the picture-puzzle we are working on than to imagine a whole new picture. It also helps shore up our vision of ourselves as accurate, right-thinking, consistent people who know what’s what.
Psychologists call it the confirmation bias, and it creeps into all sorts of areas of our lives. Here are a few examples:
“Hey, you look great, have you done something different with your hair?”
Who doesn’t like a compliment? No one. It doesn’t even have to be sincerely delivered; I’ll take it. But what about…
“Hey, you’re a real slime-ball, you know that?”
Who likes insults? Well, we don’t exactly like them but—believe it or not—sometimes we seek them out if they confirm our view of ourselves.
In one study that examined this, people actually sought out information confirming their own view that they were, if not exactly slime-balls, then lazy, slow-witted or not very athletic (Swann et al., 1989).
And this isn’t some kind of self-hating thing; in this study even people with high self-esteem sought out information that confirmed their own negative self-views.
It seems we like to be right, even at a cost to our self-image.
A study of online stock market investors looked at how they gathered information about prospective stocks (Park et al., 2010).
The researchers found the confirmation bias writ large: investors mostly looked for information that confirmed their hunch about a particular stock. Those who displayed the strongest confirmation bias were the most over-confident investors and, consequently, made the least money.
It seems we like to be right, even if it costs us money.
People see what they want to see in politics all the time.
The most ironic example is in satire, which often uses sarcasm to make its point: TV satirist Stephen Colbert frequently says the opposite of what he really thinks (amongst comedians I believe these are called ‘jokes’).
Except the irony is that one study has shown that people who don’t agree with Colbert don’t get that he’s being sarcastic; they think he really means it (LaMarre, 2009).
The beauty is that both liberals and conservatives get what they want: their viewpoints confirmed.
It seems we like to be right, even if it means not getting the joke.
Blame it on the bias
Over the years the confirmation bias has picked up the blame for all sorts of dodgy beliefs. Here are a few:
- People are prejudiced (partly) because they only notice facts which fit with their preconceived notions about other nations or ethnicities.
- People believe weird stuff about flying saucers, the JFK assassination, astrology, Egyptian pyramids and the moon landings because they only look for confirmation not dis-confirmation.
- In the early nineteenth century doctors treated any old disease with blood-letting. Their patients sometimes got better, so doctors, who conveniently ignored all the people who died, figured it must be doing something. In fact, for many ailments, some people will always get better on their own without any treatment at all.
Fight the bias
The way to fight the confirmation bias is simple to state but hard to put into practice.
You have to try to think up, and test out, alternative hypotheses. Sounds easy, but it’s not in our nature. It’s no fun thinking about why we might be misguided or have been misinformed. It takes a bit of effort.
It’s distasteful reading a book which challenges our political beliefs, considering criticisms of our favourite film or even accepting how different people choose to live their lives.
Trying to be just a little bit more open is part of the challenge that the confirmation bias sets us. Can we entertain those doubts for just a little longer? Can we even let the facts sway us and perform that most fantastical of feats: changing our minds?
About the author
Psychologist Jeremy Dean, PhD, is the founder and author of PsyBlog. He holds a doctorate in psychology from University College London and two other advanced degrees in psychology.
He has been writing about scientific research on PsyBlog since 2004. He is also the author of the book “Making Habits, Breaking Habits” (Da Capo, 2013) and several ebooks:
- Accept Yourself: How to feel a profound sense of warmth and self-compassion
- The Anxiety Plan: 42 Strategies For Worry, Phobias, OCD and Panic
- Spark: 17 Steps That Will Boost Your Motivation For Anything
- Activate: How To Find Joy Again By Changing What You Do
→ This post is part of a series on cognitive biases:
- The Dunning-Kruger Effect: Why The Incompetent Don’t Know They’re Incompetent
- The Worse-Than-Average Effect: When You’re Better Than You Think
- Why You’re a Sucker for the Impact Bias
- The Hindsight Bias: I Knew It All Along!
- How to Overcome the Egocentric Bias
- See How Easily You Can Avoid The Memory Bias
- 4 Belief Biases That Can Reduce Pleasure
- Does Delaying Decisions Lead to Better Outcomes?
- The Belief in a Just World: A Fundamental Delusion
- Why Society Doesn’t Change: The System Justification Bias
- The Availability Bias: Why People Buy Lottery Tickets
- The Illusion of Transparency
- The Illusion of Control: Are There Benefits to Being Self-Deluded?
- The Endowment Effect: Why It’s Easy to Overvalue Your Stuff
- Illusory Correlations: When The Mind Makes Connections That Don’t Exist
- Anchoring Effect: How The Mind is Biased by First Impressions
- The Confirmation Bias: Why It’s Hard to Change Your Mind
- The Well-Travelled Road Effect: Why Familiar Routes Fly By
- How a Psychological Bias Makes Groups Feel Good About Themselves And Discredit Others
- The Sobering Up Effect: Why People Get More Pessimistic As The Moment of Truth Gets Closer