The Backfire Effect: When You Hear Contradictory Evidence, Your Beliefs Get Stronger

Thomas Jefferson is often credited with saying that an informed electorate is a prerequisite for democracy. Recent research, however, finds that being informed may not be as beneficial as we think. In a study published in 2010, political scientists Brendan Nyhan and Jason Reifler had two groups of people read a news article claiming that Iraq had weapons of mass destruction before the U.S. invasion. One group then read a second article correcting that claim: coverage of the 2004 Duelfer Report, which found that the country had no such weapons. Of the conservatives who read only the first article, 34% believed that Iraq had weapons of mass destruction before the invasion. But of the conservatives who read both, that number climbed to 64%. The contradictory information didn't weaken their belief; it strengthened it.

Why does this happen? The backfire effect can be seen as the flip side of confirmation bias. Confirmation bias makes you seek out information that agrees with your preexisting beliefs; the backfire effect kicks in when information that disagrees with those beliefs finds you. In both cases, your mind is protecting you from the pain of being wrong. As Thomas Gilovich wrote in his book How We Know What Isn't So, "For desired conclusions...it is as if we ask ourselves, 'Can I believe this?', but for unpalatable conclusions we ask 'Must I believe this?'" Learn more about your tendency toward bias in the videos below.

Why Facts Won't Help You Win Arguments

People don't like to be wrong.

Why You Can't Win Arguments Online

Things get even stickier in an electronic environment.

How To Avoid The Backfire Effect

Bayes' rule can help you avoid falling victim to your own bias.
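To see how a Bayesian update should go, here is a minimal sketch in Python. The scenario and every probability in it are hypothetical placeholders for illustration, not figures from the research above.

def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Apply Bayes' rule: P(H|E) = P(E|H) * P(H) / P(E)."""
    p_evidence = (p_evidence_if_true * prior
                  + p_evidence_if_false * (1 - prior))
    return p_evidence_if_true * prior / p_evidence

# Hypothetical numbers: you start out 90% sure a claim is true, then a
# credible report contradicts it. Suppose such a report would surface
# 10% of the time if the claim were true and 70% of the time if it
# were false.
posterior = bayes_update(0.9, 0.1, 0.7)
print(round(posterior, 2))  # 0.56

A rational reader's confidence drops from 0.90 to about 0.56. The backfire effect is the opposite move: walking away from the same contradiction more confident than before.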
