The Backfire Effect: Contradictory Facts Might Make Your Beliefs Stronger

Thomas Jefferson is often credited with saying that an informed electorate is a prerequisite for democracy. The assumption behind that ideal is that people change their minds when faced with facts, and that their beliefs move ever closer to the truth. It turns out that's not always the case.

We'd Rather Be Right Than Correct

In 2010, political scientists Brendan Nyhan and Jason Reifler had two groups of people read articles about how Iraq had weapons of mass destruction before the U.S. invasion. One group then read an article correcting that information: the 2004 Duelfer report, which confirmed that the country had no such weapons. Of conservatives who read only the first article, 34 percent believed that Iraq had weapons of mass destruction before the invasion. But of conservatives who read both, that number climbed to 64 percent. Contradictory information didn't change their beliefs; it actually strengthened them.

Why We Dig In Our Heels

Why is this? The backfire effect can be seen as the flipside of confirmation bias. Confirmation bias makes you seek out information that agrees with your preexisting beliefs. The backfire effect is what you do when information that doesn't agree with those beliefs finds you. In both cases, your mind protects you from the pain of being wrong, a state of inconsistency that psychologists call "cognitive dissonance."

As Thomas Gilovich wrote in his book "How We Know What Isn't So," we tend to frame new information in a way that agrees with whatever we already believe. There are multiple types of backfire effects:

  • The overkill backfire effect describes what happens when people stick with a simple explanation rather than accept a complex rebuttal that is hard to process.
  • The familiarity backfire effect describes what happens when someone mistakenly remembers a falsehood as true: repeating a myth in order to debunk it makes the myth more familiar, while the specific arguments against it fade from memory.

We Are Fact-Resistant, Not Fact-Immune

Subsequent studies have had trouble replicating the results of Nyhan and Reifler's experiment with other ideological groups, and they paint a more optimistic picture of how facts shape argument. That research suggests that while most people won't drop lifelong beliefs at the first appearance of a contrary fact, the trustworthiness of the source and the fact's real-world implications make a significant difference. Likewise, a review of the research by the UK nonprofit Full Fact found that many studies turned up no evidence of the backfire effect, and that those that did observed it most often with contentious topics and ambiguous claims.

According to Philip Tetlock and Barbara Mellers, the most important factor in sidestepping cognitive bias is taking an outside view. When contradictory facts are presented from within the system under scrutiny, as they were in Nyhan and Reifler's study, it's easy to dismiss them as propaganda or "fake news." When those facts come from a dispassionate, trustworthy, external source, they are more likely to inspire curiosity, which is the first step toward gaining real knowledge.


Written by Austin Jesse Mitchell March 29, 2019
