
The Backfire Effect: Contradictory Facts Can Make Your Beliefs Stronger

Thomas Jefferson is often credited with saying that an informed electorate is a prerequisite for democracy. The idea assumes that people change their minds when faced with facts, and that their beliefs move ever closer to the truth. It turns out they often don't.

We'd Rather Be Right Than Correct

In 2010, political scientists Brendan Nyhan and Jason Reifler had two groups of people read articles suggesting that Iraq had weapons of mass destruction before the U.S. invasion. One group then read an article correcting that information: the 2004 Duelfer Report, which found that the country had no such weapons. Of conservatives who read only the first article, 34% believed that Iraq had weapons of mass destruction before the invasion. But of conservatives who read both, that number climbed to 64%. The contradictory information didn't just fail to change their beliefs; it actually strengthened them.

Why We Dig In Our Heels

Why does this happen? The backfire effect can be seen as the flip side of confirmation bias. Confirmation bias makes you seek out information that agrees with your preexisting beliefs; the backfire effect is what you do when information that disagrees with those beliefs finds you. In both cases, your mind protects you from the discomfort of holding inconsistent beliefs, a state psychologists call "cognitive dissonance."

As Thomas Gilovich wrote in his book How We Know What Isn't So, we tend to frame new information in a way that agrees with whatever we already believe. The backfire effect also comes in more than one form. The overkill backfire effect describes what happens when people stick with a simple myth rather than a complex rebuttal that is hard to process. The familiarity backfire effect describes what happens when repeating a falsehood in order to debunk it only makes it more familiar, so that people later remember the falsehood as true.

We Are Fact-Resistant, not Fact-Immune

Researchers have since had trouble replicating Nyhan and Reifler's results with other ideological groups, which offers a more optimistic picture of how facts affect our beliefs. Follow-up research suggests that while most people are unwilling to drop lifelong beliefs at the first appearance of a contrary fact, the trustworthiness of the source and the practical implications of the fact make a major difference.

According to Philip Tetlock and Barbara Mellers, the most important factor in sidestepping cognitive bias is taking an outside view. When contradictory facts are presented from within the system under scrutiny – as in Nyhan and Reifler's study – they're easy to dismiss as propaganda or "fake news." When those facts come from a dispassionate, trustworthy, external source, they're more likely to inspire curiosity, which is the first step toward gaining real knowledge.


Written by Austin Jesse Mitchell, October 13, 2016
