
Motivated Reasoning Is Why You Can't Win an Argument Using Facts

It's hard to win an argument these days. You can present scientific studies, historical evidence, and even pictures or videos to back up your words, but some people just won't budge from their position. And some of those people have some pretty strange ideas, from flat-earthers to the "Avril Lavigne Is Dead" conspiracy theorists (yes, that really is a thing). Have you ever wondered why people refuse to change their beliefs, no matter how outlandish, even when faced with cold, hard facts? Well, it turns out that "motivated reasoning" is to blame — and we're all guilty of it.


Arguing With a Brick Wall

Psychologist Leon Festinger observed in 1956 that people are more likely to arrive at conclusions they want to reach. This might sound obvious, but the lengths our brains will go to in order to believe those conclusions are vast. When we want to believe something, we search for supporting evidence, and if we find even a single piece of pseudo-evidence, we give ourselves permission to believe — a justification that lets us stop thinking. This emotion-biased decision-making phenomenon is called "motivated reasoning," and it's based on the idea that emotions and motives trump facts and evidence. As social psychologist Jonathan Haidt once wrote, "the reasoning process is more like a lawyer defending a client than a judge or scientist seeking truth."

"If you're a motivated believer, then there's no way I can give you information to get you out of that belief," renowned skeptic and host of "The Skeptics' Guide to the Universe" Dr. Steven Novella told us on the Curiosity Podcast. "You have to, at some point, have some insight into your own psychology. Otherwise, you're going to use tools to reinforce the belief you wanted in the first place. You have to, at some point, confront the psychology of belief."

The power of motivated reasoning is hard to overstate. In a 1986 study, subjects who scored poorly on an IQ test later chose to read articles criticizing the validity of IQ tests rather than articles supporting them. In a 1992 study involving a health test, participants who received an undesirable prognosis found more reasons why the test results might not be accurate than healthier participants did. Motivated reasoning can even influence what a person physically sees: Subjects in a 2006 study were more likely to interpret an ambiguous symbol on a screen as a letter rather than a number when they were given an incentive in advance to do so.

In Haidt's landmark book "The Righteous Mind: Why Good People Are Divided by Politics and Religion," he sums up why motivated reasoning can cause a headache for scientists in particular: "now that we all have access to search engines on our cell phones, we can call up a team of supportive scientists for almost any conclusion twenty-four hours a day. Whatever you want to believe about the causes of global warming or whether a fetus can feel pain, just Google your belief ... Science is a smorgasbord, and Google will guide you to the study that's right for you."

Oh, and do you think you're too smart to engage in motivated reasoning? Think again. "Ironically, smart people are better at it," Novella said. "If you're better educated, you have some knowledge of science and some knowledge of critical thinking. That just feeds your motivated reasoning."

A Skeptic's Guide to Motivated Reasoning

If even the smartest among us are guilty of motivated reasoning, and it's easier than ever to find evidence to support our beliefs, then how do we combat motivated reasoning? Various theories have sprung up around the web on how to argue with a motivated opponent, with the Socratic Method being a particularly compelling option. When it comes to saving yourself from this habit, scientific skepticism is a good place to start — but it's not easy.

"Being a skeptic means that you consciously prioritize having beliefs that are valid over beliefs that are not valid," Novella told us. "You have to care more about the process of how you go about evaluating beliefs than any particular conclusion. You have to relish being proven wrong as an opportunity to change your belief and to make it less wrong."

A good place to start is to be particularly critical of sources that support your beliefs. "I'm always the most suspicious of beliefs that I have or conclusions that I come to that are in line with my own ideology," Novella explained. "So if I have a particular worldview and something supports my worldview, then I have to be especially suspicious of it. Because that's when I'm going to be most vulnerable. That's when my motivated reasoning and confirmation bias are going to try hard to engage ... but that's exactly when you should question it the most. It's a high-energy state, and it takes a lot of vigilance and a lot of practice and a lot of dedication. It's a life-long practice, and there's no shortcut to that. You just have to really be dedicated to policing your own thinking."

In short: don't believe everything you read ... except for this article, of course!

You can hear our full conversation with Steven Novella on the Curiosity Podcast. Stream or download the episode using the player below, or find it everywhere podcasts are found, including iTunes, Stitcher, and Gretta. You can also pick up his book: "The Skeptics' Guide to the Universe: How to Know What's Really Real in a World Increasingly Full of Fake." The audiobook is free with an Audible trial. If you choose to make a purchase through that link, Curiosity will get a share of the sale.

Written by Cody Gough September 25, 2017
