Jan 04 2018
Backfire Effect Not Significant
Previous research has shown that when confronted with a factual statement that appears to go against an ideologically held belief, a percentage of people tested will move their position away from the factual information – a so-called “backfire effect.” This notion was rapidly incorporated into the skeptical narrative, because it seems to confirm our perception that it is very difficult to change people’s minds.
However, more recent research suggests that the backfire effect may not exist, or at least is exceedingly rare. A recently published series of studies puts a pretty solid nail in the coffin of the backfire effect (although this probably won’t be the last word).
To be clear, people generally still engage in motivated reasoning when emotions are at stake. There is clear evidence that people filter the information they seek, notice, accept, and remember. Ideology also predicts how much people will respond to factual correction.
The backfire effect, however, is very specific: it occurs when people not only reject a factual correction but construct counterarguments against it that move them further toward the incorrect belief. It's probably time for us to drop this from our narrative, or at least deemphasize it and put a huge asterisk next to any mention of it.

One critical question for the skeptical enterprise is the notion of a backfire effect – when someone is given factual information about a myth they believe, do they update and correct their belief, or do they dig in their heels and believe the myth even more strongly? Some studies worryingly show that sometimes people do dig in their heels, or simply misremember the corrective information.
A recent commentary on Forbes advises:
“There is a cult of ignorance in the United States, and there has always been. The strain of anti-intellectualism has been a constant thread winding its way through our political and cultural life, nurtured by the false notion that democracy means that ‘my ignorance is just as good as your knowledge.’”
There is a common style of journalism, one you are almost certainly familiar with, in which the report opens with a personal story, then delves into the facts at hand, often with reference to the framing story and others like it, and returns at the end to the original personal connection. This format is so common it’s a cliché, and often the desire to connect the actual new information to an emotional story takes over the reporting and undermines the facts.
The question at the core of science communication and the skeptical movement is – how do we change opinions about science-related topics? That is the ultimate goal, not just to give information but to inform people, to change the way they think about things, to build information into a useful narrative that helps people understand the world and make optimal (or at least informed) decisions.
The human brain is plagued with cognitive biases – flaws in how we process information that cause our conclusions to deviate from the most accurate description of reality possible with available evidence. This should be obvious to anyone who interacts with other human beings, especially on hot topics such as politics or religion. You will see such biases at work if you peruse the comments to this blog, which is rather a tame corner of the social media world.
One thing I have learned as a science communicator over the last two decades, trying to digest many areas of science, is that stuff is complicated. It is a good rule of thumb that everything is more complicated than you might originally think.