Jan 04 2018

Backfire Effect Not Significant

Previous research has shown that when confronted with a factual statement that appears to go against an ideologically held belief, a percentage of people tested will move their position away from the factual information – a so-called “backfire effect.” This notion was rapidly incorporated into the skeptical narrative, because it seems to confirm our perception that it is very difficult to change people’s minds.

However, more recent research suggests that the backfire effect may not exist, or at least is exceedingly rare. A recently published series of studies puts a pretty solid nail in the coffin of the backfire effect (although this probably won’t be the last word).

To be clear, people generally still engage in motivated reasoning when emotions are at stake. There is clear evidence that people filter the information they seek, notice, accept, and remember. Ideology also predicts how much people will respond to factual correction.

The backfire effect, however, is very specific. It occurs when people not only reject factual correction, but create counterarguments against the correction that move them further in the direction of the incorrect belief. It’s probably time for us to drop this from our narrative, or at least deemphasize it and put a huge asterisk next to any mention of it.

The new paper from Wood and Porter looked collectively at 10,100 subjects across 52 issues. The subjects were recruited online from Amazon’s Mechanical Turk. Subjects were asked to indicate how much they agree with political statements, such as, “President Obama has been more tolerant of illegal immigration than previous presidents.”

They were then either exposed or not exposed to a factual statement, such as, “In fact, according to the Department of Homeland Security, President Obama has deported illegal immigrants at twice the rate of his predecessor, President George W. Bush.”

They were again asked to indicate the extent of their agreement with the original statement. The study measured to what extent the factual correction changed their position. They found:

Among liberals, 85% of issues saw a significant factual response to correction; among moderates, 96% of issues; and among conservatives, 83% of issues. No backfire was observed for any issue, among any ideological cohort.

These studies indicate that liberals and conservatives are about the same in their response to factual information, with a generally strong response to factual correction, and neither group experienced a significant backfire effect. Moderates had the best response to factual correction, indicating perhaps a cognitive advantage to not having a strong ideological or partisan identity.

The authors discuss at length what all this means, including how to interpret these results considering prior research. First, we have to consider the study population. The original studies showing a backfire effect used graduate students, and may not reflect the general population. The current studies rely on Mechanical Turk recruits, and again may be a biased sample. However, the fifth study compared the results to a nationally representative sample, and found similar results.

For me there are two main limitations of this study – the first is that it is difficult to extrapolate from the artificial setting of a psychological study to an emotional discussion around the dinner table (or in the comments to a blog). It seems likely that people are much more willing to be reasonable in the former setting.

Second, we have no idea how persistent the correction effect is. People may immediately correct their belief, but then quickly forget the new information that runs counter to their narrative. That would be consistent with my personal experience, at least some of the time. It seems I can correct someone’s false information, with objective references, but then a month later they repeat their original claim as if the prior conversation never happened. I would love to see some long term follow up to these studies.

So if people do not respond to ideologically inconvenient facts by forming counterarguments and moving away from them (again, that is the backfire effect), then what do they do? The authors discuss a competing hypothesis: that people are fundamentally intellectually lazy. Forming counterarguments is a lot of mental work that people will tend to avoid. It is much easier to just ignore the new facts.

Further, there is evidence that to some extent people not only ignore facts, they may think that facts are not important. They may conclude that the specific fact they are being presented with is not relevant to their ideological belief. Or they may believe that facts in general are not important.

What that generally means is that they dismiss facts as being biased and subjective. You have your facts, but I have my facts, and everyone is entitled to their opinion – meaning they get to choose which facts to believe.

Of course all of this is exacerbated by the echo chamber effect. People overwhelmingly seek out sources of information that are in line with their ideology.

I think it is very important to recognize that the backfire effect is a small or perhaps even nonexistent phenomenon. The problem with belief in the backfire effect is that it portrays people as hopelessly biased, and suggests that attempts at educating people or changing their minds are fruitless. It suggests that the problem of incorrect beliefs is an unfixable inherent problem with human psychology.

Certainly there are psychological effects strongly at play when it comes to how people form their beliefs, but immunity to facts is not necessarily one of them. Rather, it seems that culture and behavior play a large role, and those are modifiable variables.

I do think it is important for people to generally recognize the negative effect that strong partisan identity and strong ideology have on their ability to reason. The ideal to which we should strive is the Bayesian approach – we evaluate all factual information in an unbiased manner and form our conclusions based on those facts, updating them as necessary. We deviate from the Bayesian ideal when motivated by emotion, identity, and even just laziness.
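For those who want the formal version, the Bayesian ideal is just the familiar updating rule – a minimal illustration, with symbols of my own choosing rather than anything from the studies discussed. For a belief H and new evidence E:

\[ P(H \mid E) \;=\; \frac{P(E \mid H)\,P(H)}{P(E)} \]

In words, how much we believe H after seeing E should depend only on how strongly we believed it beforehand and how well the evidence fits. Motivated reasoning amounts to leaving our confidence in H unchanged no matter what E says, and a true backfire effect would mean updating in the direction opposite to where the evidence points.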

Further, we need to think about how we come by our information, because this can strongly bias what information we know and believe. If we passively go with the flow of our identity, we will tend to cocoon ourselves in a comfortable echo chamber that will bathe us only in facts that have been curated for maximal ideological ease. This feedback loop will not only maintain our ideology but polarize it, making us more radical and less reasonable.

Ideally, therefore, we should be emotionally aloof to any ideological identity, to any particular narrative or belief system. Further, we should seek out information based upon how reliable it is, rather than how much it confirms what we already believe or want to believe. In fact, to correct for this bias we should specifically seek out information that contradicts our current beliefs.

These are behaviors that anyone can cultivate. We are not destined to wallow in our existing narratives, immune to facts and logic. It does take a lot of work, however, and perhaps in the end this is the biggest barrier – simple intellectual laziness.
