Jan 04 2018

Backfire Effect Not Significant

Previous research has shown that when confronted with a factual statement that appears to go against an ideologically held belief, a percentage of people tested will move their position away from the factual information – a so-called “backfire effect.” This notion was rapidly incorporated into the skeptical narrative, because it seems to confirm our perception that it is very difficult to change people’s minds.

However, more recent research suggests that the backfire effect may not exist, or at least is exceedingly rare. A recently published series of studies puts a pretty solid nail in the coffin of the backfire effect (although this probably won’t be the last word).

To be clear, people generally still engage in motivated reasoning when emotions are at stake. There is clear evidence that people filter the information they seek, notice, accept, and remember. Ideology also predicts how much people will respond to factual correction.

The backfire effect, however, is very specific. This occurs when people not only reject factual correction, but create counterarguments against the correction that move them further in the direction of the incorrect belief. It’s probably time for us to drop this from our narrative, or at least deemphasize it and put a huge asterisk next to any mention of it.

The new paper from Wood and Porter looked collectively at 10,100 subjects across 52 issues. The subjects were recruited online through Amazon's Mechanical Turk. Subjects were asked to indicate how much they agree with political statements, such as, "President Obama has been more tolerant of illegal immigration than previous presidents."

They were then either exposed or not exposed to a factual statement, such as, "In fact, according to the Department of Homeland Security, President Obama has deported illegal immigrants at twice the rate of his predecessor, President George W. Bush."

They were again asked to indicate the extent of their agreement with the original statement. The study measured to what extent the factual correction changed their position. They found:

Among liberals, 85% of issues saw a significant factual response to correction, among moderates, 96% of issues, and among conservatives, 83% of issues. No backfire was observed for any issue, among any ideological cohort.

These studies indicate that liberals and conservatives are about the same in their response to factual information, with a generally strong response to factual correction, and neither group experienced a significant backfire effect. Moderates had the best response to factual correction, indicating perhaps a cognitive advantage to not having a strong ideological or partisan identity.

The authors discuss at length what all this means, including how to interpret these results considering prior research. First, we have to consider the study population. The original studies showing a backfire effect used graduate students, and may not reflect the general population. The current studies rely on Mechanical Turk recruits, and again may be a biased sample. However, the fifth study compared the results to a nationally representative sample, and found similar results.

For me there are two main limitations of this study – the first is that it is difficult to extrapolate from the artificial setting of a psychological study to an emotional discussion around the dinner table (or in the comments to a blog). It seems likely that people are much more willing to be reasonable in the former setting.

Second, we have no idea how persistent the correction effect is. People may immediately correct their belief, but then quickly forget the new information that runs counter to their narrative. That would be consistent with my personal experience, at least some of the time. It seems I can correct someone’s false information, with objective references, but then a month later they repeat their original claim as if the prior conversation never happened. I would love to see some long term follow up to these studies.

So if people do not respond to ideologically inconvenient facts by forming counterarguments and moving away from them (again – that is the backfire effect) then what do they do? The authors discuss a competing hypothesis, that people are fundamentally intellectually lazy. In fact, forming counterarguments is a lot of mental work that people will tend to avoid. It is much easier to just ignore the new facts.

Further there is evidence that to some extent people not only ignore facts, they may think that facts are not important. They may conclude that the specific fact they are being presented is not relevant to their ideological belief. Or they may believe that facts in general are not important.

What that generally means is that they dismiss facts as being biased and subjective. You have your facts, but I have my facts, and everyone is entitled to their opinion – meaning they get to choose which facts to believe.

Of course, all of this is exacerbated by the echo chamber effect. People overwhelmingly seek out sources of information that are in line with their ideology.

I think it is very important to recognize that the backfire effect is a small or perhaps even nonexistent phenomenon. The problem with belief in the backfire effect is that it portrays people as hopelessly biased, and suggests that attempts at educating people or changing their minds are fruitless. It suggests that the problem of incorrect beliefs is an unfixable inherent problem with human psychology.

Certainly there are psychological effects strongly at play when it comes to how people form their beliefs, but immunity to facts is not necessarily one of them. Rather, it seems that culture and behavior play a large role, and those are modifiable variables.

I do think it is important for people to generally recognize the negative effect that strong partisan identity and strong ideology have on their ability to reason. The ideal to which we should strive is the Bayesian approach – we evaluate all factual information in an unbiased manner and form our conclusions based on those facts, updating them as necessary. We deviate from the Bayesian ideal when motivated by emotion, identity, and even just laziness.
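To make the Bayesian ideal a little more concrete, here is a minimal sketch of a single belief update in Python. It is only an illustration of the idea; the function name, variable names, and numbers are hypothetical and have nothing to do with the Wood and Porter data.

```python
def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Posterior probability of a claim after seeing one piece of evidence.

    prior: credence in the claim before the evidence (0 to 1)
    p_evidence_if_true: probability of seeing this evidence if the claim is true
    p_evidence_if_false: probability of seeing this evidence if the claim is false
    """
    numerator = p_evidence_if_true * prior
    denominator = numerator + p_evidence_if_false * (1 - prior)
    return numerator / denominator

# Illustrative numbers only: start out 70% confident in a claim, then encounter
# an official statistic that would be unlikely if the claim were true.
belief = 0.70
belief = bayes_update(belief, p_evidence_if_true=0.1, p_evidence_if_false=0.8)
print(round(belief, 2))  # ~0.23 - confidence drops toward what the evidence supports
```

The point is simply that each piece of reliable evidence should move our confidence in the direction the evidence indicates, by an amount proportional to how diagnostic that evidence is – which is exactly what motivated reasoning and intellectual laziness prevent.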

Further, we need to think about how we come by our information, because this can strongly bias what information we know and believe. If we passively go with the flow of our identity, we will tend to cocoon ourselves in a comfortable echo chamber that will bathe us only in facts that have been curated for maximal ideological ease. This feedback loop will not only maintain our ideology but polarize it, making us more radical and less reasonable.

Ideally, therefore, we should be emotionally aloof to any ideological identity, to any particular narrative or belief system. Further, we should seek out information based upon how reliable it is, rather than how much it confirms what we already believe or want to believe. In fact, to correct for this bias we should specifically seek out information that contradicts our current beliefs.

These are behaviors that anyone can cultivate. We are not destined to wallow in our existing narratives, immune to facts and logic. It does take a lot of work, however, and perhaps in the end this is the biggest barrier – simple intellectual laziness.

13 thoughts on “Backfire Effect Not Significant”

  1. Kabbor says:

    Surely there must be a backfire effect. After reading this article I’m more certain than ever that the backfire effect is real!

    Sorry, someone had to do it.

    Interesting article, I hope that it is true, but as you said the long-term takeaway is what matters when it comes to changing minds. It certainly seems, from the unsinkable rubber duckies, that at least some people are more than willing to disregard factual information when it interferes with their narrative, at least some of the time.

  2. Mick West says:

    The primary problem with this study is that it is only measuring the IMMEDIATE effect of corrections. As they say in the final sentence of the discussion, there’s little backfire effect to correcting ideologically biased misinformation “at least for a brief moment”. It tells us nothing about what might happen weeks or months later. In fact the design of the study seems more like a reading comprehension test than a measure of changes in belief.

    I’d recommend people have a look at the overview of backfire effects in The Debunking Handbook by Cook & Lewandowsky (free online). They identify three types: Familiarity Backfire, Overkill Backfire, and Worldview Backfire. Worldview backfire (which the Wood & Porter study measures) is more manifest as a disconfirmation bias, something which Wood and Porter dismiss, but don’t measure – not because people are too lazy to come up with alternative explanations, but because the immediate nature of the study does not allow the participants time for any mental gymnastics. The other two forms of backfire are likewise things that happen over time.

    So I’d not put too large an asterisk on the backfire effect just yet.

  3. googolplexbyte says:

    The study didn’t leave enough wiggle room for a backfire effect.

    It’s very hard to come up with a counter-argument that would fit their beliefs with the facts.

    If the post-correction question had been slightly different like:

    Indicate how much you agree with political statements, such as, “President Obama has been more tolerant of illegal MEXICAN immigration than previous presidents.”

    That way you could compare changes in agreement between the general and specific statement, with and without the correction.

  4. FuzzyMarmot says:

    Here is a great, longer read on the recent research on the backfire effect:

    https://slate.com/health-and-science/2018/01/weve-been-told-were-living-in-a-post-truth-age-dont-believe-it.html

  5. B.S. says:

    I think that the backfire effect is most likely an emotional response. I’m reading “Crucial Conversations” right now and this book describes emotional responses to uncomfortable conversations – attacking someone who disagrees with you (perceived as an adversary) and defending yourself without thinking make up a huge portion of the book. This model seems to fit both anecdotal observations of the backfire effect and this new research.
    The Mechanical Turk questions appear to be emotionless and have no cues from an opponent with an opposing view. The corrections were all “neutral data from [cited] governmental sources.” I’d bet that changing the factual correction to “No it isn’t you asshole! President Obama has deported illegal immigrants at twice the rate of Bush!” (note no source cited, because we rarely remember them in conversations) would elicit some sort of backfire effect that would likely be even larger if delivered emotionally and in person by an “adversary”. Maybe this all means that the key to eliminating any backfire effect is removing emotion from your response and accurately citing neutral sources. Maybe this means that dispassionate real-time fact checking of politicians could actually make a difference. Regardless, this is an interesting addition to the literature and conversation. It restores some of my hope.

  6. Gojira74 says:

    I would have thought intellectual laziness was more likely than active recoil. I see a lot of “ignoring facts” via red herrings (Well, yeah but OBAMA was a Muslim). I rarely see someone pick up what I said and then use it to push themselves farther away. Anecdotes aren’t evidence, but it’s nice when the evidence agrees with my personal observations (it’s nice when it disagrees as well because I’d hate to sit around being wrong).

    While it can FEEL pointless to “argue facts on the internet,” if one chooses the right opponents (i.e. people who are at least vaguely reasonable and knowledgeable), those challenges can help refine one’s own beliefs. Sometimes, an ideologically motivated assumption can be buried, or it can sit comfortably alongside facts until such time as it is challenged directly. Having someone disagree with you, no matter how confident you think you are, can bear fruit for you, regardless of what THEY do with their lives. If you can’t explain clearly why you think certain things, then you haven’t thought about them enough IMHO.

  7. BillyJoe7 says:

    “The authors discuss a competing hypothesis, that people are fundamentally intellectually lazy”

    This is in line with the opinion of Daniel Kahneman as expressed in his book “Thinking, Fast and Slow”. The main characteristic of his (cognitive) system 2 is that it is LAZY. It mostly just goes along with whatever the (intuitive) system 1 suggests.

    “People overwhelmingly seek out sources of information that are in line with their ideology”

    As opposed to seeking out sources of reliable information, meaning sources that are free of ideological bias and based on scientifically derived facts and opinions, and conclusions based on those facts.

    “The problem with belief in the backfire effect is that it…suggests that attempts at educating people or changing their mind is fruitless”

    Which is obviously untrue. For example, I would think that the majority of commenters on this blog have come from previously entrenched ideological positions or positions far removed from being based on scientifically derived facts and conclusions.

    “The ideal to which we should strive is the Bayesian approach – we evaluate all factual information in an unbiased manner and form our conclusions based on those facts, updating them as necessary”

    I would add that, where our knowledge and experience, and our ability to evaluate the information are limited, we should seek out and provisionally trust the consensus conclusions of unbiased experts in that field of knowledge, including the consensus of what CANNOT be concluded from the information (i.e. pseudoscientific conclusions and wild speculations extrapolating far beyond the available evidence); and where there is no consensus, to accept that we just do not know.

    “we should specifically seek out information that contradicts our current beliefs”

    However, exploring “alternatives” can prove to be an enormous time waster. We certainly need to pick and choose what “information that contradicts our current beliefs” we actually “seek out”. For example, I don’t think I can be bothered anymore with exploring creationism, teleology, Thomism, intelligent universe, or most of what passes as climate scepticism.

  8. BillyJoe7 says:

    Speaking of time constraints, has anyone actually had the time to read that 68-page research paper on the backfire effect?
    I have to admit I’ve read only sections 1,9,10,11 (7 pages), and skimmed the rest.

  9. FuzzyMarmot says:

    So, for some reason the moderation doesn’t want to let me post a link from Slate about the backfire effect. I highly recommend people check out an article by Daniel Engber published yesterday, called “LOL something matters”. It is a long read, but worth it, and really puts the research in context.

  10. NiroZ says:

    I’d wager that the reason for this would be in line with the research on motivational interviewing (a therapy technique) as well as the research around stigma, shame and vulnerability. Basically, when people make arguments that appear to be part of the ‘backfire’ effect, they’re actually responding to the feeling of being cornered, the loss of control and power in being found incorrect, and the possible sense of alienation they feel about identifying with an ‘incorrect’ belief. If this is correct, it’s likely that these people would, under the right circumstances, or with people they feel safe with, admit that X belief is wrong, but that they need to adhere to it for other reasons (to belong in a group, to annoy someone they dislike, to avoid losing face).

  11. Nidwin says:

    From my experience the backfire effect kicks in when folks can’t say “woops, was I wrong on that one”.

    Folks only change their minds as long as the subject doesn’t breach their little personal cocoon. And even then it’s often FIFO (first in first out).

  12. jayarava says:

    There is growing evidence that sceptics are too credulous when it comes to results that justify their disdain for religious belief. Being all too eager to believe in the “backfire effect” is just another example of this bias. Cultivating and defending bias is anti-science.

    My own views on this issue were completely changed by reading Mercier and Sperber’s book “The Enigma of Reason” in 2017. And before that by Justin L. Barrett’s book “Why Would Anyone Believe in God?” I also think of Damasio and his characterisation of how emotions help us decide the *salience* of information to our decision making process (via the ventromedial prefrontal cortex).

    We are a long way from understanding rationality or belief and hampered by a load of unhelpful legacy ideas because they appeal to the biases of militant atheists.

  13. BillyJoe7 says:

    FM: “I highly recommend people check out an article by Daniel Engber”

    Here is the link:

    https://slate.com/health-and-science/2018/01/weve-been-told-were-living-in-a-post-truth-age-dont-believe-it.html

    The article IS worth reading, longish as it is.
    It provides the history of opinions and studies for and against “The Backfire Effect”.
    It also references studies that suggest that “The Echo-chamber Effect” may not be real.
