Aug 15 2017

More on the Backfire Effect

One critical question for the skeptical enterprise is the notion of a backfire effect – when someone is given factual information about a myth that they believe, do they update and correct their beliefs, or do they dig in their heels and believe the myth even more strongly? Worryingly, some studies show that people sometimes dig in their heels, or simply misremember the corrective information.

A new study sheds further light on this question, although it is not, by itself, the definitive final answer (one study rarely is).

For background, prior studies have shown several effects of interest. First, from memory research we know that people store facts separately from the source of those facts and from the truth-status of those facts. That is why people will often say, “I heard somewhere that…” They may not even remember whether the tidbit is true, but the idea itself is much easier to remember, especially if it is dramatic or resonates with some narrative or belief.

So, if you tell someone that there is no evidence linking vaccines and autism, they are most likely to remember something about a link between vaccines and autism, but not remember where the information came from or whether the link is real. That, at least, is the concern.

The research on this topic is actually a bit complex because there are numerous variables. There are factors about the subjects themselves: their age, their baseline beliefs, their intentions, and the intensity of those beliefs. There are different types of information to give: positive information (vaccines are safe), information dispelling negative claims, graphic information, and fear-based information (pictures of sick unvaccinated kids). There are different topics – political vs scientific – with different levels of emotional attachment to the beliefs. There is belief vs intention – do you think vaccines are safe vs do you intend to vaccinate your kids? Finally, there is timing: immediate vs delayed effects.

I have written about this previously here. Rather than going through every study, let me just summarize the main findings. First, many people will update their beliefs when given corrective information. The backfire effect is not universal. People are more likely to update their beliefs with new information if they are younger, if they do not already have firm beliefs on the topic, and if they are not emotionally invested.

However, people don’t always update their beliefs, sometimes simply because of flawed memory, even when ideology is not a factor. Not all information is equally “sticky.” Our narratives seem to have inertia, even if they are not that emotionally important. It takes work, apparently, to update your beliefs.

With regard specifically to vaccines, prior studies have found that people who are already anti-vaccine do not change their views with new information. In one study, in fact, people with strong anti-vaccine views incorporated the corrective information somewhat, but despite this their intention to vaccinate was reduced. This effect did not replicate in one follow-up study, however.

Here is what the new study found:

Our study provided further support to the growing literature showing how corrective information may have unexpected and even counter-productive results. Specifically, we found that the myths vs. facts format, at odds with its aims, induced stronger beliefs in the vaccine/autism link and in vaccines side effects over time, lending credit to the literature showing that countering false information in ways that repeat it may further contribute to its dissemination. Also the exposure to fear appeals through images of sick children led to more increased misperceptions about vaccines causing autism. Moreover, this corrective strategy induced the strongest beliefs in vaccines side effects, highlighting the negative consequences of using loss-framed messages and fear appeals to promote preventive health behaviours. Our findings also suggest that no corrective strategy was useful in enhancing vaccination intention. Compared to the other techniques, the usage of fact/icon boxes resulted in less damage but did not bring any effective result.

So, no strategy worked (myths vs facts, graphic information, or fear-based information), and the first and third strategies had a delayed backfire effect. The limitations of this study are that it looked at belief and did not measure intention, and that it was limited to college-age subjects. Still, this is a sobering result.

Conclusion

Researchers are still sorting out all the complex variables regarding belief in misinformation and how to correct it. I suspect that the research so far is discouraging because the bottom line is that you cannot have a significant positive impact on people with one intervention. It takes time, multiple interactions, and addressing underlying supporting beliefs and narratives.

One ad campaign is not going to change the conversation on vaccines, for example. The best we can do is get information out there in a way that minimizes or avoids any significant backfire effect, and hope the information is useful long term.

I do think all of this reinforces the basic recommendation of promoters of science and critical thinking – we need to significantly improve our basic education in both. The population needs to be more scientifically literate and more critical in its approach to claims and information. You cannot fix deep deficits in this regard with one information campaign.

 
