Aug 15 2017

More on the Backfire Effect

One critical question for the skeptical enterprise is the notion of a backfire effect – when someone is given factual information about a myth that they believe, do they update and correct their beliefs, or do they dig in their heels and believe the myth even more strongly? Some studies worryingly show that sometimes people dig in their heels, or they simply misremember the corrective information.

A new study sheds further light on this question, although it is not, by itself, the definitive final answer (one study rarely is).

For background, prior studies have shown several effects of interest. First, from memory research we know that people store facts separately from the source of those facts and from the truth-status of those facts. That is why people will often say, “I heard somewhere that…” They may not even remember if the tidbit is true or not, but the idea itself is much easier to remember, especially if it is dramatic or resonates with some narrative or belief.

So, if you tell someone that there is no evidence linking vaccines and autism, they are most likely to remember something about a link between vaccines and autism, but not remember where the information came from or if the link is real. That, at least, is the concern.

The research on this topic is actually a bit complex because there are numerous variables. There are factors about the subjects themselves: their age, their baseline beliefs, their intentions, and the intensity of their beliefs. There are different types of information to give: positive information (vaccines are safe), information dispelling negative claims, graphic information, and fear-based information (pictures of sick unvaccinated kids). There are different topics (political vs scientific), with different levels of emotional attachment to the beliefs. There is belief vs intention: do you think vaccines are safe, vs do you intend to vaccinate your kids? Finally, there is time: immediate vs delayed effects.

I have written about this previously here. Rather than going through every study, let me just summarize the main findings. First, many people will update their beliefs when given corrective information – the backfire effect is not universal. People are more likely to update their beliefs with new information if they are younger, if they do not already have firm beliefs on the topic, and if they are not emotionally invested.

However, even when ideology is not a factor, people don’t always update their beliefs, simply because of flawed memory. Not all information is equally “sticky.” Our narratives seem to have inertia, even if they are not that emotionally important. It takes work, apparently, to update your beliefs.

With regard specifically to vaccines, prior studies have found that people who are already anti-vaccine do not change their views with new information. In one study, in fact, people with strong anti-vaccine views incorporated corrective information somewhat, but despite this their intention to vaccinate was reduced. This effect did not replicate in one follow-up study, however.

Here is what the new study found:

Our study provided further support to the growing literature showing how corrective information may have unexpected and even counter-productive results. Specifically, we found that the myths vs. facts format, at odds with its aims, induced stronger beliefs in the vaccine/autism link and in vaccines side effects over time, lending credit to the literature showing that countering false information in ways that repeat it may further contribute to its dissemination. Also the exposure to fear appeals through images of sick children led to more increased misperceptions about vaccines causing autism. Moreover, this corrective strategy induced the strongest beliefs in vaccines side effects, highlighting the negative consequences of using loss-framed messages and fear appeals to promote preventive health behaviours. Our findings also suggest that no corrective strategy was useful in enhancing vaccination intention. Compared to the other techniques, the usage of fact/icon boxes resulted in less damage but did not bring any effective result.

So, no strategy worked (myths vs facts, graphic information, or fear-based information), and the first and third strategies had a delayed backfire effect. The limitations of this study are that it looked at belief but did not measure intention, and that it was limited to college-age subjects. Still, this is a sobering result.

Conclusion

Researchers are still sorting out all the complex variables regarding belief in misinformation and how to correct it. I suspect the research so far is discouraging because you cannot have a significant positive impact on people with a single intervention. It takes time, multiple interactions, and addressing the underlying supporting beliefs and narratives.

One ad campaign is not going to change the conversation on vaccines, for example. The best we can do is get information out there in a way that minimizes or avoids any significant backfire effect, and hope the information is useful long term.

I do think all of this reinforces the basic recommendation of promoters of science and critical thinking – we need to significantly improve our basic education in both. The population needs to be more scientifically literate and more critical in its approach to claims and information. You cannot fix deep deficits in this regard with one information campaign.

 


11 Responses to “More on the Backfire Effect”

  1. MWSletten on 15 Aug 2017 at 9:42 am

    In my experience, the approach taken when discussing controversial topics has as much effect as the actual facts themselves, and in some cases more. Some of us are absolutely thrilled to be proven wrong, to learn something new. Others, not so much. I’ve found that directly confronting someone who lacks appreciation for learning new things by calling their beliefs “myths” will pretty much ensure they won’t hear anything else you have to say. Long, involved arguments based on facts gleaned from “studies” open you up to nitpicking: Who conducted the studies and why? Who paid for the studies? Etc., etc.

    Instead, I usually say something like, “Are you sure? I believe I read such and such. I may be wrong, but this makes more logical sense to me, considering this and that.” It’s easy to dismiss nameless “scientists” who conduct studies on behalf of shadowy corporations out to make a buck. It’s harder when you’re just talking with someone you know–and hopefully respect.

    The intent is to motivate a person to seek the facts on their own instead of smacking them in the face with them.

  2. SteveA on 15 Aug 2017 at 10:09 am

    The delayed effect was measured after 7 days.

    I wonder if the results would be the same measured after months of regular exposure. It takes people time to get used to new ideas. Most advertising relies on a drip-feed approach and I don’t see why, ideally, this shouldn’t also apply to sceptical thinking. The big challenge is getting the information in front of your target audience in the first place.

    I wonder if any sceptical group has ever tried to invest significantly in Google Ads or on Facebook?

  3. MaryM on 15 Aug 2017 at 10:33 am

    The professional scicomm work keeps revealing what doesn’t work. They keep not telling us what will work. And I am also frequently irked to see the obituaries of the deficit model – when that’s not really the case. Some people are impacted by new and accurate information. But the number of scicomm pros who keep saying it’s dead is not helpful, and not a particularly good example of communication of the state of the field, ironically enough.

    But I also find that scicomm pros do not *want* some kinds of intervention to be effective. They don’t seem to want to study (or answer) whether discrediting cranks helps–because they seem to think it’s uncivil. Maybe it is, but that doesn’t mean it shouldn’t be studied. One of the best and most far-reaching pieces I saw was The Sci Babe’s “The Food Babe is full of sh*t” piece. That reached many new eyeballs, despite what some would describe as being uncivil. Also, the SciBabe has turned this style into a great and edgy platform–she now reaches into women’s magazines on a regular basis. Those used to be the worst of the food misinformation swamps.

    Scicomm pros don’t want to accept that shunning and mockery are possibly effective. I understand the discomfort with this, but again–it doesn’t mean we shouldn’t study it. I don’t think they’ve shown that it doesn’t work with proper studies.

    I was really livid at a recent piece in The Conversation that trashed Bill Nye and Neil DeGrasse Tyson. Yeah, maybe Nye’s show was not that effective – but Cosmos? But let’s see scicomm pros do the right show. Show me the metrics of reach. I keep asking what their best examples are, and they refuse to give them to me. But the best part was in the comments: Neil responded to their criticism in an EPIC fashion. And the scicomm pros ignored it. Read his comment: https://theconversation.com/can-bill-nye-or-any-other-science-show-really-save-the-world-76630

    What pisses me off about the scicomm nannies: they aren’t showing what works, they are often just trashing efforts that they don’t seem to like. And they don’t like my tone. Fine–but show me your tone research. Until you have something useful, I don’t really like your tone either.

  4. fbrossea on 15 Aug 2017 at 12:07 pm

    I don’t know how accurate it is, but “The Oatmeal” did an amusing cartoon on the backfire effect.

  5. mumadadd on 15 Aug 2017 at 2:46 pm

    fbrossea,

    Thanks for the link — funny and informative.

    PS. F*ck your whole worldview in the bumpipes!

    🙂

  6. Fair Persuasion on 15 Aug 2017 at 11:38 pm

    This study is really poor. A small group of Bachelors, Masters, and Ph.D.s from Italy and Scotland are being required to be impacted by public relations material from the Illinois Department of Health. This sounds like the authors wish to cross-pollinate Europe with Real Facts from Illinois, USA. Who needs to replicate this nonsense!

  7. SteveA on 16 Aug 2017 at 4:37 am

    fbrossea

    Thanks for the link. Great cartoon.

    Watch out! That gator has a knife!

  8. Fair Persuasion on 16 Aug 2017 at 3:31 pm

    News flash: the University of Illinois reports that genetic contributions to autism may be studied through human and bee behavior. Some genes in non-interactive bees are similar to genes associated with the lack of social interaction in human autism. Statistical genetic profiles show similarities.

  9. Bill Openthalt on 17 Aug 2017 at 7:31 am

    Steven —

    The population needs to be more scientifically literate and more critical in its approach to claims and information.

    Does “the population” have the mental chops to be more scientifically literate and practice critical thinking?

  10. Pete A on 19 Aug 2017 at 4:19 pm

    The Backfire Effect

    The Misconception: When your beliefs are challenged with facts, you alter your opinions and incorporate the new information into your thinking.

    The Truth: When your deepest convictions are challenged by contradictory evidence, your beliefs get stronger.

    — David McRaney, You Are Not So Smart.

  11. bsoo on 21 Aug 2017 at 8:39 pm

    I wonder how much of this is cultural. In general we make it very painful for people to be wrong and that seems like it would encourage people to dig in rather than accept new facts.
