Search Results for "backfire effect"

Jan 04 2018

Backfire Effect Not Significant

Previous research has shown that when confronted with a factual statement that appears to go against an ideologically held belief, a percentage of people tested will move their position away from the factual information – a so-called “backfire effect.” This notion was rapidly incorporated into the skeptical narrative, because it seems to confirm our perception that it is very difficult to change people’s minds.

However, more recent research suggests that the backfire effect may not exist, or at least is exceedingly rare. A recently published series of studies puts a pretty solid nail in the coffin of the backfire effect (although this probably won’t be the last word).

To be clear, people generally still engage in motivated reasoning when emotions are at stake. There is clear evidence that people filter the information they seek, notice, accept, and remember. Ideology also predicts how much people will respond to factual correction.

The backfire effect, however, is very specific. This occurs when people not only reject factual correction, but create counterarguments against the correction that move them further in the direction of the incorrect belief. It’s probably time for us to drop this from our narrative, or at least deemphasize it and put a huge asterisk next to any mention of it.

Continue Reading »

13 responses so far

Aug 15 2017

More on the Backfire Effect

Published under Skepticism

One critical question for the skeptical enterprise is the notion of a backfire effect – when someone is given factual information about a myth that they believe, do they update and correct their beliefs, or do they dig in their heels and believe the myth even more strongly? Some studies worryingly show that sometimes people dig in their heels, or simply misremember the corrective information.

A new study sheds further light on this question, although it is not, by itself, the definitive final answer (one study rarely is).

For background, prior studies have shown several effects of interest. First, from memory research we know that people store facts separately from the source of those facts and from the truth-status of those facts. That is why people will often say, “I heard somewhere that…” They may not even remember if the tidbit is true or not, but the idea itself is much easier to remember, especially if it is dramatic or resonates with some narrative or belief.

So, if you tell someone that there is no evidence linking vaccines and autism, they are most likely to remember something about a link between vaccines and autism, but not remember where the information came from or if the link is real. That, at least, is the concern.

The research on this topic is actually a bit complex because there are numerous variables. There are factors about the subjects themselves: their age, their baseline beliefs, their intentions, and the intensity of their beliefs. There are different types of information to give: positive information (vaccines are safe), information dispelling negative claims, graphic information, and fear-based information (pictures of sick unvaccinated kids). There are different topics – political vs scientific – with different levels of emotional attachment to the beliefs. There is belief vs intention – do you think vaccines are safe vs do you intend to vaccinate your kids? Finally, there is time: immediate vs delayed effects.

Continue Reading »

11 responses so far

Aug 03 2020

Do Your Own Research?

A recent commentary on Forbes advises: You Must Not ‘Do Your Own Research’ When It Comes To Science. I agree with everything the author, Ethan Siegel, says in the piece. It was a good start – but it did not go far enough. For example, he did not really reach any conclusion about what people should actually do, beyond “listen to the experts.” OK – how, exactly, do we do that? This is not a criticism (I have written similar articles before) but an observation: after trying to communicate these same skeptical themes for decades, and fielding thousands of questions from the public, I have realized that it is perhaps not so obvious what it means to listen to the experts.

First let me amplify what Siegel gets right, although I may reframe it a bit. He correctly describes the typical process that people use when evaluating new information, although he does not name it – confirmation bias. His summary is as good as any (a toy sketch of this loop in code follows the list):

  • formulating an initial opinion the first time we hear about something,
  • evaluating everything we encounter after that through that lens of our gut instinct,
  • finding reasons to think positively about the portions of the narrative that support or justify our initial opinion,
  • and finding reasons to discount or otherwise dismiss the portions that detract from it.
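To make the self-reinforcing nature of that loop concrete, here is a toy simulation – my illustration, not anything from Siegel’s piece; the update rate and the discount applied to disconfirming evidence are invented parameters chosen only to make the bias visible. An agent that does nothing more than weight confirming evidence more heavily than disconfirming evidence ends up with a belief that even a perfectly neutral evidence stream cannot pull back to neutral:

```python
# Toy model of confirmation bias: belief is a number in [0, 1], where 0.5
# is neutral. Evidence arrives as random values in [0, 1]. Evidence on the
# same side of 0.5 as the current belief counts as "confirming" and gets
# full weight; disconfirming evidence is discounted (dismissed).
import random

def biased_update(belief, evidence, rate=0.1, discount=0.25):
    """Move belief toward evidence, discounting disconfirming evidence."""
    confirming = (evidence - 0.5) * (belief - 0.5) > 0  # same side as belief
    weight = rate if confirming else rate * discount
    return belief + weight * (evidence - belief)

random.seed(1)
belief = 0.6  # the initial opinion formed on first exposure
for _ in range(200):
    evidence = random.random()  # a genuinely 50/50 stream of evidence
    belief = biased_update(belief, evidence)

# With these parameters the expected equilibrium is about 0.65 - the belief
# settles further from neutral than it started, despite neutral evidence.
print(f"final belief: {belief:.2f}")
```

The point of the sketch is only that the asymmetry itself does the work: no single step in the loop feels like lying to yourself, yet the “facts” the agent retains systematically favor its initial opinion.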

Continue Reading »

No responses yet

Feb 27 2020

Anti-Intellectualism and Rejecting Science

“There is a cult of ignorance in the United States, and there has always been. The strain of anti-intellectualism has been a constant thread winding its way through our political and cultural life, nurtured by the false notion that democracy means that ‘my ignorance is just as good as your knowledge.’”
― Isaac Asimov

As science communicators and skeptics we are trying to understand the phenomenon of rejection of evidence, logic, and the consensus of expert scientific opinion. There is, of course, no one explanation – complex psychological phenomena are likely to be multifactorial. Decades ago the blame was placed mostly on scientific illiteracy – a knowledge deficit problem – and the prescription was science education. Many studies over the last 20 years or so have found a host of factors – including moral purity, religious identity, ideology, political identity, intuitive (as opposed to analytical) thinking style, and a tendency toward conspiratorial thinking. And yes, knowledge deficit also plays a role. These many factors contribute to varying degrees on different issues and with different groups. They are also not independent variables, as they interact with each other. Religious and political identity, for example, may be partially linked, and may contribute to a desire for moral purity.

Also, all this is just one layer, mostly focused on explaining the motivation for rejecting science. The process of rejection involves motivated reasoning, the Dunning-Kruger effect, and a host of self-reinforcing cognitive biases, such as confirmation bias. Shameless plug – for a full discussion of cognitive biases and related topics, see my book.

So let’s add one more concept into the mix: anti-intellectualism – the generalized mistrust of intellectuals and experts. This leads people to a contrarian position. They may consider themselves skeptics, but they do not primarily hold positions on scientific issues because of the evidence, but mainly because it is contrary to the mainstream or consensus opinion. If those elite experts claim it, then it must be wrong, so I will believe the opposite. This is distinct from conspiracy thinking, although there is a relationship. As an aside, what the evidence here shows is that some people believe in most or all conspiracies because they are conspiracy theorists. Others believe only in some conspiracies opportunistically, because it’s necessary to maintain a position they hold for other reasons. There is therefore bound to be a lot of overlap between anti-intellectualism and holding one or more conspiracies, but they are not the same thing.

Continue Reading »

No responses yet

Aug 19 2019

Facts vs Stories

There is a common style of journalism, one you are almost certainly familiar with, in which the report starts with a personal story, then delves into the facts at hand – often with reference to the framing story and others like it – and returns at the end to the original personal connection. This format is so common it’s a cliche, and often the desire to connect the actual new information to an emotional story takes over the reporting and undermines the facts.

This format reflects a more general phenomenon – that people are generally more interested in and influenced by a good narrative than by dry facts. Or are we? New research suggests that while the answer is still generally yes, there is some more nuance here (isn’t there always?). The researchers did three studies in which they compared the effects of strong vs weak facts presented either alone or embedded in a story. In the first two studies the information was about a fictitious new phone. The weak fact was that the phone could withstand a fall of 3 feet. The strong fact was that the phone could withstand a fall of 30 feet. What they found in both studies is that the weak fact was more persuasive when embedded in a story than when presented alone, while the strong fact was less persuasive when embedded in a story.

They then did a third study about a fictitious flu medicine, and asked subjects if they would give their e-mail address for further information. People are generally reluctant to give away their e-mail address unless it’s worth it, so this was a good test of how persuasive the information was. When a strong fact about the medicine was given alone, 34% of the participants were willing to provide their e-mail. When embedded in a story, only 18% provided their e-mail.
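One caveat worth making explicit: the reported percentages alone cannot tell you whether that 34% vs 18% gap is statistically meaningful – that depends on the group sizes, which are not given in this summary. Below is a minimal sketch of the standard two-proportion z-test, using hypothetical groups of 100 per condition, showing how such a gap would be checked:

```python
# Two-proportion z-test: is the difference between 34/100 and 18/100
# bigger than chance would explain? The group sizes here are hypothetical,
# chosen only for illustration - the post does not report the actual n.
from math import sqrt, erf

def two_proportion_ztest(x1, n1, x2, n2):
    """Return (z, two-sided p-value) for H0: the two proportions are equal."""
    p1, p2 = x1 / n1, x2 / n2
    p_pool = (x1 + x2) / (n1 + n2)  # pooled proportion under the null
    se = sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    normal_cdf = lambda v: 0.5 * (1 + erf(v / sqrt(2)))
    p_value = 2 * (1 - normal_cdf(abs(z)))
    return z, p_value

z, p = two_proportion_ztest(34, 100, 18, 100)
print(f"z = {z:.2f}, p = {p:.4f}")  # z ~ 2.58, p ~ 0.01 at these assumed sizes
```

At these assumed sizes the gap would be comfortably significant; with much smaller groups the same percentages might not be.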

So, what is responsible for this reversal of the usual pattern that stories are more persuasive than dry facts? The authors suggest that stories may impair our ability to evaluate factual information. This is not unreasonable, and is suggested by other research as well. To a much greater extent than you might think, cognition is a zero-sum game. When you allocate resources to one task, those resources are taken away from other mental tasks (this basic process is called “interference” by psychologists). Further, adding complexity to brain processing, even if it leads to more sophisticated analysis of information, tends to slow down the whole process. Also, parts of the brain can directly suppress the functioning of other parts of the brain. This inhibitory function is a critical part of how the different parts of the brain work together.

Continue Reading »

No responses yet

Aug 01 2019

GMOs and the Knowledge Deficit Model

A 2015 Pew survey found that 88% of AAAS scientists believe that GMOs (genetically modified organisms) are generally safe to eat, while only 37% of the general public agreed. This 51-percentage-point difference was the biggest gap of any science attitude they surveyed – greater than for evolution or climate change. This hasn’t changed much since. A 2018 Pew survey found that 49% of US adults think that GMOs are worse for your health. The numbers are similar in other countries.

An important underlying question for science communicators is – what is the source of, and therefore the potential solution to, this disconnect between experts and the public? In other words – what drives anti-scientific or pseudoscientific attitudes in the public? The classic answer is the knowledge deficit model – the idea that people reject science because they don’t understand it. If true, then the answer is science education and fostering greater scientific literacy.

However, psychological research over the last two decades has called the knowledge deficit model into question. Studies have found that giving facts often has no effect, or may even create a backfire effect (although the scope of this is still controversial). Some research suggests you have to confront a person’s explanatory narrative and replace it with another. Other studies indicate that ideological beliefs are remarkably resistant to alteration with facts alone.

But the knowledge deficit model is not dead yet. It seems that we have to take a more nuanced approach to unscientific beliefs in the public. This is a heterogeneous phenomenon, with multiple causes and therefore multiple potential solutions. For each topic we need to understand what is driving that particular belief, and then tailor an approach to it.

Continue Reading »

No responses yet

Jan 18 2019

GM Foods and Changing Minds

The question at the core of science communication and the skeptical movement is – how do we change opinions about science-related topics? That is the ultimate goal, not just to give information but to inform people, to change the way they think about things, to build information into a useful narrative that helps people understand the world and make optimal (or at least informed) decisions.

I have been using the GMO (genetically modified organism) issue as an example, primarily because the research I am discussing uses it as a topic of study. But also – GMO opposition is the topic with the greatest disparity between public and scientific opinion. A new study looks specifically at attitudes toward GMOs, asking: is a conversion message – from GMO opponent to supporter – more persuasive than straightforward GMO advocacy?

The study uses clips from a talk by Mark Lynas, an environmentalist who converted from GMO opponent to supporter. They found:

The respondents each were shown one of three video clips: 1) Lynas explaining the benefits of GM crops; 2) Lynas discussing his prior beliefs and changing his mind about GM crops; and 3) Lynas explaining why his beliefs changed, including the realization that the anti-GM movement he helped to lead was a form of anti-science environmentalism.

The researchers found that both forms of the conversion message (2 and 3) were more influential than the simple advocacy message. There was no difference in impact between the basic conversion message and the more elaborate one.

This makes sense – prior research shows that it is more effective to give someone a replacement explanatory narrative than just to tell them that they are wrong. However, it is very difficult to say how generalizable this effect is.

Continue Reading »

No responses yet

Jun 01 2017

Confirmation Bias vs Desirability Bias

Published under Neuroscience

The human brain is plagued with cognitive biases – flaws in how we process information that cause our conclusions to deviate from the most accurate description of reality possible with available evidence. This should be obvious to anyone who interacts with other human beings, especially on hot topics such as politics or religion. You will see such biases at work if you peruse the comments to this blog, which is rather a tame corner of the social media world.

Of course most people assume that they are correct and everyone who disagrees with them is crazy, but that is just another bias.

The ruler of all cognitive biases, in my opinion, is confirmation bias. This is a tendency to notice, accept, and remember information which appears to support an existing belief and to ignore, distort, explain away, or forget information which seems to disconfirm an existing belief. This process works undetected in the background to create the powerful illusion that the facts support our beliefs.

If you are not aware of confirmation bias and do not take active steps to avoid it, it will have a dramatic effect on your perception of reality.

Continue Reading »

48 responses so far

Apr 22 2016

Illegal Immigration and the Law of Unintended Consequences

Published under General Science

One thing I have learned as a science communicator over the last two decades, trying to digest many areas of science, is that stuff is complicated. It is a good rule of thumb that everything is more complicated than you might originally think.

This complexity takes various forms. First, unless you are at the leading edge of expertise in an area, your understanding of that topic is relatively superficial. There is greater depth and nuance than your current understanding, which is likely a necessary simplification.

Second, there are few clean answers in science. Some things, obviously, are well established to the point that we can treat them as facts, but many more things than we might naively suppose are controversial on some level. The evidence is mixed, imperfect, and incomplete, and there remain various opinions about how to interpret the data.

As an aside, this is one of my peeves about how science is often communicated. A complex debate is distilled down to “scientists think X” (representing just one side of that debate). Each time a new study is published apparently supporting one position, that position is now correct and the others are now wrong. All the nuance is lost.

Continue Reading »

80 responses so far

Aug 04 2015

Convincing Antivaxxers

A new study has been published in PNAS exploring methods for changing the attitudes of those who are anti-vaccine. The results differ from a previous study published last year in Pediatrics. Let’s explore their methods and results.

Both studies questioned subjects about their attitudes toward vaccines and their willingness to vaccinate their children. The Pediatrics study was web-based and recruited 1759 parents. They divided them into four intervention groups and a control group:

(1) information explaining the lack of evidence that MMR causes autism from the Centers for Disease Control and Prevention; (2) textual information about the dangers of the diseases prevented by MMR from the Vaccine Information Statement; (3) images of children who have diseases prevented by the MMR vaccine; (4) a dramatic narrative about an infant who almost died of measles from a Centers for Disease Control and Prevention fact sheet; or to a control group.

The PNAS study was in person, but only recruited 315 subjects. They divided people into three groups: 1) given information debunking vaccine myths, 2) told about the risks of measles and shown graphic images, 3) control group given information unrelated to vaccines.

Continue Reading »

19 responses so far

Next »