Jan 18 2019

GM Foods and Changing Minds

The question at the core of science communication and the skeptical movement is: how do we change opinions about science-related topics? That is the ultimate goal, not just to convey facts but to genuinely inform people, to change the way they think about things, to build that information into a useful narrative that helps them understand the world and make optimal (or at least informed) decisions.

I have been using the GMO (genetically modified organism) food issue as an example, primarily because the research I am discussing uses it as a topic of study, but also because GMO opposition is the topic with the greatest disparity between public and scientific opinion. A new study looks specifically at attitudes toward GMOs, asking whether a convert from GMO opponent to supporter is more persuasive than a straightforward GMO supporter.

The study uses clips from a talk by Mark Lynas, an environmentalist who converted from GMO opponent to supporter. Here is the design:

The respondents each were shown one of three video clips: 1) Lynas explaining the benefits of GM crops; 2) Lynas discussing his prior beliefs and changing his mind about GM crops; and 3) Lynas explaining why his beliefs changed, including the realization that the anti-GM movement he helped to lead was a form of anti-science environmentalism.

The researchers found that both forms of the conversion message (2 and 3) were more influential than the simple advocacy message. There was no difference in impact between the basic conversion message and the more elaborate one.

This makes sense – prior research shows that it is more effective to give someone a replacement explanatory narrative than just to tell them that they are wrong. However, it is very difficult to say how generalizable this effect is.

Several decades ago, science communicators (like Carl Sagan) operated under the “knowledge deficit” model, the assumption that the primary reason people believed pseudoscience was their lack of knowledge about science. Research and experience over the last few decades have shown this to be largely incorrect. However, I would not say that the knowledge deficit has no effect, just that it is an insufficient explanation for the embrace of bad science or the rejection of science.

What we have learned from the research, primarily, is that not all issues are created equal. The factors that drive science denial or the embrace of pseudoscience depend on the specific topic. The primary internal factors seem to be lack of scientific knowledge, ideology (political, religious, etc.), and distrust of scientific institutions (perhaps part of a more general conspiracy-theory mindset). There are also external factors, such as misinformation campaigns, celebrity endorsements, dramatic single cases, and perverse incentives.

I previously discussed one study that tried to look at various factors as they relate to different topics. They found:

They found that climate change denial was predicted mainly by political ideology, but not by low scientific literacy. Vaccine rejection was predicted by low scientific literacy and low faith in science, and also by religiosity and moral purity. Distrust of GM food was predicted by low scientific literacy and low faith in science. Neither vaccine nor GM food rejection was predicted by political ideology.

If a specific topic relates primarily to one’s religious faith or political ideology, then the knowledge deficit model appears to be completely irrelevant. Lack of knowledge does not predict science denial on these topics, and providing correct scientific information or correcting misinformation does not reduce it; in extreme cases it may even cause a backfire effect (although the backfire effect itself is still controversial).

But on topics that are not primarily ideological, like opposing GM foods or vaccines, the pattern is very different. Lack of scientific knowledge does predict science denial, so the knowledge deficit model has some life here. But even with these topics, the knowledge deficit is not enough. We have to distinguish a simple lack of information from active misinformation.

It is also important not only to understand the misinformation itself, but also how it works within an explanatory system. Such a system is not as strong as an ideology or religious belief, but it functions similarly. Environmentalism and general suspicion of the government are examples: they are worldviews, ways of making sense of the chaos, but they are usually not a strong source of identity or community (although they can be in extreme cases).

In these situations, as with GMOs and vaccine hesitancy, teaching the underlying science and correcting misinformation and myths are both helpful, as long as they are part of providing an alternative explanatory narrative, one that is consistent with the science and follows the principles of logic and critical thinking.

There is also a difference between preventing someone from falling into pseudoscience and trying to pull someone out of a science-denying hole they have already fallen into. Prevention is easier and more effective than correction.

What all this means is that there is some utility to having a more scientifically literate populace. Half of the public thinks that non-GMO tomatoes do not have genes. That is a shocking level of scientific illiteracy, and is totally correctable. But we also need to teach basic critical thinking as part of scientific literacy.

Finally, we need to confront misinformation campaigns and correct the myths they generate. This is full-time work for science communicators, but mainstream scientific and professional organizations also need to make this part of their regular activity. Scientific institutions need to interface with the public and actively correct misinformation.

So why are conversion stories, like that of Mark Lynas, more effective? Probably because they address the misinformation rather than just providing scientific information. The conversion story also shows that Lynas previously relied upon one explanatory narrative, found that this narrative was biased and flawed, and replaced it with a more scientific one. He modeled the process of change that people can follow to arrive at a more stable, science-based belief.

There is still a lot to research, because there is a lot of complexity here, but it is great that social psychologists and other scientists have been directly exploring these issues. The science of science communication has been advancing nicely.

