Jan 15 2019

Dunning-Kruger and GMO Opposition

I have written extensively about GMOs (genetically modified organisms) here, and even dedicated a chapter of my book to the topic, because it is the subject with the largest gap between public opinion and the opinion of scientists (51 percentage points). I think it’s clear that this disparity is due to a deliberate propaganda campaign largely funded by the organic lobby, with collaboration from extreme environmental groups like Greenpeace.

This has produced an extreme, if not unique, challenge for science communicators. It also has direct practical implications, as the political fight over GMO regulation and acceptance is well underway. The stakes are high as well: we face the challenge of feeding a growing population while we are already using too much land, and there really isn’t much more we can press into agriculture. (Even if there are other ways to reduce our land use, that does not mean we should oppose a safe and effective technology that can further reduce it.)

A new study published in Nature Human Behaviour may shed further light on the GMO controversy. The authors explore the relationship between knowledge about genetics and attitudes toward GMOs.

In a nationally representative sample of US adults, we find that as extremity of opposition to and concern about genetically modified foods increases, objective knowledge about science and genetics decreases, but perceived understanding of genetically modified foods increases. Extreme opponents know the least, but think they know the most.

Readers of this blog may recognize this pattern as the Dunning-Kruger effect (DK) – the less someone knows about a topic, the more they overestimate their knowledge. Actually, the pattern the current authors found is even more extreme than DK. In ordinary DK, self-assessment still tracks objective knowledge – people who know less also rate themselves lower – but the degree to which they overestimate their knowledge increases as knowledge decreases. The current study is super DK: at the extreme end of opposition to GMOs, opponents actually knew the least about genetics but thought they knew the most. As the authors say:

Moreover, the relationship between self-assessed and objective knowledge shifts from positive to negative at high levels of opposition.

This does not happen with ordinary DK. The effect was also robust and widespread:

Similar results were obtained in a parallel study with representative samples from the United States, France and Germany, and in a study testing attitudes about a medical application of genetic engineering technology (gene therapy). This pattern did not emerge, however, for attitudes and beliefs about climate change.

It is very interesting that the reversal effect was not seen with climate change. There was still an ordinary DK effect, just not this super DK reversal of knowledge and confidence. What all this means is that while the ordinary DK effect is in play here, it does not completely explain the results. Something else is going on with anti-GMO propaganda, and that is worth exploring further.

It is worth noting that the reversal effect did not kick in until we got to extreme opposition to GMOs. As I said above, misinformation about GMOs is greater than for any other topic (including global warming, and even evolution), and this is probably not a coincidence. It may have something to do with the topic itself – there is general ignorance about genetics, and also lots of fear and misinformation about genetic mutations and manipulation in general.

Previous surveys about GMO knowledge support this. For example:

50% thought that GMO tomatoes have genes, while ordinary tomatoes do not.

41% believe that eating a GM tomato would change a person’s genes.

68% believe that GM food genes can become incorporated into a person’s genes permanently and be passed down to future generations.

That’s some massive misinformation there. Interestingly, while this current study did not find a super DK for climate change, a previous study did find this effect for vaccines – extreme vaccine opponents knew the least but thought they knew the most.

This may be saying something about the nature of science-denying campaigns or the topics themselves. Climate change denial is mostly about political affiliation, and denial is based largely on opposition to proposed solutions.

Antivaccine and anti-GMO attitudes, on the other hand, are bipartisan. Further, they are based largely on fear driven by scientific misinformation. That may be the key – misinformation is different from mere ignorance. Misinformation gives the illusion of knowledge, and because it goes against the mainstream, this false knowledge makes one feel superior to others (you know stuff the masses don’t) – hence the super DK with an actual reversal of knowledge and confidence.

This fits my personal experience as well. When I talk with global warming deniers, they tend to focus on the politics – global warming is an attempt by liberals to control the markets, etc. When I talk to those who are anti-GMO or anti-vaccine, they also cite conspiracy theories, but they are more likely to justify their opposition by citing bad research or bringing up misunderstood scientific tidbits. (This happens with global warming denial as well, just not as much.) This, in turn, may be a result of the topics themselves, or of the tactics emphasized by the opposition campaigns. Do they focus on ideology, or on promoting bad science?

I am speculating here and trying to extrapolate from my experience, in order to explain the difference in this study between GMOs and climate change. This is a topic worthy of further research to clarify what is really going on.

It is also a reason for personal caution – be wary of being trapped in a slick propaganda campaign that gives you the false illusion of being informed, when in fact you are being systematically misinformed. The (ironically named) web functions to create exactly such traps, drawing you down a rabbit hole of ideologically biased misinformation.

A partial remedy is to build into your own behavior safety valves that will protect you from falling down such holes. This means being skeptical of extreme claims, conspiracy theories, or claims to special knowledge. It also means that when you find yourself moving in one direction with regard to a controversy, you should deliberately slow down and seek out information from the other side. You don’t really know what is going on until you find out what all sides believe, why they believe it, and how they respond to the claims of the other side.

The frequent example here: imagine listening to just the defense or the prosecution in a trial, then making a decision without hearing the other side. How fair do you think your decision would be? You could be convinced of almost anything if you only listen to one side, and the more developed, sophisticated, and well-funded that side is, the more extreme a view they can persuade you to adopt.

Unless you protect yourself with skepticism (scientific literacy helps too).
