Feb 27 2020

Anti-Intellectualism and Rejecting Science

“There is a cult of ignorance in the United States, and there always has been. The strain of anti-intellectualism has been a constant thread winding its way through our political and cultural life, nurtured by the false notion that democracy means that ‘my ignorance is just as good as your knowledge.’”
― Isaac Asimov

As science communicators and skeptics, we are trying to understand the phenomenon of rejection of evidence, logic, and the consensus of expert scientific opinion. There is, of course, no one explanation – complex psychological phenomena are likely to be multifactorial. Decades ago the blame was placed mostly on scientific illiteracy – a knowledge deficit problem – and the prescription was science education. Many studies over the last 20 years or so have found a host of factors – including moral purity, religious identity, ideology, political identity, intuitive (as opposed to analytical) thinking style, and a tendency toward conspiratorial thinking. And yes, knowledge deficit also plays a role. These many factors contribute to varying degrees on different issues and with different groups. They are also not independent variables; they interact with each other. Religious and political identity, for example, may be partially linked, and may contribute to a desire for moral purity.

Also, all this is just one layer, mostly focused on explaining the motivation for rejecting science. The process of rejection involves motivated reasoning, the Dunning-Kruger effect, and a host of self-reinforcing cognitive biases, such as confirmation bias. Shameless plug – for a full discussion of cognitive biases and related topics, see my book.

So let’s add one more concept into the mix: anti-intellectualism – the generalized mistrust of intellectuals and experts. This leads people to a contrarian position. They may consider themselves skeptics, but they do not hold positions on scientific issues primarily because of the evidence; they hold them mainly because they are contrary to the mainstream or consensus opinion. If those elite experts claim it, then it must be wrong, so I will believe the opposite. This is distinct from conspiracy thinking, although there is a relationship. As an aside, what the evidence here shows is that some people believe in most or all conspiracies because they are conspiracy theorists. Others believe only in some conspiracies opportunistically, because doing so is necessary to maintain a position they hold for other reasons. There is therefore bound to be a lot of overlap between anti-intellectualism and belief in one or more conspiracies, but they are not the same thing.

There is a new paper that sheds some light on anti-intellectualism itself. In a series of studies, researcher Eric Merkley found:

First, I provide evidence of a strong association between anti-intellectualism and opposition to scientific positions on climate change, nuclear power, GMOs, and water fluoridation, particularly for respondents with higher levels of political interest. Second, a survey experiment shows that anti-intellectualism moderates the acceptance of expert consensus cues such that respondents with high levels of anti-intellectualism actually increase their opposition to these positions in response. Third, evidence shows anti-intellectualism is connected to populism, a worldview that sees political conflict as primarily between ordinary citizens and a privileged societal elite.

He concludes that anti-intellectual messaging affects how we process information. Let’s unpack this a bit. The first claim seems fairly straightforward and unsurprising – independent measures of anti-intellectualism predict opposition to mainstream scientific views, especially when there is a political implication. This implies that anti-intellectualism is a real thing; it is not just a dismissive label placed on people who reject consensus scientific views for what they consider perfectly valid reasons. It reflects a style of thinking that leads to, or at least facilitates, such rejection despite the evidence.

For example, someone may be motivated by their political ideology, world view, and tribal allegiance to reject the strong scientific consensus that existing GMO crops are safe. This is likely to set up some cognitive dissonance, with the political motivations on one side of the scale and the scientific evidence on the other. A host of factors are then piled onto each side of the scale, and one side typically “wins.” If you have a strong dedication to objective expert opinion, that will weigh heavily on the side of the scientific evidence. If you are a conspiracy theorist or an anti-intellectual, you may have no problem rejecting that consensus, and in fact may prefer to do so.

Again, the evidence shows this is largely a two-step process. The first step is sorting out your motivation (this is mostly subconscious). Which side do you want to believe? Which makes you feel better? The second step is rationalizing the first step – justifying a choice that was largely based on feelings and intuition. This is where it gets complicated, because some of the factors listed above may contribute to both steps. Conspiracies, again, are the most dramatic example. Some people are motivated to believe in conspiracies, while for others conspiracies are simply a convenient justification for beliefs they hold for other reasons.

The second study implies that for anti-intellectualism, it is mostly about motivation – having an anti-intellectual world view means you want to reject expert consensus for its own sake. In fact, having anti-intellectual views correlates with rejecting expert opinion even when the issue is unrelated to existing ideological views, to the point that in this study there was a backfire effect. Not only did those with anti-intellectual views reject the consensus, exposure to expert consensus cues actively moved them further away from it.

The third study has implications for our political environment – populism is a worldview that frames the main conflict in the world as one between ordinary citizens and a villainous elite. This becomes a war against elitism and expertise itself. Populism is both motivation and justification – once you reject expertise, you can believe anything you want.

We see this on both sides of the political spectrum. (Usual caveat – the point of this is not to be political or to imply any equivalency, just to highlight this phenomenon.) Donald Trump, for example, rejects the consensus on vaccines and global warming. Bernie Sanders rejects the consensus on nuclear power and GMOs and is favorable toward alternative medicine.

As with many things, you can often find a kernel of truth in any position. People are generally good at motivated reasoning, which includes finding those kernels and then magnifying them into a justification for whatever position they want to hold. Experts and elites do make mistakes, they are also people with their own motivations, and any power will be abused to some extent by someone. That is precisely why we have institutions and regulations. We do not have a rule of flawed people, but a rule of law. Science is not just about individual researchers, but about elaborate policing by a community. Our complex society requires multiple overlapping institutions to keep each other transparent and honest. It’s messy and imperfect, but the alternative is horrific.

That alternative is populist rejection not only of experts, but of the institutions of expertise and the concept of expertise itself. This leads to intellectual anarchy (often justified by portraying it as intellectual freedom, but that is not the issue and entirely misses the point). The populist view is mostly about believing what feels good – going along with an explanatory narrative that makes some kind of sense of a complex and scary world, and that organizes this understanding around vilifying an enemy who is to blame for our problems. What’s scary is that our political and media institutions may favor such simplistic and appealing populist narratives, and disadvantage more nuanced approaches.
