Aug 05 2021

The Misinformation Trifecta

Misinformation is a booming industry, and that is often exactly what it is – an industry. Sometimes it may emerge organically, out of sincere error or misunderstanding. But increasingly misinformation is being weaponized to achieve specific goals. That goal might be to protect the interests of a corporation or industry, to promote a candidate or a particular policy position, to engage in a broader culture war, to promote an ideology, or just to sell a brand or product. Recently it has become more apparent to me that there is often a common strategy to weaponizing misinformation. It has likely always been there, but it is getting more blatant.

The “misinformation trifecta” combines three synergistic strategies. The first is spreading the misinformation itself: factual claims that are wrong, misleading, fabricated, cherry-picked, or simply indifferent to the truth. This is the “payload”, if you will. For example, claiming that GMOs are harmful to one’s health is simply wrong. There is no evidence to support this, and in fact there is copious evidence that GMO crops are as safe and healthful as their non-GMO counterparts. This bit of misinformation is used to promote a certain ideology (based on the appeal to nature fallacy) and a competing brand, organic farming. It is often packaged with lots of other bits of misinformation, all woven together into a certain narrative.

Narratives are powerful because they tend to take on a life of their own. They organize misinformation into a story, and people have an easier time understanding, remembering, and relating to stories rather than isolated facts. Narratives also provide a lens through which we view reality. Once you have sold someone on a narrative, they then curate their own misinformation to further support the narrative.

If misinformation itself were the only issue, then the solution would be straightforward. This is the “knowledge deficit” model, where misinformation is corrected with better, more accurate or complete information. Past efforts at public education and countering misinformation have focused on this knowledge deficit approach. This has a limited (but non-zero) effect. It is highly variable depending on the particular topic, but mostly it is inadequate to counter misinformation.

The failure of the knowledge deficit approach is largely due to the second component of the misinformation trifecta – we can conceptualize this in several ways: the death of expertise, attacks on the notion of truth itself, denialism, conspiracy theories, false balance, or alternative facts. The goal here is to create a world where there are no experts, where everything is opinion, and no institution or authority can be trusted. Every claim can be countered with an alternative claim, and every expert opposed by a dueling expert. You have your links, I have my links. This essentially shuts down any attempt at correcting misinformation with accurate information, because there is no such thing as “accurate”.

This is why one of the most common and powerful moves in the authoritarian playbook is to dismantle the institutions of objectivity and transparency: the press, academia, science, and professional organizations. They are either destroyed or made slaves to the state. Authoritarian regimes depend on the ability to make reality whatever they say it is. Truth is based entirely on their authority, and they cannot risk contradiction from objective and trusted sources outside their control. This is an extreme (but real-world) example, but it makes the point. Any vested interest that wants to push public perception of what is true in a direction that favors their interests can use this same approach. Their tools may vary – they may use marketing firms, for example – but the goal is the same.

The bottom line is always the same: you cannot trust those experts. They are “arrogant elites” or are part of some sinister conspiracy. They are shills in the pocket of the villain du jour, or are themselves the villains. There is no apparent limit on how absurd such anti-expertise narratives can get. NASA, apparently, is engaged in a generational worldwide conspiracy to hide the fact that the Earth is really flat. Climate scientists created the global warming hoax to secure their research funding. Vaccines are a conspiracy to depopulate the planet.

The challenge with this anti-expertise leg of the trifecta is that it is very asymmetrical. In a straight contest between misinformation and accurate information, the truth has the advantage: accurate information has the benefit of being true, and it is often backed by trusted individuals and institutions. I think that is precisely why misinformation campaigns eventually go for the next piece of the puzzle – attacking the very experts who can expose the misinformation. Here those pushing misinformation have the advantage, because all they have to do is create doubt and confusion. They don’t need to prove their hypothesis, or even have a coherent, internally consistent theory. They just need to create confusion. This is much easier to do than creating confidence. It also puts the experts on the defensive – now they are the ones with the burden of proof, not only of the facts but of their own legitimacy.

It’s also easy to play at being a fake skeptic. Just keep asking more and more questions (a strategy called sealioning). You don’t even have to make claims; the questions themselves serve the purpose of undermining confidence in expertise and facts. Also, reality is complicated, and honest answers are often highly nuanced. So any dedicated campaign of denial can find legitimate weaknesses in any scientific theory or base of evidence. We often have legitimate doubt and uncertainty, which can be magnified unreasonably and taken out of context, while the weaknesses of any alternative narrative are ignored. Again, the “merchants of doubt” don’t have to prove anything, just foment confusion. Simple narratives also sell better than complex ones, and reality is complex.

The goal is to have people throw up their hands in surrender: I guess we can’t trust anyone or really be confident in any information. No source is better than any other source. So I might as well believe the narrative that makes me feel good, or that supports my tribe or identity. This takes reality out of the hands of professionals and experts and puts it in the hands of marketers, politicians, and con artists – those with a good story to sell.

Again, there is a countermove, and again there is a “checkmate” response. In an attempt to counter the anti-expertise strategy of misinformation, one can always appeal to mechanisms of quality control, to standards. There are scientific standards, journalistic standards, academic standards, and professional standards. These standards are promoted by communities of experts and professionals, usually with some formal process of evaluation. Peer review is one obvious example. But here comes the third component of the trifecta – all attempts at quality control can be attacked as an assault on free speech and some form of censorship.

This also feeds back nicely into the second strategy. If an article that is utter scientific dreck is not published in a scientific journal, then the journal is “censoring” free speech. No, it isn’t. Its editorial decisions are a manifestation of its own free speech, and it has a right, and even a duty, to impose quality control. But it is easy to recast this as a conspiracy – the “lamestream” media is suppressing this story, scientists are closed-minded to truly novel ideas, the “powers that be” are hiding the truth. What are they afraid of? Just publish everything and let the public make up their minds.

Appeals to free speech have legs, especially in an open society. This is also a legitimately complex area, which requires balancing the creation of a space where free discussion and the marketplace of ideas can play out against at least a minimal filter of quality control, so the conversation is not just noise. Also, the free speech argument has a major unstated and demonstrably false premise – that all players are acting in good faith. They often are not. Open the floodgates, and you will let in every con artist, everyone with an agenda, every crank, and every psychopath (of course, I just described social media). Highly motivated bad-faith actors can break any medium made for the exchange of ideas and information.

Free speech claims also contain an element of gratuitous flattery – you are smart enough to decide for yourself. You don’t need anyone else to filter or curate information for you. I am just trying to empower you with alternative views and other information. This is weaponizing the Dunning-Kruger effect. We all need to have the humility to understand that we are not experts in most things, and the gulf between novice and expert can be immense. Also, can anyone take an honest look at the world today and argue with a straight face that we could not benefit from a little quality control in our information ecosystem? How many people are refusing to get vaccinated against a deadly pandemic because they are full of misinformation, distrust of experts, conspiracy theories, and an inflated assessment of their own expertise?

The ultimate solution, I would argue, is education. We need to promote scientific literacy, media savvy, and critical thinking skills. But we cannot do that in a world where there is no expertise, or in a democracy where voters don’t already have a sufficient amount of those things. This is why we need to promote quality control not only from the bottom up, but from the top down. And I understand why that makes people nervous – it makes me nervous. But we already have institutions with a history of professionalism that know how to do this. We can and should do better, but it is possible to have both freedom and quality control in a reasonable balance.
