Oct 12 2020


Psychologists in the UK have created a game that pre-debunks (or “pre-bunks”) COVID-19 conspiracy theories. The game is based on research showing it can be more effective to teach people how to identify conspiracy theories or misinformation before they encounter them. This is a fantastic idea, and I love the fact that it is being done in coordination with research to test whether it is effective.

The current game is called Go Viral. It puts the player in the role of someone spreading conspiracy theories about the pandemic, whose goal is to make the misinformation go as viral as possible. In this way players learn the deceptive tactics of those who spread such misinformation by using those tactics themselves. This reminds me of magicians who are skeptics. They have learned the techniques of deception and have experienced how easy it can be to deceive people. Stage magic is essentially the practical art of misdirection, which exploits many of the weaknesses in our ability to perceive and construct an experience of what is happening. This puts magicians in a perfect position to detect deceptive practices on the part of others.

James Randi, for example, made a career out of exactly this. He has caught faith healers using standard mentalist tricks to deceive their audiences. One example is the one-ahead trick. You have everyone fill out a “prayer card” with their basic information and what they want to pray for. All the cards are sealed in envelopes and placed in a bowl, and the preacher draws them one by one, “predicting” what each one will say before opening the envelope and “revealing” that they were correct. The audience is flabbergasted as the preacher, by seemingly divine means, knows all about them. In reality, the preacher is simply stating what they just read on the previous card. If you are a magician, this technique is easy to detect – and now you can detect it much more easily, because I just told you about it.
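The mechanics of the trick are simple enough to sketch in a few lines of code. This is purely an illustrative simulation (the function name and card contents are my own invention, not from any actual performance): the performer starts with one secretly known card, and every spoken “prediction” is really the content of the envelope opened one step earlier.

```python
def one_ahead(drawn, planted):
    """Simulate the one-ahead trick.

    drawn:   card contents in the order the envelopes are opened,
             with the performer's planted card drawn last.
    planted: the one card the performer secretly knows in advance.

    Returns the "predictions" the performer announces. Each one is
    simply the content of the previously opened envelope - the
    audience never notices the one-step offset, because each envelope
    is opened only *after* the "prediction" is announced.
    """
    predictions = []
    known = planted                # secretly learned before the show
    for card in drawn:
        predictions.append(known)  # confidently "predict" known info
        known = card               # open the envelope, read the next answer
    return predictions

cards = ["heal my knee", "find a job", "bless my son", "PLANT"]
print(one_ahead(cards, "PLANT"))
# Each announced "prediction" is the previous envelope's content,
# starting with the planted card.
```

No divine knowledge required: the performer is always exactly one envelope ahead of the audience.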

The Go Viral game preps players by giving them the same experience – imagine, for example, that you are playing the preacher and you actually do the one-ahead trick, and see for yourself how amazingly effective so simple a deception can be. The Go Viral game, in turn, is based on a previous pre-COVID iteration that was more generically about misinformation, called Bad News. The researchers have already published results with this game:

The game draws on an inoculation metaphor, where preemptively exposing, warning, and familiarising people with the strategies used in the production of fake news helps confer cognitive immunity when exposed to real misinformation. We conducted a large-scale evaluation of the game with N = 15,000 participants in a pre-post gameplay design. We provide initial evidence that people’s ability to spot and resist misinformation improves after gameplay, irrespective of education, age, political ideology, and cognitive style.

The study found that even a single play of the game reduced susceptibility to fake news by an average of 21%, and the effect was sustained for at least three months. The usual caveats apply – this was one study, although with a large N, and the authors acknowledge a number of limitations. But the results are in line with existing research. They definitely need to continue researching this question – there always seems to be more nuance when it comes to complex human behaviors, and many possible confounding factors. These results, however, also benefit from being extremely plausible, and again, in line with existing research.

This result is plausible for a number of reasons, including what I discussed above about magicians and skepticism. It also matches the general experience that having critical thinking skills, and specific knowledge about how pseudoscience and conspiracy theories operate, is an effective protection against novel deceptions. And it is consistent with what psychologists have found regarding confirmation bias and motivated reasoning. Once we hold a belief, we tend to engage in a number of cognitive behaviors biased toward maintaining that belief; chief among them are confirmation bias and motivated reasoning. The former operates in the background of our perception, receptivity, and memory. In short, we notice, accept, and remember bits of information that seem to confirm what we already believe (or want to believe), and we tend to ignore, forget, or explain away bits of information that contradict what we believe.

Motivated reasoning is a more general term for the cognitive gymnastics we are willing to go through in order to maintain a desired belief. We can happily twist logic into pretzels if we need to in order to support those beliefs. The motivation is often provided by cognitive dissonance – the negative feeling that results from holding two conflicting beliefs at the same time, or from being confronted with information that challenges our beliefs or identity.

Therefore, it stands to reason that it should be more effective to “pre-bunk” a belief by giving someone the tools to recognize it for what it is than to try to disabuse someone of a belief they have already formed. If someone has already entertained a bit of fake news, maybe even spread it to others or stated it as fact to friends or family, cognitive dissonance would kick in if they later found out it was indeed fake. But if they recognize it as fake up front, then they get to feel good about themselves for spotting it and weeding it out.

If humans were perfectly logical creatures, none of this would matter. All that would matter is the quality of the information and what facts and logic dictate. But we are also emotional creatures with a host of biases. Pre-bunking can be viewed as a technique for avoiding cognitive biases that would favor disinformation, and turning those biases around so that they favor critical thinking. This is basically skepticism in a nutshell.
