Feb 10 2025

Who Believes Misinformation

It’s probably not a surprise that the author of a blog dedicated to critical thinking and neuroscience considers misinformation one of the most significant threats to society, but I really do think this. Misinformation (false, misleading, or erroneous information) and disinformation (deliberately misleading information) have the ability to cause a disconnect between the public and reality. In a democracy this severs the feedback loop between voters and their representatives. In an authoritarian government it is a tool of control and repression. In either case citizens cannot freely choose their representatives. This is also the problem with extreme gerrymandering – in which politicians choose their voters rather than the other way around.

Misinformation and disinformation have always existed in human society, and it is an interesting question whether or not they have increased recently and to what extent social media has amplified them. Regardless, it is useful to understand what factors contribute to susceptibility to misinformation in order to make people more resilient to it. We all benefit if the typical citizen has the ability to discern reality and identify fake news when they see it.

There has been a lot of research on this question over the years, and I have discussed it often, but it’s always useful to try to gather together years of research into a single systematic review and/or meta-analysis. It’s possible I and others may be selectively choosing or remembering parts of the research to reinforce a particular view – a problem that can be solved with a thorough analysis of all existing data. And of course I must point out that such reviews are subject to their own selection bias, but if properly done such bias should be minimal. The best case scenario is for there to be multiple systematic reviews, so I can get a sense of the consensus of those reviews, spreading out bias as much as possible in the hopes it will average out in the end.

With that in mind, there is a recent meta-analysis of studies looking at the demographics of susceptibility to misinformation. The results mostly confirm what I recall from looking at the individual studies over the years, but there are some interesting wrinkles. They looked at studies that used the news headline paradigm – having subjects judge whether a headline is true or not – “totaling 256,337 unique choices made by 11,561 participants across 31 experiments.” That’s a good chunk of data. First, people were significantly better than chance at determining which headlines were true (68.51%) or false (67.24%). That’s better than a coin flip, but still, about a third of the time subjects in these studies could not tell real from fake headlines. Given the potential number of false headlines people encounter daily, this can result in massive misinformation.
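One way to make sense of those two percentages is the standard signal-detection framing, which separates how well people discriminate true from false headlines (sensitivity, d′) from any overall lean toward answering “true” or “false” (criterion, c). The sketch below applies that framing to the study’s reported rates; the framing itself is my illustration, not necessarily how the meta-analysis computed its statistics.

```python
from statistics import NormalDist

# Rates reported in the meta-analysis (used here illustratively):
# - 68.51% of true headlines correctly judged true  -> hit rate
# - 67.24% of false headlines correctly judged false
hit_rate = 0.6851
false_alarm_rate = 1 - 0.6724  # false headlines mistakenly judged true

z = NormalDist().inv_cdf  # probit (inverse normal CDF) transform

# d' (sensitivity): ability to tell true from false; 0 would be chance
# c  (criterion):   response bias; negative leans "true", positive leans "false"
d_prime = z(hit_rate) - z(false_alarm_rate)
criterion = -(z(hit_rate) + z(false_alarm_rate)) / 2

print(f"d' = {d_prime:.3f}")   # ~0.93: clearly better than chance
print(f"c  = {criterion:.3f}")  # ~-0.02: essentially no overall bias
```

On these aggregate numbers, the pooled sample shows real discrimination ability with almost no net bias in either direction; the bias differences discussed below show up within subgroups.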

What factors contributed to susceptibility to misinformation, or protected against it? One factor that many people may find surprising, but which I have seen many times over the years, is that education level alone conveyed essentially no benefit. This also aligns with the pseudoscience literature – education level (until you get to advanced science degrees) does not protect against believing pseudoscience. You might also (and I do) view this as a failure of the education system, which is supposed to be teaching critical thinking. This does not appear to be happening to any significant degree.

There were some strong predictors. People with an analytical thinking style were more accurate on both counts – identifying true and false headlines – but with a bit of a false headline bias. This factor comes up often in the literature. An analytical thinking style also correlates with lower belief in conspiracy theories, for example. Can we teach an analytical thinking style? Yes, absolutely. People differ in their inherent tendency to rely on analytical vs. intuitive thinking, but almost by definition analytical thinking is a conscious, deliberate act and is a skill that can be taught. Perhaps analytical thinking is the thing that schools are not teaching students but should be.

Older age was also associated with higher overall discrimination, as well as with a false headline bias, meaning that older subjects’ default was to be skeptical rather than believing. It’s interesting to think about the interplay between these two things – in a world with mostly false headlines, having a strong skeptical bias will lead to greater accuracy. Disbelieving becomes a good first approximation of reality. The research, as far as I can see, did not attempt to replicate reality in terms of the proportion of true to false headlines. This means that the false bias may be more or less useful in the real world than in the studies, depending on the misinformation ecosystem.
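The base-rate point can be made concrete with a little arithmetic. Below, two hypothetical readers have the same total skill but distribute it differently: one is balanced, one leans skeptical. All the numbers are illustrative assumptions of mine, not figures from the study; the point is only that a skeptical bias pays off exactly when false headlines dominate the feed.

```python
def expected_accuracy(p_false, acc_true, acc_false):
    """Overall accuracy given the share of false headlines in the feed
    and a reader's accuracy on true vs. false headlines."""
    return (1 - p_false) * acc_true + p_false * acc_false

# Hypothetical readers (illustrative numbers, not from the study):
balanced = dict(acc_true=0.70, acc_false=0.70)  # no bias
skeptic = dict(acc_true=0.60, acc_false=0.80)   # same total skill, tilted toward "false"

for p_false in (0.2, 0.5, 0.8):
    b = expected_accuracy(p_false, **balanced)
    s = expected_accuracy(p_false, **skeptic)
    print(f"{p_false:.0%} false headlines: balanced {b:.0%}, skeptic {s:.0%}")
```

With 20% false headlines the skeptic does worse (64% vs. 70%); with 80% false headlines the skeptic does better (76% vs. 70%). Whether the older subjects’ false bias helps in the real world depends entirely on which world they are scrolling through.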

Self-identifying as a Democrat also correlated with greater accuracy and a false bias, while self-identifying as a Republican was associated with lower accuracy and a truth bias (a tendency to believe headlines were true). Deeply exploring why this is the case is beyond the scope of this article (this is a complex question), but let me just throw out a couple of the main theories. One is that Republicans are already self-selected for some cognitive features, such as intuitive thinking. Another is that the current information landscape is not uniform from a partisan perspective, and is essentially selecting for people who tend to believe headlines.

Some other important factors emerged from this data. One is that a strong predictor of believing headlines was partisan alignment – people tended to believe headlines that aligned with their self-identified partisan label. This is due to “motivated reflection” (what I generally refer to as motivated reasoning). The study also confirmed something I have encountered previously – that those with higher analytical thinking skills actually displayed more motivated reasoning when combined with partisan bias. Essentially, smarter people have the potential to be better and more confident at their motivated reasoning. This is a huge reason for caution and humility – motivated reasoning is a powerful force, and being smart not only does not necessarily protect us from it, but may make it worse.

Finally, the single strongest predictor of accepting false headlines as true was familiarity. If a subject had encountered the claim previously, they were much more likely to believe it. This is perhaps the most concerning factor to come out of this review, because it means that mere repetition may be enough to get most people to accept a false reality. This has big implications for the “echo chamber” effect on both mainstream and social media. If you get most of your news from one or a few ideologically aligned outlets, you are essentially allowing them to craft your perception of reality.

From all this data, what (individually and as a society) should we do about this, if anything?

First, I think we need to seriously consider how critical thinking is taught (or not taught) in schools. Real critical thinking skills need to be taught at every level and in almost every subject, but also as a separate dedicated course (perhaps combined with some basic scientific literacy and media savvy). Hey, one can dream.

The probability of doing something meaningful in terms of regulating media seems close to zero. That ship has sailed. The fairness doctrine is gone. We live in the proverbial wild west of misinformation, and this is not likely to change anytime soon. Therefore, individually, we can protect ourselves by being skeptical, working our analytical thinking skills, checking our own biases and motivated reasoning, and not relying on a few ideologically aligned sources of news. One good rule of thumb is to be especially skeptical of any news that reinforces your existing biases. But dealing with a societal problem on an individual level is always a tricky proposition.
