Sep 23 2019

Outrage, Bias, and the Instability of Truth

One experience I have had many times involves colleagues or friends who are not aware of the problem of pseudoscience in medicine. At first they are in relative denial – "It can't be that bad." "Homeopathy cannot possibly be that dumb!" Once I have had an opportunity to explain it to them sufficiently and they see the evidence for themselves, they go from denial to outrage. They swing from one extreme to the other, thinking pseudoscience is hopelessly rampant, and even feeling nihilistic.

This general phenomenon is not limited to medical pseudoscience, and I think it applies broadly. We may be unaware of a problem, but once we learn to recognize it we see it everywhere. Confirmation bias kicks in, and we initially overapply the lessons we recently learned.

I have this problem when I give skeptical lectures. I can spend an hour discussing publication bias, researcher bias, p-hacking, the statistics about error in scientific publications, and all the problems with scientific journals. At first I was a little surprised at the questions I would get, expressing overall nihilism toward science in general. I inadvertently gave the wrong impression by failing to properly balance the lecture. These are all challenges to good science, but good science can and does get done. It’s just harder than many people think.

This relates to Aristotle's philosophy of the mean – virtue is often a balance between two extreme vices. Similarly, I find there is often a nuanced position on many topics balanced precariously between two extremes. We can neither trust science and scientists implicitly, nor dismiss all of science as hopelessly biased and flawed. Freedom of speech is critical for democracy, but that does not mean freedom from the consequences of your speech, or that everyone has a right to any venue they choose.

A recent Guardian article about our current post-truth world reminded me of this philosophy of the mean. To a certain extent, society has gone from one extreme to the other when it comes to facts, expertise, and trusting authoritative sources. This is a massive oversimplification, and of course there have always been people everywhere along this spectrum. But there does seem to have been a shift. In the pre-social media age most people obtained their news from mainstream sources that were curated and edited. Talking head experts were basically trusted, and at least the broad center had a source of shared facts from which to debate.

Social media, however, democratized the production and curating of information. The Guardian points out that it also gave the public more direct access to information. This creates the powerful illusion of knowledge – I have a world of facts at my fingertips, with no middleman to get in the way or decide which facts to feed me. Trust in experts and authority has collapsed. This also played directly into human psychology. Dunning (of Dunning-Kruger fame) pointed out:

An ignorant mind is precisely not a spotless, empty vessel, but one that’s filled with the clutter of irrelevant or misleading life experiences, theories, facts, intuitions, strategies, algorithms, heuristics, metaphors, and hunches that regrettably have the look and feel of useful and accurate knowledge.

Social media was basically jet fuel to that process. Now, who needs experts? Or – you can choose your own experts, who are all equally accessible online, not relegated to the fringe of conspiracy tabloids. The illusion of knowledge is now more powerful than ever.

There is no going back – the social media genie is out of the bottle. People now have more direct access (or at least what feels like direct access) to information, and they are not going to give that up and have Walter Cronkite tell them what the news is. But we can get to a more nuanced position in the middle. Sure, experts are biased and sometimes wrong; journalists are imperfect, they make mistakes, and they go for sensationalism and click-bait. Powerful people and corporations will use their money to influence the system.

But – this does not mean the system is necessarily rigged to such an extent that it does not work at all. What's interesting, as the Guardian article points out, is that people from all across the political and belief spectrum (essentially everyone) think the system is rigged, but always against them. The left thinks the system is rigged against them, and so does the right, and so do the moderates. They can't, of course, all be right. But they can all be partially right, and partially wrong. There are biases in the system, but there are biases in every direction. This doesn't mean there aren't problems or imbalances that need to be fixed. What it does mean is that you cannot dismiss any fact you find inconvenient as fake news promoted by biased experts working for the opposition.

Unfortunately the solution requires a lot of effort. It is not just a mean sitting between two extremes; the mean sits on top of a hill and is easily pushed downhill in either direction. It is a high-energy state that requires work to maintain.

That work involves the various processes of critical thinking. We have to evaluate experts, authorities, and claims based upon objective criteria – facts and logic. But more than this (because even flat-earthers think they do this) we need to step back from our beliefs and our own biases and try to chart as objective a path as possible. We have to try to prove ourselves wrong. We need to divorce our own identities and sense of worth and tribalism from any particular conclusion, and take pride instead in the validity of the process.

We need to do all these things and a hundred more, because critical thinking is not easy. That's why, I think, many people tend to fall into the low-energy state of the extremes. It's easy to have a simple narrative that explains everything, for your side to be always right and the other side to be always wrong. It's easy to think of the other side as being evil, because then you don't have to take their perspective seriously. You never have to admit that perhaps they have a point, even if you disagree with their conclusion. You don't have to confront the perhaps uncomfortable truth that there are different value systems that are equally valid. You don't have to work to understand the other side; you can simply rail against the simplistic cartoon you have been presented with.

You also get to feed off the emotions of outrage, anger, and self-righteousness. And you never have to experience the negative emotions associated with cognitive dissonance, of confronting information or opinions which conflict with your own.

In the past all of this was still necessary, or at least optimal, but the demands on the average person were far lighter. In essence, a lot of the work was done for us by institutions. The information Cronkite delivered every night was carefully vetted and researched, and editors curated the information for relevance and importance. It wasn't perfect, and the downside was that everyone was subject to the perspective, biases, and agenda of the mainstream media. But here's the thing – everyone is still subject to the biases, perspective, and agenda of the media, but now it is a million small media outlets. The main difference is that people have a more powerful illusion that they are well informed, that they are peeking behind the curtain. They are still largely regurgitating narratives that someone else gave to them, but they feel more empowered.

This is what leads people to confidently conclude that the world is flat. It is an intellectually low-energy state at the bottom of a steep hill.
