Oct 10 2024

Confidently Wrong

How certain are you of anything that you believe? Do you even think about your confidence level, and do you have a process for determining what it should be, or do you just follow your gut feelings?

Thinking about confidence is a form of metacognition – thinking about thinking. It is something, in my opinion, that we should all do more of, and it is a cornerstone of scientific skepticism (and of all good science and philosophy). As I like to say, our brains are powerful tools – our most important and most versatile tools for understanding the universe. So it's extremely useful to understand how they work, including all their strengths, weaknesses, and flaws.

A recent study focuses on one tiny slice of metacognition, but an important one – how we form confidence in our assessment of a situation or a question. More specifically, it highlights the illusion of information adequacy, yet another form of cognitive bias. The experiment divided subjects into three groups – one group was given one half of the information about a specific situation (the information that favored one side), while a second group was given the other half. The control group was given all the information. Subjects were then asked to evaluate the situation and rate how confident they were in their conclusions. They were also asked if they thought other people would come to the same conclusion.

You can probably see this coming – the subjects in the test groups receiving only half the information felt that they had all the necessary information to make a judgement and were highly confident in their assessment. They also felt that other people would come to the same conclusion as they did. And of course, the two test groups came to the conclusion favored by the information they were given.

The researchers conclude (reasonably) that the main problem here is that the test groups assumed that the information they had was adequate to judge the situation – the illusion of information adequacy. This, in turn, stems from the well-documented phenomenon that people generally don't notice what is not there, or at least find it a lot more difficult to notice the absence of something. Assuming they have all relevant information, the answer then seems obvious – whichever position is favored by the information they are given. In fact, the test groups were more confident in their answers than the control group. The control group had to balance conflicting information, while the test groups were unburdened by any ambiguity.

There are some obvious parallels to the real world here. There is a lot of discussion about how polarized the US has become in recent years. Both sides appear highly confident that they are right, that the other side has lost their collective mind, and that nothing short of total political victory at any cost will suffice. This is obviously a toxic situation for any democracy. Experts debate the exact causes of this polarization, but there is one very common theme – the two sides are largely siloed in different "information ecosystems". This is the echo chamber effect. If you listen mainly or only to partisan news, then you are getting one half of the story, the half that supports your side. You will have the illusion that you have all the information, and in light of that information the conclusion is obvious, and anyone who disagrees must have dark motives or be mentally defective in some way.

I have seen this effect in many skeptical contexts as well. After watching or reading a work that presents only half the story – the case for one side of a controversy – many people are convinced. They think they now understand the situation, and feel that such a large amount of information has to add up to something. I have had many discussions, for example, with people who have read books like The Aquatic Ape, which argues that humans went through an evolutionary period of adaptation to aquatic life. It's all nonsense and wild speculation, without any actual science, but it's hard not to be persuaded by a book-length argument if you don't already have the background to put it into context. The same happened with many people who watched the movie Loose Change.

This is why it is a good rule of thumb to suspend judgement when you encounter such claims and arguments. Professionals in investigative fields learn to do this as part of their deliberate analytical process. What am I not being told? What information is missing? What do those who disagree with this position have to say? What's the other side of the story?

This is a good intellectual habit to have, and it is also a cornerstone of good skepticism. Who disagrees with this claim, and why? In everyday life it is a good idea to have diverse sources of information, and in fact to seek out information from the "other side". For political news, no one source can be adequate, although some sources are better than others. Not all news sources are equally partisan and biased. It's a good idea to seek out news sources that are generally considered (and may even have been rated) to be less partisan and more balanced in their reporting. But it is also a good idea to use multiple sources of news, and to specifically consume news that is of reasonable quality but comes from a different position than your own. What is the other side saying, and why? It may be painful and uncomfortable at times, but that is a good reason to do it.

It’s good to know that there is a bias towards the illusion of information adequacy, because with that knowledge you can work against it. In the study, when the test subjects were given the other half of the information that they were initially missing, many of them did change their minds. This is something else we often see in psychological studies – humans are generally rational by default, and will listen to information. But this is true only as long as there is no large emotional stake. If their identity, tribe, ego, or fundamental world view is at stake, then rationality gives way to motivated reasoning.

This is why it is extremely useful (although also extremely difficult) to have no emotional stake in any claim. The only stake a rational person should have is in the truth. Your identity should be as an objective truth-seeker, not as a partisan of any kind. Also, there should be no loss of ego from being wrong, only from failing to change your mind in light of new evidence. This is a rational ideal, and no one achieves it perfectly, but it's good to have a goal.

At least it's good to be engaged in metacognition, and to think about your thought process and everything that might be biasing it. This includes information and perspectives that might be missing. What's missing is the most difficult thing to detect, so it requires special attention.
