Mar 27 2007

The Well-Insulated Belief

Scientific skepticism, at its core, is about good cognitive hygiene – having respect for the evidence, using valid logic and avoiding fallacies, and having respect for the various mechanisms of self-deception and pitfalls of human thinking. These are all generic and undeniable intellectual virtues – not unique to science or skepticism. No one, not even the most strident true believer, openly advocates the use of logical fallacies or unsound arguments. Gullibility and true belief, rather, result from a significant lack of understanding of these common mental foibles.

Combined with errors in thinking is a profound human need to believe. The result is the tendency to arrive at beliefs for cultural or emotional reasons, and then to commit errors in thinking in order to defend and maintain those beliefs. The overall effect is to insulate belief from falsification.

Susan Blackmore equates such beliefs to mental viruses – memes, to use the term coined by Richard Dawkins. She argues that beliefs (memes) have undergone Darwinian evolution over the course of human cultural history. Those beliefs that are more psychologically attractive have a greater rate of infection. Further, those beliefs that better insulate themselves from disproof have a greater survival rate. Some beliefs incorporate elements that anticipate future attacks and protect against them (like HIV attacking the immune system). Others make themselves more infective by including mechanisms for their own spread – the commandment to proselytize. I think the analogy is apt – there is differential survival for different versions of different beliefs, adaptive radiation, variation, mutation, replication, etc. All the elements of an evolving system are there, and the selective pressures are obvious.

The cure for mental viruses is good cognitive hygiene and inoculation with all the tools of skepticism.

I want to focus on one aspect of this contest between belief and skepticism – that of insulating beliefs against falsification. This is not a specific fallacy, but rather the end result of many fallacies, often working in concert. It is extremely common – even ubiquitous, something we all need to understand and guard against.

Subjective Validation and Confirmation Bias

These are the most common tools of belief insulation, and we all commit these errors to some extent. Subjective validation essentially means that we rely upon subjective (rather than quantitative or objective) assessment in order to validate a claim or belief. This is often used in reference to cold readings: the cold reader may ask, “Does the letter M have any significance for you?” and the subject might validate this statement by searching for any correlation – “Yes, my brother’s wife’s mother is named Mary.” Subjective validation can seem very compelling – most people are very impressed with the fact that they can find a correlation that appears to validate a statement or claim. This can lead people down the road to a firmly held belief that is entirely wrong. The problem with subjective validation is that it uses open-ended and subjective criteria, so it can be used to validate essentially anything, which means that it really validates nothing.

Confirmation bias means that we seek out and remember information that confirms what we already believe, and we do not seek out, we fail to notice, or we rationalize away information that contradicts our beliefs. Again, this can have a very powerful psychological effect on our beliefs, because it can make us feel that the evidence overwhelmingly supports our position.

Logical Fallacies

A number of logical fallacies can be employed to render a belief immune to disproof. My favorite is the “no true Scotsman” argument – a method for rejecting disconfirming evidence. The form it takes goes like this: “All Scotsmen play golf.” “But I have a friend who is Scottish, and he has never played golf.” “Well (here comes the rationalization), no true Scotsman would never play golf.” This is a form of circular reasoning. It defines the group as having the quality that the group is claimed to have, and then uses this definition to say that any member of the group lacking the quality is, by definition, not truly a member of the group. QED.

Special pleading is the all-purpose logical fallacy for defending belief systems. This is simply the strategy of inventing a new and special reason to explain away any disconfirming evidence or lack of confirming evidence. Each claim may, of itself, be plausible and even reasonable. The fallacy comes from the endless and open-ended way in which new excuses are invented as needed to defend one position.

Many beliefs incorporate a “get out of jail free” card in their repertoires. This is an all-purpose belief element that can be used to deal with any inconvenient facts. Creationists, for example, maintain that God created the world the way he wanted to, unencumbered by anything as mundane as the physical laws of the universe. Therefore, any and all evidence can be dismissed with the all-purpose explanation of – God wanted it that way. Since God can do anything, anything can be explained as simply the will of God. That’s total immunity.

More subtle versions of this abound as well. I have had conversations with extreme Libertarians (I don’t have anything against Libertarians, it’s just an example) who defend the notion that the free market is the best at everything. When I give an example of a free market that did not appear to work, their answer is, “Well, that wasn’t a truly free market,” or “if there were no government taxation or costly regulation the market would have sufficient money to solve X” – “X” being any objection you can think to raise. Again, in any specific case the argument may in fact be valid, but when used ad hoc to brush aside any and all contradictory evidence, it becomes a mechanism for belief-insulation.

Conspiracy Thinking

Conspiracy theorists are the masters at insulating their beliefs. The conspiracy model in itself incorporates the necessary elements to render any belief immune to contradiction. Any and all evidence that contradicts the conspiracy theory can be explained away as a fabrication of the conspirators. Any lack of evidence can likewise be explained as a cover up. There is therefore no evidence or lack thereof that can disprove the conspiracy theory.

They also have their get out of jail free cards. Any apparent contradiction or implausibility in a conspiracy can be handled by simply ceding more power and reach to the conspirators. Why hasn’t any one of the hundreds of people who must have been privy to the conspiracy come forward? Because “they” got to them – “they” used their power and influence to intimidate and silence them. When a conspiracy does not make sense on one level – just deepen it. Hey, we don’t know how deep the rabbit hole goes. This is why grand conspiracies evolve over time toward grander and deeper conspiracies, until you end up with an omnipotent cabal running the entire world (the Men in Black, the Illuminati).

An excellent recent example of this was heard on the Penn radio show when moon hoax conspiracy theorist Joe Rogan was debating Phil Plait, the Bad Astronomer. Phil asked a very reasonable question: why didn’t the Soviet Union, which (in the hoax scenario) certainly must have known that we couldn’t and didn’t go to the moon, expose the hoax? Why did they support it? Joe’s answer was essentially that he doesn’t know who really controls both these countries. Perhaps the real powers that be were in charge of both America and the Soviet Union. Just deepen the conspiracy, and all apparent contradictions magically go away. How come the liberal press hasn’t proven that the Bush Administration orchestrated 9/11? Because they are in on it too.

So my lesson for the day is to practice good cognitive hygiene, and be wary of those mental habits that tend to reinforce what you already believe and want to believe. Part of good practice is to occasionally play devil’s advocate – try to honestly knock down your own belief. Ask yourself – what do we really objectively know? And of course you have to understand fallacies and pitfalls in order to avoid them, and the skeptical community is the best place to learn them.

But this could just be an attempt by the skeptical meme to replicate itself and spread.