Nov 03 2017

Consistency Bias

“Oceania was at war with Eurasia; therefore Oceania had always been at war with Eurasia.”

– George Orwell

In Orwell’s classic book, 1984, the totalitarian state controlled information and used that power to obsessively manage public perception. One perception it insisted upon was that the state was consistent – never changing its mind or contradicting itself. This insistence, in turn, rests on the premise that changing one’s mind is a sign of weakness – an admission of prior error or fault.

Unsurprisingly, a recent study shows that our perceptions of our own prior beliefs are biased in a way that minimizes apparent change. The exact reason for this bias was not part of the study.

Researchers surveyed subjects as to their beliefs regarding the effectiveness of corporal punishment for children. This topic was chosen based on the assumption that most subjects would have little knowledge of the actual literature and would not have strongly held beliefs. Subjects were then given articles to read making the case for the effectiveness or ineffectiveness of spanking (either consistent with or contrary to their prior beliefs), and then their beliefs were surveyed again.

Predictably, the researchers found that after reading a text inconsistent with their prior stated beliefs, most subjects changed their minds – their beliefs moved in the direction of the new information. This is consistent with the research on moderately held beliefs, as opposed to beliefs that are strongly held or part of one’s identity.

Further – and this was the real focus of the study – the researchers asked the subjects to recall their prior beliefs on the effectiveness of spanking. They found that many subjects misremembered their prior beliefs, biased in the direction of their current beliefs. Moreover, the greater the difference between prior and current beliefs, the greater the bias in their memory for those prior beliefs.

Therefore, we will tend to underestimate the degree to which our minds have been changed by learning new information – at least for moderately held beliefs. Why does this happen?

Again, this study does not directly address the question of why, but here are some possibilities. The authors gave a straightforward interpretation: we lack metacognitive awareness of our prior beliefs, and our estimates of those prior beliefs are affected by our current beliefs. It may therefore be as simple as estimating prior beliefs from current beliefs, without necessarily any deeper emotional cause.

However, we can certainly speculate that deeper emotional causes are present. It is possible that a disconnect between prior beliefs and current beliefs causes cognitive dissonance, and that we partly relieve that dissonance by simply adjusting our memory to reduce the magnitude of the change.

There may also be an element of impression management – we consciously and subconsciously attempt to manage others’ perceptions of ourselves. We do not like to appear inconsistent, and so we pretend our prior beliefs were closer to our current beliefs to minimize apparent inconsistency.

I do wonder how much of this apparent need to minimize the degree to which we have changed our mind is cultural. Even if cognitive dissonance in the face of change is the “default mode” of human psychology, it seems to me that it would be advantageous to alter this by culture and education.

In other words – we should cultivate an attitude in which changing one’s mind in the face of new information is not socially embarrassing, does not imply weakness, and is not something that needs to be covered over. In fact, gleefully altering one’s beliefs to accommodate new information should be a badge of honor, a sign that one is intellectually honest and courageous.

I also think that a related phenomenon is a willingness to suspend opinion or judgement. It is OK to say that we don’t know enough about a topic to have a strong opinion, or any opinion at all. This represents appropriate humility in the face of our own ignorance. No one can know everything, and admitting ignorance should not be shameful. Again, it represents intellectual honesty. Pretending knowledge one does not have is the real vice.

These traits should be cultivated, specifically taught, and celebrated – humility and honesty in the face of one’s own ignorance, and pride in the ability to appropriately change one’s mind in the face of new knowledge. Part of this will likely require cultivating the language necessary to express these positions – stating one’s opinions as tentative, acknowledging the limitations of our knowledge, and the potential depth of our ignorance.

It is possible, however, that even with a mature understanding of these intellectual virtues, our memories may still fail us. This study suggests that our memories for prior beliefs may themselves be inaccurate and biased in the direction of minimizing change. All we can do, therefore, is be aware of this bias in our memories and try to adjust for it. This may come up when other people have a different memory of our prior positions than our own. We need to acknowledge that their memories may be closer to the truth, because of this bias.

This is yet another reason to be suspicious of our own memories. Overall, the research shows that our memories serve thematic goals first and foremost, and details are adjusted accordingly. Therefore, be humble in the knowledge of the fallibility of your own memory.

Of course, if new research shows something different, I will happily change my beliefs.
