Nov 03 2017

Consistency Bias

“Oceania was at war with Eurasia; therefore Oceania had always been at war with Eurasia.”

– George Orwell

In Orwell’s classic book, 1984, the totalitarian state controlled information and used that power to obsessively manage public perception. One perception it insisted upon was that the state was consistent – never changing its mind or contradicting itself. This insistence, in turn, is based on the premise that changing one’s mind is a sign of weakness – an admission of prior error or fault.

Unsurprisingly, a recent study shows that our perceptions of our own prior beliefs are biased to minimize apparent change. The exact reason for this bias was not part of the study.

Researchers surveyed subjects as to their beliefs regarding the effectiveness of corporal punishment for children. This topic was chosen based on the assumption that most subjects would have little knowledge of the actual literature and would not have strongly held beliefs. Subjects were then given articles to read making the case for the effectiveness or ineffectiveness of spanking (either consistent with or contrary to their prior beliefs), and then their beliefs were surveyed again.

Predictably, the researchers found that after reading a text inconsistent with their prior stated beliefs, most subjects changed their minds. Their beliefs moved in the direction of the new information. This is consistent with the research on beliefs that are moderately held, and not necessarily beliefs that are strongly held or part of one’s identity.

Further, the researchers asked the subjects to indicate their prior beliefs on the effectiveness of spanking – this was the real focus of the study. They found that many subjects misremembered their prior beliefs, biased in the direction of their current beliefs. Also, the greater the difference between prior and current beliefs, the greater the bias in their memory of their prior beliefs.

Therefore, we will tend to underestimate the degree to which our minds have been changed by learning new information – at least for moderately held beliefs. Why does this happen?

Again, this study does not directly address the question of why, but here are some possibilities. The authors gave a straightforward interpretation, that we lack metacognitive awareness of our prior beliefs, and that our estimates of prior beliefs are affected by our current beliefs. Therefore it may be as simple as estimating prior beliefs based on current beliefs, without necessarily a deeper emotional cause.

However, we can certainly speculate that deeper emotional causes are present. It is possible that a disconnect between prior beliefs and current beliefs causes cognitive dissonance, and that we partly relieve that dissonance by simply adjusting our memory to reduce the magnitude of the change.

There may also be an element of impression management – we consciously and subconsciously attempt to manage others’ perceptions of ourselves. We do not like to appear inconsistent, and so will pretend our prior beliefs were closer to our current beliefs to minimize apparent inconsistency.

I do wonder how much of this apparent need to minimize the degree to which we have changed our mind is cultural. Even if cognitive dissonance in the face of change is the “default mode” of human psychology, it seems to me that it would be advantageous to alter this by culture and education.

In other words – we should cultivate an attitude in which changing one’s mind in the face of new information is not socially embarrassing, does not imply weakness, and is not something that needs to be covered over. In fact, gleefully altering one’s beliefs to accommodate new information should be a badge of honor, a sign that one is intellectually honest and courageous.

I also think that a related phenomenon is a willingness to suspend opinion or judgement. It is OK to say that we don’t know enough about a topic to have a strong opinion, or any opinion at all. This represents appropriate humility in the face of our own ignorance. No one can know everything, and admitting ignorance should not be shameful. Again, it represents intellectual honesty. Pretending knowledge one does not have is the real vice.

These traits should be cultivated, specifically taught, and celebrated – humility and honesty in the face of one’s own ignorance, and pride in the ability to appropriately change one’s mind in the face of new knowledge. Part of this will likely require cultivating the language necessary to express these positions – stating one’s opinions as tentative, acknowledging the limitations of our knowledge, and the potential depth of our ignorance.

It is possible, however, that even with a mature understanding of these intellectual virtues, our memories may still fail us. This study suggests that our memories for prior beliefs may themselves be inaccurate and biased in the direction of minimizing change. All we can do, therefore, is be aware of this bias in our memories and try to adjust for it. This may come up when other people have a different memory of our prior positions than our own. We need to acknowledge that their memories may be closer to the truth, because of this bias.

This is yet another reason to be suspicious of our own memories. Overall the research shows that our memories serve thematic goals first and foremost, and details are adjusted accordingly. Therefore, be humble in the knowledge of the fallibility of your own memory.

Of course, if new research shows something different, I will happily change my beliefs.



9 thoughts on “Consistency Bias”

  1. Lobsterbash says:

    I wonder to what extent the participants of the study weren’t misremembering their prior beliefs so much as not really having a developed opinion on the matter, providing an answer because they had to pick one, and then latching onto the new, provided information, which primed their brains.

    Stated another way, it’s possible that a person’s fuzzy opinion on something like this might have pros and cons behind their thinking, and perhaps on the matter of spanking many of the participants were wondering for themselves if it’s harmful to children, developmentally, but weren’t convinced. If confirmed, it might make sense that people would remember their skepticism on the matter and pick that as being their thought process.

    I definitely agree with your prescriptions in the 2nd half of the blog post, but am wondering about the merits of extrapolating from the study.

  2. Nidwin says:

    “This is yet another reason to be suspicious of our own memories. Overall the research shows that our memories serve thematic goals first and foremost, and details are adjusted accordingly. Therefore, be humble in the knowledge of the fallibility of your own memory.”

    Go tell this to our kin out there and good luck with the
    “my memory is fine, it’s you that have a problem and should see a doctor”

    I’ve no issues admitting when I’m wrong because I didn’t remember stuff properly or had a different and wrong idea on a specific subject. But I’m a tear in an ocean, as most folks I know would blatantly lie to my face.

  3. Art Eternal says:

    See Salvador Dali’s The Persistence of Memory and The Disintegration of the Persistence of Memory for clarity.

  4. tmac57 says:

    This may tie into something I was thinking about the other day concerning what is the best approach in presenting evidence to support your argument to another person who is well entrenched in their own position?
    I had this notion that it might be better to just point the person in the direction of the evidence without too strong of an expressed opinion about whether or not it would knock down their view. I had a feeling that I might be causing them too much discomfort, to the point of pushing them to become even more dug in, and less likely to accept the disconfirming information.
    If this study is true, that would seem to support that approach, as it would give people some space to take in the new evidence, sit with it awhile, and then accept it without the fear of letting go of their ‘consistency’. Kind of a face saving move.

  5. BaS says:

    I wonder whether this effect would be weaker in people who make a habit of exercising scientific skepticism and try to cultivate neuro-psychological humility in themselves. Or would we find no difference, that it’s a reflex we don’t route around because it’s happening beneath our notice?

  6. daedalus2u says:

    I have noticed this in myself, that when I update an understanding of something, my previous understanding slips away and is lost.

    I see this as beneficial. I don’t want to hold onto old and outdated understandings when something better and more reliable is available.

    Usually I don’t notice when the updating happens. Usually I can only notice it when the updating is actually going on; usually only when I have updated an understanding in one area and have not thought about the implications of that in other areas. Then when I start thinking about the other areas, they start updating as I am thinking about them. I might be able to suppress the updating if I wanted to, but I don’t want to do that.

    I think this is the same kind of thing that happens with people like YECs, when they are confronted with an argument. They may accept point after point, but when the conclusion threatens their YEC belief, they “update” and reject previously accepted points to “rescue” their YEC conclusions.

  7. BillyJoe7 says:

    SN: “Unsurprisingly our perceptions of our own prior beliefs are biased in order to minimize apparent change”

    Somehow I can’t seem to apply this to my own case.

    I clearly remember being a religious fanatic in my youth. I could go into details but I’ll avoid the embarrassment. I also believed absolutely in levitation and wasted a lot of time trying to achieve this state. I once thought mysticism was a going thing but could never understand the numerous books I read on the subject (for good reason, as I now realise). In more recent history, I dismissed climate change, and distrusted GMOs. On the other hand, I was never a creationist, but once exposed to the modern theory of evolution I immediately realised how ingeniously explanatory it was about life on Earth. Of course, I had lost my religious fanaticism by then and, in fact, any belief in gods and afterlives.

    I think it is important to remember where you came from and how you got to here from there.

  8. expblast says:

    I too remember every shift in beliefs I’ve had over many decades. From religious conservative to liberal atheist to moderate atheist (the non-believing part seems to be a stalwart). I’m glad I didn’t forget those things. It helps me with understanding the people still entrenched in the beliefs I once held. I think it’s less about the believing and more about the methodology of obtaining the belief system. That is where skepticism is key.

  9. Maculus says:

    In these 2 examples, it seems to me that you changed identity along with your opinions. I don’t have any expertise here, but I think most people don’t want to feel that their identity has changed. When people are proud of the change, then I’d think they’d welcome the idea that their opinions changed.
