Search Results for "bias"

May 28 2020

Confidence Drives Confirmation Bias

Published under Neuroscience

Human thought processes are powerful but flawed, like a GPS system that uses broken algorithms to lead you to the wrong destination. Psychologists study these cognitive biases and heuristic patterns of thought to better understand these flaws and propose possible fixes to mitigate them. To a large degree, scientific skepticism is about exactly that – identifying and compensating for the flaws in human cognition.

Perhaps the mother of all cognitive biases is confirmation bias, the tendency to notice, accept, and remember information that confirms what we already believe (or perhaps want to believe), and to ignore, reject, or forget information that contradicts what we believe. Confirmation bias is an invisible force, constantly working in the background as we go about our day, gathering information and refining our models of reality. But unfortunately it does not lead us to accuracy or objective information. It drives us down the road of our own preexisting narratives.

One of the things that makes confirmation bias so powerful is that it gives us the illusion of knowledge, which falsely increases our confidence in our narratives. We think there is a ton of evidence to support our beliefs, and anyone who denies them is blind, ignorant, or foolish. But that evidence was selectively culled from a much larger set of evidence that may tell a very different story from the one we see. It’s like reading a book but making up your own story by only reading selective words, and stringing them together in a different narrative.

A new study adds to our understanding of confirmation bias. It not only confirms our selective processing of confirming information; it shows that confidence drives this process. So not only does confirmation bias lead to false confidence, but that confidence then drives more confirmation bias, in a self-reinforcing cycle.

No responses yet

Sep 23 2019

Outrage, Bias, and the Instability of Truth

One experience I have had many times is with colleagues or friends who are not aware of the problem of pseudoscience in medicine. At first they are in relative denial – “it can’t be that bad,” “homeopathy cannot possibly be that dumb!” Once I have had an opportunity to explain it to them sufficiently, and they see the evidence for themselves, they go from denial to outrage – from one extreme to the other, thinking pseudoscience is hopelessly rampant, and even feeling nihilistic.

This general phenomenon is not limited to medical pseudoscience, and I think it applies broadly. We may be unaware of a problem, but once we learn to recognize it we see it everywhere. Confirmation bias kicks in, and we initially overapply the lessons we recently learned.

I have this problem when I give skeptical lectures. I can spend an hour discussing publication bias, researcher bias, p-hacking, the statistics on error in scientific publications, and all the problems with scientific journals. At first I was a little surprised by the questions I would get, which expressed overall nihilism toward science in general. I had inadvertently given the wrong impression by failing to properly balance the lecture. These are all challenges to good science, but good science can and does get done. It’s just harder than many people think.

This relates to Aristotle’s philosophy of the mean – virtue is often a balance between two extreme vices. Similarly, I find there is often a nuanced position on many topics balanced precariously between two extremes. We can neither trust science and scientists implicitly, nor should we dismiss all of science as hopelessly biased and flawed. Freedom of speech is critical for democracy, but that does not mean freedom from the consequences of your speech, or that everyone has a right to any venue they choose.

A recent Guardian article about our current post-truth world reminded me of this philosophy of the mean. To a certain extent, society has gone from one extreme to the other when it comes to facts, expertise, and trusting authoritative sources. This is a massive oversimplification, and of course there have always been people everywhere along this spectrum. But there does seem to have been a shift. In the pre-social media age most people obtained their news from mainstream sources that were curated and edited. Talking head experts were basically trusted, and at least the broad center had a source of shared facts from which to debate.

No responses yet

Jul 19 2018

Developing Cognitive Biases in Young Children

Published under Neuroscience

I have discussed a number of cognitive biases over the years, based mostly on research in adults. For example, Kahneman and Tversky first proposed the representativeness heuristic in 1973. But at what age do children start using this heuristic?

A heuristic is essentially a mental shortcut. Such shortcuts are efficient and decrease our cognitive load, but they are imperfect and prone to error. In the representativeness heuristic we rely on social information and ignore numerical information when making probability judgments about people.

In the classic experiment subjects were given a description of a student’s personality, designed to fit the stereotype of an engineer. They were then asked how likely it was that the student was an engineering student. Many subjects answered that the student was likely an engineer, without considering the base rate – the percentage of students who are in engineering. Even when given base-rate information showing it was unlikely the student was an engineer, many subjects ignored the numbers and based their judgments entirely on the social information.
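The base-rate logic the subjects ignored can be made concrete with Bayes’ theorem. Here is a minimal sketch with invented numbers (the percentages are hypothetical, not taken from the original study), showing that a student who fits the engineer stereotype can still be unlikely to be an engineer once the base rate is factored in:

```python
def posterior(prior, hit_rate, false_alarm_rate):
    """P(engineer | fits stereotype) via Bayes' theorem."""
    # Total probability of fitting the stereotype, engineer or not
    evidence = hit_rate * prior + false_alarm_rate * (1 - prior)
    return hit_rate * prior / evidence

# Hypothetical numbers: 10% of students are engineers (the base rate),
# 80% of engineers fit the stereotype, but so do 30% of non-engineers.
p = posterior(prior=0.10, hit_rate=0.80, false_alarm_rate=0.30)
print(round(p, 2))  # 0.23 - still unlikely, despite fitting the stereotype
```

Subjects following the heuristic effectively answer with the 80% hit rate, while the low base rate drags the actual probability down to about 23%.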

This can also be seen in the context of general cognitive styles – intuitive vs analytical (or thinking fast vs thinking slow, as in the title of Kahneman’s book). Intuitive thinking is our gut reaction; it is quick and relies heavily on social cues and pattern recognition. It is therefore efficient, but also error prone and subject to a host of cognitive biases.

No responses yet

Nov 03 2017

Consistency Bias

Published under Neuroscience

“Oceania was at war with Eurasia; therefore Oceania had always been at war with Eurasia.”

– George Orwell

In Orwell’s classic book, 1984, the totalitarian state controlled information and used that power to obsessively manage public perception. One perception they insisted upon was that the state was consistent – never changing its mind or contradicting itself. This insistence, in turn, is based on the premise that changing one’s mind is a sign of weakness – an admission of prior error or fault.

Unsurprisingly, a recent study shows that our perceptions of our own prior beliefs are biased in order to minimize apparent change. The exact reason for this bias was not part of the study.

Researchers surveyed subjects as to their beliefs regarding the effectiveness of corporal punishment for children. This topic was chosen based on the assumption that most subjects would have little knowledge of the actual literature and would not have strongly held beliefs. Subjects were then given articles to read making the case for the effectiveness or ineffectiveness of spanking (either consistent with or contrary to their prior beliefs), and then their beliefs were surveyed again.

9 responses so far

Jun 01 2017

Confirmation Bias vs Desirability Bias

Published under Neuroscience

The human brain is plagued with cognitive biases – flaws in how we process information that cause our conclusions to deviate from the most accurate description of reality possible with the available evidence. This should be obvious to anyone who interacts with other human beings, especially on hot topics such as politics or religion. You will see such biases at work if you peruse the comments on this blog, which is a rather tame corner of the social media world.

Of course most people assume that they are correct and everyone who disagrees with them is crazy, but that is just another bias.

The ruler of all cognitive biases, in my opinion, is confirmation bias. This is a tendency to notice, accept, and remember information which appears to support an existing belief and to ignore, distort, explain away, or forget information which seems to disconfirm an existing belief. This process works undetected in the background to create the powerful illusion that the facts support our beliefs.

If you are not aware of confirmation bias and do not take active steps to avoid it, it will have a dramatic effect on your perception of reality.

48 responses so far

May 04 2017

Free Speech Bias

Published under Logic/Philosophy

Free speech has been a hot issue recently, and probably always will be to some extent. This is likely because the stakes are high – free speech is a core liberty essential to any functional democracy. But in a society where you have to live with other people, liberty cannot be unlimited, because it will bump up against the liberty of others. So there needs to be some well-thought-out rules for how to resolve conflicts.

How a society balances the need for free speech with the need to protect people from defamation, fraud, oppression, and harassment says a lot about the character of that society. In the US we have constitutionally chosen to err on the side of free speech, and I think this is appropriate. The courts give people a wide berth for freedom of expression, understanding that the very speech that needs defending is speech that someone finds offensive.

At the same time, freedom from having your public speech repressed does not translate into a right to access to any venue at any time. The New York Times is not obligated to publish your 10-page manifesto.

The real purpose of this post, however, is not to delve into the nuances of free speech but to discuss how individual people decide on those nuances. This was illuminated by a recent study, the results of which I find entirely unsurprising. This is in line with the general findings of psychological studies.

919 responses so far

Feb 09 2017

The Super Bowl and Hindsight Bias

Published under Logic/Philosophy

Full disclosure – I have been a Patriots fan since the 1980s. I suffered through a couple of long decades of rooting for a mediocre team, including the worst (at the time) Super Bowl defeat, at the hands of the Bears. Then along came Belichick and Brady, and it has been a wild ride as a fan.

Super Bowl LI was perhaps the pinnacle – the Patriots came back from a 25-point deficit to tie the game and then win in sudden-death overtime. I feel genuinely bad for Falcons fans, but perhaps worse for those who stopped watching in the third quarter because they thought the game was over. Those who stayed to the end were rewarded with historically epic football.

(As an aside, I am a fan simply because it is fun to have a team to root for. Don’t read too much into it.)

What is interesting, from a critical thinking perspective, about the game is the way in which we construct narratives to explain random events, or at least events that have an element of randomness or “luck” involved. At half-time the Falcons were up 21-3 and the discussion among the commentators was all about how well the Falcons were playing and everything the Patriots were doing wrong. The Falcons had “momentum” and the Patriots had to figure out a way to steal this elusive “momentum” back.

55 responses so far

Jan 13 2017

Cognitive Biases in Health Care Decision Making

Published under Logic/Philosophy

This was an unexpectedly pleasant find in an unusual place. The Gerontological Society of America recently put out a free publication designed to educate patients about cognitive biases and heuristics and how they can adversely affect decision making about health care.

The publication is aimed at older health care consumers, but the information it contains is applicable to all people and situations. It is an excellent, well-written summary of common cognitive biases, with a thorough list of references. There are plenty of other resources that also review this material, including my own Teaching Company course, but this is a good, user-friendly reference.

What is most encouraging about this publication is the simple fact that it recognizes that this is an issue. It is taking knowledge of psychology and applying it to the real world, recognizing the specific need for critical thinking skills in the public. This could have easily been produced in many different contexts – not only any medical specialty, but investing your money, buying a home, choosing a college, or evaluating news reports.

The report is aimed simultaneously at health care providers and patients. It is primarily a guide for providers on communicating with older adults, accounting for cognitive biases in decision making, but at the same time it will help consumers communicate with their providers and make better decisions.

209 responses so far

Jun 01 2015

Citation Bias – Confirmation Bias for Scientists

Published under General Science

I’m a big fan of science for many reasons. Not only is the subject matter of science often incredibly interesting, but the process of science seems to work better than any other method humans have developed for knowing about the universe in which we live. Any fair-minded and knowledgeable view of human history cannot avoid this conclusion.

It’s therefore worthwhile thinking about and exploring the science of science itself, what we might call metascience. It is, in fact, a common narrative among skeptics and science communicators that, while science is awesome, it is practiced by biased and flawed humans. The history of science is one of error, bias, and ego that manages to slowly grind toward the truth.

Metascience is as important as metacognition, or thinking about thinking, and I write about both topics often. These are core knowledge bases for any critical-thinking skeptic. Here is a list I compiled of the most important issues with the quality of science. The goal here is not to criticize science, but to improve its practice, make it more efficient, minimize wasted resources, and help the public sift the reliable from the nonsense.

I’m now going to add another important concept to the list – citation bias.

3 responses so far

May 02 2014

Framing Bias

Published under Neuroscience

Let’s say you need a surgical procedure and the surgeon tells you there is a 98% survival rate with the procedure. How would you feel about that? What if she told you there was a 2% mortality rate? Would you feel the same way? Probably not, according to years of psychological research.

This is known as framing bias, just one more of the many ways in which our brains are biased in the way we evaluate information. The two scenarios above are identical, but statistically people will make different decisions based upon how the information is framed. We generally respond better to positively framed information (98% survival) than to negatively framed information (2% mortality).

The framing effect is often exploited by those who are deliberately trying to manipulate our reactions. Politicians, for example, can talk about employment rates or unemployment rates. Events can give you an early-bird discount or a late registration penalty. Products can have 4% fat or be 96% fat free.

Framing is another way in which we construct our picture of reality, by deciding what information is important.

8 responses so far
