Search Results for "bias"

May 27 2021

Red Flags of a Crank Study

The pandemic has brought into sharp focus the potential danger of misinformation. There are times when we need to act collectively as a society to accomplish certain goals. This is particularly challenging in a society that is organized around a principle of individualism – a principle I endorse and value. Liberty is a precious right to be jealously defended. But it is not the only right, or principle of value. So at times we have to delicately balance various competing interests. I like my freedom, but I also really like not catching a deadly disease, or spreading it to my family.

In a perfect world (one we definitely do not live in) there would be no need for restrictive or draconian measures. All that would be necessary would be distributing information – hey, if you want to protect yourself and others, wear a mask, socially distance, wash your hands, and get vaccinated. If you’re really interested, here are the facts, the published studies, the expert analysis, to back up these recommendations. Here are the error bars and level of uncertainty, the risk vs benefit analysis, and comparison to other options.

This approach is necessary, and works to a degree, but it is insufficient. There are two main shortcomings of the information approach. First, people are only semi-rational beings, not Vulcans. We are susceptible to tribalism, motivated reasoning, confirmation bias, and a host of cognitive biases, faulty heuristics, and logical fallacies. Our intuitions about balancing risk and benefit are also flawed, and we have a hard time dealing with very large numbers. Just peruse the comments to any blog post on this site that is even slightly controversial and you will find copious examples of every type of flawed thinking.

Continue Reading »

No responses yet

Apr 26 2021

Political Polarization is Exaggerated

Published under Skepticism

Humans are tribal by nature. We tend to sort ourselves into in-groups and out-groups, and then we engage in confirmation bias and motivated reasoning to believe mostly positive things about our in-group, and mostly negative things about the out-group. This is particularly dangerous in a species with advanced weaponry. But even short of mutual annihilation, these tendencies can make for a lot of problems, and frustrate our attempts at running a stable democracy.

Part of this psychological tribalism is that we tend to exaggerate what we assume are the negative feelings of the other group toward our own. Prior research has shown this, and a new study also demonstrates this exaggerated polarization and negativity. There are a few reasons for this. One is basic tribal psychology as outlined above. Other cognitive biases, like oversimplification and a desire for moral clarity, motivate us to craft cardboard strawmen out of our political opponents. We come to assume that their position is in bad faith, or simplistic nonsense, or both. We tend to ignore all nuance in our opponents’ position, and fail to consider both the justifiable reasons they may have for it and the commonality of our goals. Ironically this view is itself simplistic and may motivate us to act in bad faith, which fuels these same beliefs about us by the other side, creating a cycle of radicalization.

This process is helped along by the media, both traditional and social media. Social media tends to form echo chambers where our radicalized, simplistic view of the “other side” can become more extreme. Also, impersonal online interactions (just read the comments here) may allow us to engage with the cardboard fiction in our minds rather than the real person at the other end.

Traditional media contributes to this phenomenon by focusing on issues in conflict at the expense of issues where there is more consensus and commonality. The media likes conflict, and this gives everyone a distorted view of how much polarization there actually is.

Apr 15 2021

Paul Thacker Trolling Skeptics on Vaccines

Published under Skepticism

As the COVID vaccine rollout continues at a feverish pace, the occurrence of rare but serious blood clots associated with two adenovirus vaccines, AstraZeneca and Johnson & Johnson, is an important story, and should be covered with care and thoughtfulness. I have followed this story on Science Based Medicine here, here, and here. There is a lot of nuance to this issue, and it presents a clear dilemma. The ultimate goal is to optimally balance risk vs benefit while we are in the middle of a surging pandemic and while information is preliminary. This means we don’t panic, we consider all options, and we investigate thoroughly and transparently. There is a real debate to be had about how best to react to these rare cases, and as a science communicator I have tried to present the issues as reasonably as possible.

But we no longer live in an age where most people get most of their science news from edited science journalism. Most get their news online, from a range of sources, some good, some bad, some acting in bad faith or filtered through an intense ideological filter, and many just trolling. There are even “pseudojournalists” out there, reporting outside any kind of serious editorial review. One such pseudojournalist is Paul Thacker, who recently decided he had to criticize the reporting of “skeptics” on the COVID vaccine blood clotting issue.

For background, Thacker was fired from the journal Environmental Science & Technology for showing an anti-industry bias. Bias is a bad thing in journalism, the core principle of which is objectivity. I have no idea if Thacker honestly believes what he writes or if he can’t resist trolling, but it doesn’t really matter. He has espoused anti-GMO views to the point of harassing GMO scientists, leading Keith Kloor to call him a “sadistic troll”. Thacker has also promoted 5G conspiracy theories.

Apr 01 2021

EM Drive Failure

There are many times as a skeptic that I wish I were wrong. I really want to detect an alien artifact, and would love free energy, cold fusion, and a cure for cancer. I completely understand why these ideas have endless allure and the temptation to engage in a small bit of motivated reasoning to see the science from a particular, if odd, angle. But science does not progress this way. It progresses through the cold and heartless removal of error, by brutally smashing the pillars of our own vanity, fear, and desires, and by controlling for our own biases and shortcomings. I often refer to the peer-review process as a meat-grinder – it chews up and spits out ideas, but there is a product at the end – and that goes right back into the meat-grinder for another round.

One more really tempting idea now bites the dust – the EM Drive. I first wrote about this almost seven years ago. The idea is to create propellantless propulsion. This would revolutionize space travel, and could potentially even create that flying car we always wanted. Now, in the world of physics, in order to accelerate something there needs to be a force acting on it. If you want a rocket to go up, then you need to throw some mass from the rocket down, so that momentum is conserved – the momentum of the expelled mass is equal and opposite to the momentum gained by the rocket. So rockets need propellant, something to throw out their back. Ideally this is something very light that gets accelerated to really high speeds to produce the maximal thrust for the rocket.

While this concept works just fine, it is also extremely limiting, as captured by something known as the rocket equation. The rocket needs to carry enough fuel to accelerate the entire rocket, including all the fuel it is carrying. So it needs fuel to carry the fuel to carry the fuel… This means there is an exponential rather than linear relationship between speed and range and how big a rocket and its fuel load have to be. For many chemical rockets the fuel is the propellant; when you ignite it the fuel rapidly heats and expands and gets pushed out the exhaust. Other rocket designs may have a separate energy source and propellant. Ion drives, for example, use electric power to generate fields that accelerate charged particles to extreme velocities.
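That exponential relationship is the Tsiolkovsky rocket equation, Δv = vₑ ln(m₀/m_f), where m₀/m_f is the ratio of fueled mass to dry mass. A minimal sketch in Python, using round illustrative numbers for delta-v and exhaust velocity:

```python
import math

def mass_ratio(delta_v: float, exhaust_velocity: float) -> float:
    """Tsiolkovsky rocket equation: initial/final mass ratio needed
    for a given delta-v at a given exhaust velocity (same units)."""
    return math.exp(delta_v / exhaust_velocity)

# Chemical rocket (exhaust ~4.4 km/s) reaching low Earth orbit (~9.4 km/s):
chem = mass_ratio(9400, 4400)     # ~8.5 -- most of the rocket is fuel
# Doubling the delta-v squares the mass ratio -- exponential, not linear:
double = mass_ratio(18800, 4400)  # ~72
print(round(chem, 1), round(double, 1))
```

Squaring an already-large mass ratio for every doubling of delta-v is why chemical rockets hit a wall, and why propellantless thrust would be revolutionary.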

But what if you did not need propellant? What if all you needed was energy, and could somehow use that energy to create thrust without having to throw any matter out the back end? That would drastically alter the rocket equation. This would reduce the cost of space travel and open up the solar system. It might even make it practical to get to nearby stars – in a hundred years we might have a fusion powered ship that can zip around the galaxy at a constant 1G acceleration.

Feb 15 2021

We Are All Conspiracy Theorists

Published under Conspiracy Theories

I have often said, we all have a little conspiracy theorist inside of us. By this I mean that we all have some common psychological features that can lend themselves to believing in conspiracies. Some, of course, more than others. Going down a conspiracy rabbit hole is a tendency we may have to fight against. There has to be a point where we say to ourselves, wait a minute, can this actually be true? What is the evidence? Am I just fooling myself, giving in to my prejudices, or going along with my tribe? We all have a little skeptic inside of us as well, and at some point one wins over the other.

Conspiracy researcher Asbjørn Dyrendal, a professor in NTNU’s Department of Philosophy and Religious Studies, agrees. He has found that if you ask subjects about enough conspiracies, everyone eventually endorses belief in some conspiracy. But there is, of course, a matter of degrees. Dyrendal thinks everyone believes in a conspiracy “a little”. Not everyone believes in a so-called grand conspiracy, or has made one or more grand conspiracies the center of their beliefs.

There is a general tendency, however, to accept some beliefs not based upon rigid logic and evidence, but because it fits with our biases:

We are all more vulnerable to believing what we think is right, especially when our identity is at stake and emotions are strong. It can be a bit like the emotions associated with football.

By “football” he means soccer (for my American readers), but it doesn’t matter for the analogy. Any sports fan has experienced this – your team is better and more deserving. The other team is lucky, playing dirty, and the referees are unfairly calling things in their favor. It’s not absolute, but it is clearly a bias, and the more of a fan you are of your team, the more your identity and emotions are attached to their victory or loss, the more biased you are likely to be.

Feb 04 2021

Is Dunning-Kruger a Statistical Artifact?

Published under Neuroscience

The short answer to the headline question is – not really, but it’s complicated.

The Dunning-Kruger effect, which I have written about several times before, was first published in 1999 by its namesake psychologists. The basic effect is this – if you graph self-perception of knowledge in a specific domain against performance on an objective test of that knowledge, a typical relationship emerges. Specifically, the less people know, the more they overestimate their knowledge. They still rate themselves lower than people who know more, but the gap between perception and reality grows. Further, at the high end (the top quartile) people actually underestimate their relative knowledge, probably because they overestimate average knowledge in the public. And everyone thinks they are above average.

This effect is extremely robust and has been replicated many times in many contexts. As the authors have emphasized before – the DK effect is not about stupid people, it is about everybody. It is not about intelligence, but knowledge.

There is also a distinct effect, which some are calling a super-DK effect, in which, in specific knowledge areas like genetic engineering, the people who know the least think they know the most. This is not just about knowledge, but about misinformation. If people are actively misinformed they will have the illusion of knowledge.

The DK effect has been a cornerstone of science communication and our understanding of how to improve knowledge in the last two decades. However, a recent study calls the basic effect into question – The Dunning-Kruger effect is (mostly) a statistical artefact: Valid approaches to testing the hypothesis with individual differences data. The study essentially showed you can reproduce the DK graph using a randomly generated set of data. How is this possible?
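The short answer is regression to the mean: if self-estimates are completely uncorrelated with actual scores, then every actual-score quartile has an average self-estimate near the 50th percentile – so sorting by actual score makes the bottom quartile look overconfident and the top underconfident. A quick illustrative simulation of the general idea (not the paper’s actual analysis):

```python
import random

random.seed(0)
N = 10_000

# Purely random, uncorrelated data: each simulated subject gets an actual
# test percentile and a self-estimated percentile drawn independently.
actual = [random.uniform(0, 100) for _ in range(N)]
estimate = [random.uniform(0, 100) for _ in range(N)]

# Sort subjects into quartiles by ACTUAL score, as the classic DK plot does.
subjects = sorted(zip(actual, estimate))
quartiles = [subjects[i * N // 4:(i + 1) * N // 4] for i in range(4)]

for q, group in enumerate(quartiles, 1):
    mean_actual = sum(a for a, _ in group) / len(group)
    mean_est = sum(e for _, e in group) / len(group)
    print(f"Q{q}: actual ~{mean_actual:5.1f}, self-estimate ~{mean_est:5.1f}")
```

Each quartile’s average self-estimate hovers near 50 while the actual means run roughly 12.5, 37.5, 62.5, 87.5 – reproducing the familiar crossing graph from pure noise.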

Dec 11 2020

Skeptical of Skepticism regarding Medical Skepticism

Published under Skepticism

In a recent article in Medpage Today, Vinay Prasad offered his critiques of what he calls “medical skepticism”. Essentially he is talking about Science-Based Medicine and all my colleagues who engage in related activities. I am always open to criticism and love to engage about these topics. Unfortunately, Prasad’s criticisms were based largely on his ignorance of what it is, exactly, that we do, wrapped around some huge logical fallacies. They are also arguments we have dealt with on numerous occasions before, so he could have saved time by just reading some of the very literature he felt knowledgeable enough to criticize. (And as an aside, the “skeptical of skeptics” meme is way overdone and ready to be retired.)

If I had to give a paraphrasing executive summary of Prasad’s article it would be this – medical skeptics should stop focusing on what they think is important, and should instead focus on what I think is important, even though I don’t really understand what it is that they do. In fact there is so much wrong with Prasad’s article it’s hard to know where to begin, but let’s start with some basic framing. Part of what Prasad is criticizing is our science communication (scicom), but again he seems to be unaware that scicom is a field unto itself, and so he is making some basic false assumptions, without being aware that he is doing so. These false assumptions lead Prasad to conclude that medical experts should restrict themselves to the big problems within their area of medical expertise, without seeming to realize that scicom itself is an area of expertise.

Before I go further it is important to understand what we in the Science-Based Medicine and broader skeptical community do, and what our expertise actually is. First, we are science communicators, and this involves studying science communication itself. The big lesson of the last few decades, backed by actual research, is that the old “knowledge deficit” paradigm is mostly incorrect (not completely) and definitely insufficient. In most contexts you cannot change the way people think or behave by just giving them facts. You have to also engage with what they already believe and the complex motivations and patterns of belief that underlie it. Scicom involves, therefore, not just addressing scientific literacy but also critical thinking skills and media literacy. And in order to do this you need to understand the complex relationship between science and pseudoscience, and cognitive biases, conspiracy thinking, science-denial, and a host of other “critical thinking” skill sets.

Nov 30 2020

AI Doctor’s Assistant

I have discussed often before how advances in artificial intelligence (AI) are already transforming our world, but are likely to do so much more in the future (even near term). I am interested in one particular application that I think does not get enough attention – using AI to support clinical decision-making. So I was happy to read that one such project will share in a grant from the UK government.

The grant of £20m will be shared among 15 UK universities working on various AI projects, but one of those projects is developing an AI doctor’s assistant. They called this the Turing Fellowship, after Alan Turing, who was one of the pioneers of machine intelligence. As the BBC reports:

The doctor’s assistant, or clinical colleague, is a project being led by Professor Aldo Faisal, of Imperial College London. It would be able to recommend medical interventions such as prescribing drugs or changing doses in a way that is understandable to decision makers, such as doctors.

This could help them make the best final decision on a course of action for a patient. This technology will use “reinforcement learning”, a form of machine learning that trains AI to make decisions.

This is great to hear, and should be among the highest priority in terms of developing such AI applications. In fact, it’s a bit disappointing that similar systems are not already in widespread use. There are several types of machine learning. At its core, machine learning involves looking for patterns in large sets of data. If the algorithm is trained on labeled examples – told what to look for – then that is supervised learning. If not, then it is unsupervised. If it’s learning through trial, error, and feedback, that is reinforcement learning. And if it is using deep neural networks, then it is also deep learning. In this case they are focusing on reinforcement learning, so the AI will make decisions, be given feedback, and then iterate its decision-making algorithm with each piece of data.
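That decide–feedback–iterate loop can be sketched as a toy epsilon-greedy bandit. Everything below – the action names and their success rates – is invented for illustration; this is not the actual Imperial College system:

```python
import random

random.seed(1)

# A toy "dosing" decision with hidden success rates per option.
# These names and probabilities are made up for illustration only.
true_success = {"low_dose": 0.3, "medium_dose": 0.7, "high_dose": 0.5}
value = {action: 0.0 for action in true_success}  # learned estimates
count = {action: 0 for action in true_success}

for trial in range(5000):
    # Decide: mostly exploit the best-looking option, sometimes explore.
    if random.random() < 0.1:
        action = random.choice(list(true_success))
    else:
        action = max(value, key=value.get)
    # Feedback: did the intervention succeed this time?
    reward = 1.0 if random.random() < true_success[action] else 0.0
    # Iterate: nudge this action's estimate toward the observed outcome.
    count[action] += 1
    value[action] += (reward - value[action]) / count[action]

best = max(value, key=value.get)
print(best, round(value[best], 2))
```

After enough trials the learned estimates approach the true rates and the system settles on the genuinely best option – the same basic loop, scaled up enormously, that lets an RL system refine recommendations from clinical feedback.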

Nov 23 2020

Weaponizing Conspiracies

Published under Conspiracy Theories

In 2019 PopSci published a flow chart they called “How to Start a Conspiracy Theory.” It’s not really about conspiracy theories themselves, but rather how to popularize an extreme idea. Many extreme claims are conspiracy theories, or at least incorporate conspiracy thinking as a way to justify themselves, so there is a lot of overlap.

What the chart really reflects is how to use social media and other outlets to weaponize disinformation. Let’s take a look at what I think are the main features, and then we can see how they apply specifically to conspiracy theories. The process starts by coming up with an idea that “resonates” with the public. This is probably the hard part as there are lots of ideas out there, and it is difficult to just invent something that will go viral. This is more like winning the lottery than an engineered result. But essentially the flow chart reflects an iterative process by which you keep tweaking the idea until it takes off.

If your goal is to manufacture viral misinformation, there are a few ways to almost guarantee this will work. The first is to already be plugged into a major information outlet, like a news network, a political party, or a celebrity. This is no guarantee, but it magnifies the chances of success by orders of magnitude over just being a member of the general public. This can also work indirectly if you have the resources to push your idea through those outlets (such as lots of money, or the resources of a country).

You can also crowd-source the iterative process. This is essentially what happens when there is an existing information ecosystem surrounding an ideology. For example, anti-vaxxers are already well established enough to have their own social media ecosystem, and they can collectively iterate ideas in their internal incubator, and then push those that seem to work best. Extreme political ecosystems work the same way, pushing all kinds of crazy ideas internally with their loyal base, and then trying to export them to the mainstream media. Occasionally an idea will hit.

Nov 10 2020

Pre-Bunking Game

A new game called Harmony Square was released today. The game hires you, the player, as a Chief Disinformation Officer, and then walks you through a campaign to cause political chaos in this otherwise placid town. The game is based upon research showing that exposing people to the tactics of disinformation “inoculates” them against similar tactics in the real world. The study showed, among other things, that susceptibility to fake news headlines declined by 21% after playing the game. Here is the full abstract:

The spread of online misinformation poses serious challenges to societies worldwide. In a novel attempt to address this issue, we designed a psychological intervention in the form of an online browser game. In the game, players take on the role of a fake news producer and learn to master six documented techniques commonly used in the production of misinformation: polarisation, invoking emotions, spreading conspiracy theories, trolling people online, deflecting blame, and impersonating fake accounts. The game draws on an inoculation metaphor, where preemptively exposing, warning, and familiarising people with the strategies used in the production of fake news helps confer cognitive immunity when exposed to real misinformation. We conducted a large-scale evaluation of the game with N = 15,000 participants in a pre-post gameplay design. We provide initial evidence that people’s ability to spot and resist misinformation improves after gameplay, irrespective of education, age, political ideology, and cognitive style.

While encouraging, I think there are some caveats to the current incarnations of this approach. But first, let me say that I think the concept is solid. The best way to understand mechanisms of deception and manipulation is to learn how to do them yourself. This is similar to the old adage – you can’t con a con artist. I think “can’t” is a little strong, but the idea is that someone familiar with con artist techniques is more likely to spot them in others. Along similar lines, there is a strong tradition of skepticism among professional magicians. They know how to deceive, and will spot others using similar deceptive techniques. (The famous rivalry between James Randi and Uri Geller is a good example of this.)
