One of the (perhaps) good things to come out of the recent political climate in the US is a broader appreciation for the need to teach critical thinking skills. I hope we can capitalize on this new awareness to make some lasting changes to our culture.
For example, a recent NYT article is titled: “Why People Continue to Believe Objectively False Things,” and begins:
“Everyone is entitled to his own opinion, but not his own facts,” goes the saying — one that now seems like a relic of simpler times.
The article also discusses recent evidence showing that belief in the “birther” Obama conspiracy decreased after Trump admitted that Obama was born in Hawaii. Shortly after that admission 62% of people said they believed Obama was a US citizen, but a more recent poll shows the number has dropped to 57%. (Over that period fewer Republicans believed he was a US citizen, while more Democrats did.) The authors conclude that over time people forget specific information and revert to their old tribal beliefs.
A recent study looking at Twitter activity also reinforces the evidence that people generally follow their instincts rather than critical thinking. The researchers showed that people rate the believability of a tweet higher, and are more likely to share it, if it already has a high number of retweets. This creates a positive feedback loop in which retweets beget retweets, regardless of the inherent reliability of the information.
Continue Reading »
Aflatoxin is a serious food contaminant that causes both acute and chronic illness in animals and humans. It was first discovered in 1960 when 100,000 turkeys in the UK died over the course of a few months. Their deaths were tracked to a nut-based feed that was contaminated with a newly discovered toxin, named aflatoxin.
Aflatoxins are a group of 20 related toxins produced by fungi of the genus Aspergillus. According to Food Safety Watch:
Aflatoxins may be present in a wide range of food commodities, particularly cereals, oilseeds, spices and tree nuts. Maize, groundnuts (peanuts), pistachios, brazils, chillies, black pepper, dried fruit and figs are all known to be high risk foods for aflatoxin contamination, but the toxins have also been detected in many other commodities. Milk, cheese and other dairy products are at risk of contamination by aflatoxin M. The highest levels are usually found in commodities from warmer regions of the world where there is a great deal of climatic variation.
Corn is perhaps the biggest source of aflatoxin contamination. It is estimated that 16 million tons of corn are disposed of each year due to aflatoxin contamination. The toxin is highly stable and can survive most types of food processing.
Severe acute toxicity can result in death. Chronic toxicity is difficult to detect; its most common effects are liver damage and an increased risk of liver cancer.
Many techniques are used to minimize contamination, but even with these methods aflatoxin is a huge source of food waste and an important cause of human illness, especially in developing countries.
Continue Reading »
In 2008 Thaler and Sunstein published their book, Nudge, advocating a more nuanced approach to changing public behavior. Since then nudge theory has been quite popular, but it hasn’t created the revolution optimists had hoped for.
Here is the core problem: people do not always act in their own best interest. Sometimes this affects only them, but often the negative impacts affect the people around them, their family, and even society as a whole. An obvious example is vaccinations.
There are many less-obvious examples, however. Poor health care decisions increase the cost of health care, which is a rapidly increasing burden on society. Poor financial decisions can leave people in debt, might cause them to default on those debts, and have an overall negative impact on the economy. We all share risks through insurance premiums and public costs.
And we actually care about people. We are a social species, and we generally do have empathy for others (unless they have been psychologically relegated to an out-group). It is also some people’s job to care about people.
Therefore, for various reasons, there are individuals and groups who care about changing other people’s behavior for their own good and for the good of society. This paternalism runs up against several obstacles.
Continue Reading »
Tom Nichols’ book, “The Death of Expertise: The Campaign Against Established Knowledge and Why it Matters,” is currently on the Amazon bestsellers list. The book discusses a topic I have delved into many times here – what are the current general attitudes of the public toward experts and expertise, and how did we get here?
He mentions various aspects to this war against experts:
“The United States is now a country obsessed with the worship of its own ignorance. Many citizens today are proud of not knowing things. Americans have reached a point where ignorance, especially of anything related to public policy, is an actual virtue.”
The culture and our educational system have created a generation that has little experience being told they are objectively wrong. Everyone feels they are entitled to be right. Combine this with the illusion of knowledge provided by Google, and everyone thinks they are their own expert in anything.
Interestingly, as Nichols also points out, people are arbitrarily selective about which experts they respect. Sports is a great example. No one seriously thinks they could play in the NFL, and no one begrudges recognizing that NFL players are the product of natural talent combined with years of developing physical ability and specific skills.
Continue Reading »
It appears that Google has removed all Natural News content from their indexing. This means that Natural News pages will not appear in organic Google searches.
This is big news for skeptics, but it is also complicated and sure to spark vigorous discussion.
For those who may not know, Mike Adams, who runs Natural News, is a crank conspiracy theorist supreme. He hawks snake oil on his site that he markets partly by spreading the worst medical misinformation on the net. He also routinely personally attacks his critics. He has launched a smear-campaign against my colleague, David Gorski, for example.
A few years ago Adams put up a post listing people who publicly defend the science of GMOs, comparing them to Nazis and arguing that it would be ethical (even a moral obligation) to kill them. He essentially made a kill-list for his conspiracy-addled followers. My name was on that list, as were those of other journalists and science communicators.
In short, Adams is a dangerous loon who spreads misinformation and harmful conspiracy theories in order to sell snake oil, and who will smear and threaten those who call him out. He is an active menace to the health of the public.
Adams is a good example of the dark underbelly of social media. It makes it possible to build a massive empire out of click-bait and sensationalism.
Continue Reading »
H. Sterling Burnett, writing for the Heartland Institute blog, wrote a revealing post titled: Energy Restrictions, Not Climate Change, Put Civilizations at Risk. In my opinion it is a classic example of misleading propaganda, worthy of deconstruction as a case study.
What Is Propaganda?
I always endeavor to be as clear, thorough, and fair in my writing as possible. I am not saying I always succeed, but that is my goal. I have been influenced by my scientific background, where clarity and accuracy rise to the level of obsession in the technical literature. It’s not possible to achieve that level in a non-technical blog, but it is a good ideal.
Propaganda is the opposite of clear, thorough, and fair. The purpose of propaganda is to persuade the reader to an ideological or political opinion, or to impugn or cast doubt on other people or other ideas. Being persuasive in and of itself does not make communication propaganda. To rise to that level there has to be a willful distortion of facts, a selective use of arguments and information, and the marshaling of any points that suit your ends, regardless of how fair they are.
Propaganda, like pseudoscience, exists on a spectrum. This further means that there is a demarcation problem – there isn’t going to be a bright line beyond which communication is clearly propaganda.
Burnett’s article shows multiple dramatic examples of what constitutes propaganda, and so should serve as an instructive example. This is not surprising since The Heartland Institute is an ideological think tank. They are not a scientific organization.
Continue Reading »
It’s official – 2016 was the warmest year on record, a record that extends back to 1880 when global temperature tracking began. This is the third year in a row to set the record: 2014 and 2015 were also the warmest years on record at the time. In fact, the 15 warmest years on record have all occurred since 1998. The last time a year set a record for cold was 1911.
That the Earth is warming is now undeniable, but that does not stop people engaged in motivated reasoning from denying it. The graph to the right shows temperature variance from average since 1880. It is visually very compelling.
Here is how motivated reasoning works, however. Someone without ideological skin in the game will fairly assess all the data, acknowledge uncertainty and complexity, and arrive at the fairest conclusion. Motivated reasoning instead exploits that uncertainty and complexity to deny a reality that conflicts with ideology and thereby causes cognitive dissonance.
As you might imagine, there is a lot of complexity in determining average global temperatures. It’s not as if the Earth has a magical thermostat. Temperature can be measured in various places: at the surface, at high altitude, and in the ocean. You can also use various methods, including ground stations and satellites. Further, you have to correct for any potential sources of artifact. For example, there is the urban heat-island effect: as cities grow they generate more heat, and a measuring station near a city will pick up that heat.
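One common way to control for a constant station bias like this is to average temperature *anomalies* (each station's departure from its own historical baseline) rather than raw readings. A minimal sketch, with invented numbers and hypothetical station names:

```python
# Two hypothetical stations; the "urban" one reads 3 C warm because of
# its heat island, but both have genuinely warmed by 1.0 C.
baseline = {"rural": 10.0, "urban": 13.0}       # each station's long-term mean
readings_2016 = {"rural": 11.0, "urban": 14.0}  # this year's readings

# Averaging raw temperatures mixes in the urban station's constant warm
# bias; averaging each station's anomaly cancels any constant offset.
raw_mean = sum(readings_2016.values()) / 2
anomalies = {s: readings_2016[s] - baseline[s] for s in baseline}
anomaly_mean = sum(anomalies.values()) / 2

print(raw_mean)      # biased absolute level (12.5 C)
print(anomaly_mean)  # the actual warming signal (1.0 C)
```

Note that this only cancels a *constant* offset; a heat island that grows over time is exactly the kind of artifact that still has to be detected and corrected, for example by comparing urban stations against rural neighbors.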
Continue Reading »
As we collectively try to climb out of the smouldering rubble that was “the truth” in 2016 (by which I mean basic intellectual integrity), many people are speculating and trying to wrap their brain around what exactly is happening. Of course, the arbitrary transition to a new calendar year changes nothing. We are still living in the same world that produced the 2016 election.
Many writers have characterized what happened as the “weaponization” of bullshit or misinformation. This is not entirely new, but it did seem to reach new heights, or to cross over some fuzzy threshold to a new level of prominence. The “weaponized” meme is also mainstream; Donna Brazile, for example, is saying that the hacked DNC e-mails were “weaponized” against them.
The two other similar memes that emerged this past year were “post-truth” and “fake news.” These were added to older notions of “echo chambers” and the fact that many people are living in information bubbles (whether they know it or not).
I think all of these concepts are essentially correct. We are in the midst of a misinformation war (actually many wars on many fronts). Unfortunately, it seems that the side which includes the mainstream media, the experts that provide them with information and analysis, professional journalists and academics, is losing. They are losing primarily because they have not yet adapted to the new battleground – social media. They are like the British fighting in neat rows with their visible red uniforms, while the rebels fire at them concealed behind trees and stone walls.
Continue Reading »
A recent neuroscientific study looked at what happens in the brains of subjects when their beliefs were challenged. The study adds a new bit of evidence to our understanding of motivated reasoning.
Before we get to the details of the study, let’s review what we mean by motivated reasoning. Psychological studies have shown that people treat different beliefs differently. Specifically, there is one set of beliefs that are core to a person’s identity and to which they have an emotional attachment. We treat such beliefs differently than all other beliefs.
For most beliefs people actually are quite rational at baseline. We tend to follow a Bayesian approach, meaning that we update our beliefs as new information comes to our attention. If we are told that some historical fact is different than what we remember, we will quickly change our belief about that fact. Further, the more information we have about something and the more solid our belief is, the more slowly we will change that belief. We don’t simply swap one belief for the next; we incorporate the new information into our old information.
This is actually a very scientific approach. I would not easily change my belief that the sun is at the center of our solar system. It would take a profound amount of very reliable information to counter all the solid scientific information on which my current belief is based. If, however, a reliable source told me something about George Washington I had never heard before, I would accept it much more quickly. This is reasonable, and this is how most people function day-to-day.
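The Bayesian updating described above can be sketched with a single application of Bayes’ theorem. All of the probabilities here are invented for illustration:

```python
def update(prior, likelihood_true, likelihood_false):
    """One Bayesian update: P(H | E) from P(H), P(E | H), P(E | not H)."""
    numer = prior * likelihood_true
    return numer / (numer + (1 - prior) * likelihood_false)

# A casually held historical belief (prior 50/50): one reliable report
# (9x more likely if the claim is true) is enough to largely settle it.
weak = update(prior=0.5, likelihood_true=0.9, likelihood_false=0.1)

# A belief backed by mountains of evidence (prior ~0.999999), facing a
# single contrary report of the same reliability: barely dented.
strong = update(prior=0.999999, likelihood_true=0.1, likelihood_false=0.9)

print(weak)    # the weak belief jumps to high confidence
print(strong)  # the well-supported belief remains near certainty
```

The asymmetry is the point: the same piece of evidence moves a weakly supported belief a lot and a heavily supported belief almost not at all, which is exactly the rational behavior described above.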
Continue Reading »
Because I am an activist skeptic I am often asked specific questions about how to be a better skeptic. This is obviously a complex question, and I view skepticism (like all knowledge) as a journey not a destination. I am still trying to work out how to be a better skeptic.
One recent question, however, took a great approach to the issue of practical skepticism – what questions should a skeptic ask themselves when confronted with a news item? Here is my process:
1 – How plausible is the claim?
This is admittedly a tricky question that requires a lot of judgment. The risk is that you will judge any claim that aligns with your existing beliefs as plausible and anything that contradicts them as implausible. This is not as bad as it sounds, however, if your current beliefs are based on logic and evidence. To the extent that your beliefs (by which I mean the model of reality that you construct in your head) are based on ideology and subjective perspective, the notion of plausibility can be self-fulfilling.
I say “can be” because it does not have to be. This is partly because this first question regarding plausibility is the first question, not the only question. You should not reject implausible claims out of hand. The purpose of evaluating plausibility is to determine the appropriate bar of evidence needed to accept the claim. This is essentially, “Extraordinary claims require extraordinary evidence.”
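The maxim has a natural Bayesian reading: posterior odds equal prior odds times the strength of the evidence (the Bayes factor). A small sketch, with invented priors, of how much stronger the evidence must be for an implausible claim to clear the same bar:

```python
def required_bayes_factor(prior_prob, target_prob=0.5):
    """How strong must evidence be (as a Bayes factor) to raise a claim
    from its prior probability to even odds? Illustrative numbers only."""
    prior_odds = prior_prob / (1 - prior_prob)
    target_odds = target_prob / (1 - target_prob)
    return target_odds / prior_odds

# A mundane claim (prior 1 in 3) needs only a modest Bayes factor...
mundane = required_bayes_factor(1 / 3)

# ...while an extraordinary claim (prior 1 in a million) needs evidence
# roughly half a million times stronger to reach the same bar.
extraordinary = required_bayes_factor(1e-6)

print(mundane, extraordinary)
```

In other words, evaluating plausibility is just setting the prior; it determines how much evidence is required, not whether the claim gets a hearing at all.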
Continue Reading »