Dec 20 2018

The Really Worst Pseudoscience of 2018

This is a continuation of my previous post, but I am not going to simply add to the list. Rather, I am going to discuss how the general phenomenon of pseudoscience has continued to evolve in 2018. There were certainly many candidates for specific pseudosciences I have not yet covered on this list – the raw water nonsense, flat-earthers, anti-GMO propaganda, more alternative medicine and free energy claims, and a continuation of all the pseudosciences from previous years.

It is important to address specific claims, drilling down to individual facts and arguments, but it is also important to step back and look at the cultural and institutional patterns behind those specific claims.

The real story over the last few years is that of fake news. This is actually a multi-headed monster, with completely fake news stories, biased and distorted news, and real news dismissed as fake. What these variations all have in common is the blurring of the lines between valid and invalid, legitimate and fake, fact and opinion, skepticism and denial, and expertise vs elitism.

Distinguishing real from fake has always been a challenge, and there is also the demarcation problem – there is often a fuzzy line between the two, not a clear bright line. Also, experts make mistakes, the consensus of opinion is sometimes wrong, there is bias and fraud in science, corporations often put their thumb on the scale – and people, in general, are flawed, so their institutions are also flawed. For these and other reasons, most of the things you think you know are wrong, or at least incomplete, distorted, misleading, or flawed.

The optimal approach to figuring out what is likely to be true in a complex world full of flawed people spreading misinformation, therefore, requires time and nuance. But nuance takes work, and it is not as satisfying as a simple, clear narrative that is emotionally appealing. This is the line that skeptics walk. If you care about what is really true, and not just supporting your tribe and stroking your emotions, then you have to be content with partial, tentative conclusions. You have to be willing to change your mind when evidence and arguments suggest you should. You need to see all sides of an issue, and give the devil his due.

Nuance means everything is grey. Science generally works, but with lots of mistakes and occasional fraud along the way. Government regulations can be effective, but are subject to political manipulation, and often involve difficult trade-offs. Conspiracies happen, but belief in conspiracies can take on a life of its own and trap someone into an enclosed and insulated belief-system. Everything requires context, caveats, error bars, and constant re-examination.

Going against this nuanced approach is much of human nature. This is too complex a topic to adequately summarize here, but suffice it to say I have written literally thousands of articles here on the topic. You can also (shameless plug) read my book, which was released this year – The Skeptics' Guide to the Universe – which is a thorough yet breezy review of all this.

Civilization is partly a struggle against human nature – to compensate for the worst aspects of it. We partly do this by replacing the rule of people with the rule of law, and similarly by replacing belief with process, and then institutionalizing that process. Science is a process, philosophy is a process, and academic scholarship is a process. Generic virtues of scholarship compensate for the failings of human thought and memory. They involve standards, like considering all information, recording data objectively, quantifying where possible, peer-review, replication, examining logic for soundness, and letting ideas compete in an open marketplace, allowing the best ideas to flourish.

What I think has been happening recently is that the institutions we have built to carry out the processes we have developed to separate reality from fiction have been taking a hit on many fronts. The most obvious, perhaps, and the one that many people point to, is the disruptive technology of social media. The internet, the web, and social media have allowed information to be communicated faster and more broadly than ever before, with little cost in resources. This is not the first time this has happened – others have pointed out that the same concerns emerged when the printing press was invented.

Traditional means of mass communication were built with at least some editorial process in place, used to impose at least some quality control. Again – no one will say this was perfect, or without major trade-offs. But it was a process. Social media effectively bypasses the editorial process, and therefore all quality control.

Further, social media quickly removed the traditional cues people relied on to tell if something was likely to be legitimate. Anyone could get a slick website with little investment, and look just as legit as any institution. You didn’t need a brick-and-mortar infrastructure, or years of experience, or the respect of your peers. Anything could be faked or duplicated.

We seem to be going through a process of figuring out how to have some quality control on such an open platform, but the problem is any system of quality control can be gamed or faked itself. It’s just too easy to do it in electrons.

This also cuts both ways – not just boosting fake news, propaganda, marketing, or opinions as if they were genuine scholarship, but also dismissing genuine scholarship as if it were fake. This is what the “fake news” phenomenon is really about. Once those interested in quality control started to examine and expose fake information online, the superficial aspects of their methods and reporting were duplicated to label real news as fake. The goal of this is to muddy the waters beyond recognition.

Part of this effort is to directly attack anyone who is trying to provide scholarly examination. Snopes, for example, does a good job of investigating online claims and giving a quick summary of whether a claim is true or false. So, of course, they are attacked by the purveyors of fake information who don’t like being exposed. (I am not saying they don’t make mistakes, but they are transparent, fix errors, and do a generally good job.)

Beyond attacking individuals and institutions, the charlatans and ideologues have been attacking the very idea of expertise itself. You may have facts, but I have alternative facts. Your experts say X, but my experts say Y. Experts are all pointy-headed elitists who don’t care about regular people anyway. It’s all a giant conspiracy. Anyone who disagrees with me is a shill. Science I don’t like is dismissed as bought and paid for.

This whole process has also been driven by extremism and partisanship. Scholarship does tend to filter out the most extreme views, but without these filters the extremes can thrive. They may do even better than moderate views, precisely because they are less nuanced. They are simple and appealing. They also feed on themselves – they tend to radicalize people to make them more and more receptive to more and more extreme ideas.

Social media has played into this effect by using algorithms that encourage echo chambers. If you watch videos with a certain political point of view, you will be fed videos with more and more extreme views in that direction. Echo chambers can also be deliberate, catering to particular points of view. If you are an anti-vaxxer, you can live online without ever being challenged by someone who rejects your pseudoscience. If they do manage to infiltrate your safe spaces, they are almost immediately banned as trolls. Again – there are real trolls, but the behavior is somewhat context-dependent and subjective.

Similarly, there are real astroturf campaigns, in which corporations or even governments fake grassroots movements through social media. But – legitimate grassroots campaigns have also been unfairly accused of being astroturf by their ideological opponents. Every legitimate tool of quality control is almost instantly perverted to oppose quality control.

I don’t know where this will all land. Optimally, we would have an information landscape that has a good balance of quality control with the free exchange of ideas, all with transparency and fairness. Right now we have been shifted through disruptive technology toward the extreme open end of the spectrum, at the expense of quality control. This system does not work, partly because we live in a world with psychopaths, frauds, charlatans, ideologues, and snake-oil peddlers who will exploit the openness to harass and abuse everyone else – while making tons of money in the process.

Openness is good, but there also needs to be a valid mechanism by which the average person can have some sense of when information is reliable and legitimate. Life is now way too complex for everyone to be just on their own. There aren’t enough hours in the day to evaluate every single piece of information that you need to make important decisions for your life. We need reliable short-cuts, such as respected experts who can summarize complex information.

Perhaps we just need the traditional institutions of scholarship, journalism, and expertise to exert themselves in this new medium. We will see. The only thing I am confident of now is that the situation is likely to change, and probably in ways no one currently expects.

Meanwhile, we increasingly have to figure things out for ourselves. That is why critical thinking skills and basic scientific literacy are more important than ever.
