Aug 03 2020

Do Your Own Research?

A recent commentary on Forbes advises: You Must Not ‘Do Your Own Research’ When It Comes To Science. I agree with everything the author, Ethan Siegel, says in the piece. It was a good start – but it did not go far enough. For example, he did not really reach any conclusion about what people should actually do, beyond “listen to the experts.” OK – how, exactly, do we do that? This is not a criticism (I have written similar articles before) but an observation: after trying to communicate these same skeptical themes for decades and getting thousands of questions from the public, I have realized that it is perhaps not so obvious what it means to listen to the experts.

First let me amplify what Siegel gets right, although I may reframe it a bit. He correctly describes the typical process that people use when evaluating new information, although he does not name it – confirmation bias. His summary is as good as any:

  • formulating an initial opinion the first time we hear about something,
  • evaluating everything we encounter after that through that lens of our gut instinct,
  • finding reasons to think positively about the portions of the narrative that support or justify our initial opinion,
  • and finding reasons to discount or otherwise dismiss the portions that detract from it.

There is more nuance to this process that I have discussed here previously. For example, there seems to be a significant difference in how we approach claims depending on our emotional connection to them. The less our emotional stake, the more rational we are. For those claims that cut to the heart of our beliefs and identity, there may even be a backfire effect when encountering information that violates our preferred narrative. We may also be operating under several narratives simultaneously with respect to a single claim or issue. Further, in addition to confirmation bias (confirming what we already believe) there is preference bias – easily switching to what we want to believe even if we have previously grudgingly accepted a different reality.

And of course there are many biases and cognitive flaws out there other than confirmation bias that guide our beliefs and how we approach new claims and information and perceive and remember reality.

But there is also another layer here that Siegel did not explore. (Again, this is not criticism, one article can only be so long and has to focus. This is really a book-length topic – I know because I wrote a book about it.) Not only are we biased by our preferred narratives, these narratives have been weaponized by others for political, ideological, and monetary purposes. We are not just freely navigating information and deciding where to go, we are often being led by the hand down a specific path, even if we don’t realize it. That path may emerge from the collective marketing of an industry, or the clever manipulations of a single con-artist or cult leader. Political groups also collectively create, cultivate, and spread their narratives, both passively and actively. Social media algorithms curate information for us, based on criteria that may not be serving our best interests, or those of society.

In fact it’s hard to know what force – internal confirmation bias or externally imposed narratives – is a greater influence. They do work together synergistically. But sometimes the external forces predominate – people “go down rabbit holes” on the internet and become radicalized, conspiracy theorists, anti-vaxxers, flat-earthers, or just profoundly confused.

It also seems like there is a meta-narrative at work, running through many specific narratives – this is the anti-truth movement. A number of forces have found, whether consciously or not, that their job of pushing their specific narrative would be easier if those pesky scientists, journalists, and experts were not in the way. So they have created their own experts, their own institutions, and tried very hard to level the playing field. They have watered down the very notion of expertise, railed against the notion of scientific consensus, used doubt not as a tool of exploration but as a weapon to bludgeon the public into a stupor, and raised confirmation bias to a strategy. And it’s working.

Health fraud is now “alternative medicine.” Embarrassing facts are now “fake news.” Facts are now just opinions. Quality standards are only a barrier to freedom, or worse, a mechanism of oppression. And of course – everything is a conspiracy.

This, unfortunately, is the world in which Siegel wants us to simply listen to the experts. It is now a legitimate question for a non-expert member of the public to ask – which experts? The article tells readers not to do their own “research” – but then what do they do? He uses as an example the fluoridation of public water systems. He is correct that fluoridation is an evidence-based, safe, and effective public health measure. But he does not say how someone in the public is supposed to just know that, or even how, when they look up the question, they will know which of the many answers to believe. You have to know how to properly search on the question. You need to know how to evaluate resources. Which sites are reliable, which aren’t, which ones represent a true consensus, and how solid that consensus is.

On some scientific questions, there is no real consensus, or perhaps there is somewhat of a consensus, but there are legitimate schools of thought or valid minority opinions. How do we know, without the prior expertise ourselves? There are groups out there pretending to be objective groups of scientists, but spouting ideologically tinged information, or even rank pseudoscience. How is someone without adequate scientific literacy to know, for example, that the Discovery Institute is a total sham? Is Dr. Oz a legitimate expert? He’s a real doctor, by all accounts a respected surgeon. Yeah, now he is a TV hack, but seeing that presumes background knowledge many in the public do not have.

We can look for seals of approval, but these are faked also. We can rely on the government to vet experts and institutions, and this does work in certain contexts. But the government also is beholden to politics, and when the streams are crossed you get ideological science.

There is no easy answer to this question. I have taken a stab at it before, and here it is again:

There is no substitute, of course, for a public that is scientifically literate, understands the basics of critical thinking, and is media savvy. Knowing how to find reliable information, vet the source, and tell the difference between legitimate outlets and slanted or even fake outlets is now absolutely essential.

Specifically, finding and understanding what the consensus of genuine scientific opinion is, is now a critical skill. It is often not easy. I do this all the time – search for a definitive and authoritative consensus on a specific question. It can take a lot of time and effort, and sifting through a lot of less-than-authoritative information.

Also – what do we call this process? People colloquially call this – doing research. This is problematic, but it may be too late to find another term that has a more accurate connotation. Even scientists call this research – literature research or background research. So in essence Siegel is saying – don’t do your own research, instead research what the consensus of scientific opinion is. This is a little problematic for science communication.

I don’t really have a better term, however. Often we talk about doing a “deep dive” on an issue, but that is not always appropriate as it implies extensive “research”. So, I am open to suggestions. It also may not be that important in the grand scheme of things. I would rather people just understand the basics as I outlined them above.

