Dec 03 2015

Detecting BS

A new study is getting a lot of attention, partly because of its provocative title: On the reception and detection of pseudo-profound bullshit. It also seems that people generally like to hear stories about how dumb other people are. That is why I often emphasize that such studies are not about “other” people, they are about people.

In this case, however, the researchers do find that there are subsets of subjects who react differently to what they call “pseudo-profound bullshit.” They write:

Here we focus on pseudo-profound bullshit, which consists of seemingly impressive assertions that are presented as true and meaningful but are actually vacuous. We presented participants with bullshit statements consisting of buzzwords randomly organized into statements with syntactic structure but no discernible meaning (e.g., “Wholeness quiets infinite phenomena”).

They further describe such statements:

This sort of phenomenon is similar to what Buekens and Boudry (2015) referred to as obscurantism (p. 1): “[when] the speaker… [sets] up a game of verbal smoke and mirrors to suggest depth and insight where none exists.”

As source material for their studies the authors used a site designed to generate random but syntactic phrases that sound like something Deepak Chopra would say. They also used a new age bullshit generator. Finally, they used actual tweets from Deepak Chopra.

The new age generator is particularly good. Here is the first random phrase I generated:

To traverse the journey is to become one with it. Today, science tells us that the essence of nature is choice.
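The recipe the study describes — buzzwords slotted at random into a grammatical template — can be sketched in a few lines. The word lists and the template below are my own illustrative guesses, not the ones the actual generators use:

```python
import random

# Toy pseudo-profound phrase generator: grammatical structure, no meaning.
# The vocabulary here is invented for illustration.
NOUNS = ["wholeness", "awareness", "potentiality", "stillness", "abundance"]
VERBS = ["quiets", "transforms", "unfolds into", "transcends", "is inherent in"]
ADJECTIVES = ["infinite", "hidden", "subtle", "boundless", "timeless"]

def pseudo_profound(rng=random):
    """Return a sentence of the form 'Noun verb adjective noun.' —
    syntactically valid, semantically empty."""
    noun1, noun2 = rng.sample(NOUNS, 2)  # two distinct nouns
    return f"{noun1.capitalize()} {rng.choice(VERBS)} {rng.choice(ADJECTIVES)} {noun2}."

print(pseudo_profound())  # e.g. "Wholeness quiets infinite stillness."
```

The point of the template is that the output passes a casual grammar check, which is exactly what lets readers project meaning onto it.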

Subjects were asked to rate each statement on a profoundness scale from 1 to 5. In the first study, using the two random phrase generators, the average rating was 2.6. A subset, 27%, gave ratings above 3, which is “fairly profound.” A smaller subset, 18%, gave ratings below 2. In general the participants failed to recognize the statements as vacuous, but there were subsets of skeptics and believers.

In a second study they used the same protocol but with tweets from Deepak Chopra. Participants rated Chopra’s statements as slightly more profound than the random statements (2.7 vs 2.4), but the overall pattern of responses was very similar.

As a control the researchers also studied mundane statements (“infants require constant attention”) and motivational statements (“a wet person does not fear the rain”). In general the study subjects rated mundane statements low, mostly 1, with a small group of outliers who apparently think everything is profound. Subjects tended to rate legitimate motivational statements as more profound.

Finally, the researchers rated participants by their responses to the random phrases and Chopra tweets with a “bullshit receptivity scale,” or BSR. They found that a high BSR correlated with belief in the paranormal, belief in conspiracy theories, religious belief, faith in intuition, ontological confusion (not understanding the nature of knowledge), and low scores for analytical thinking, verbal intelligence, and numeracy. They conclude:

Those more receptive to bullshit are less reflective, lower in cognitive ability (i.e., verbal and fluid intelligence, numeracy), are more prone to ontological confusions and conspiratorial ideation, are more likely to hold religious and paranormal beliefs, and are more likely to endorse complementary and alternative medicine. Finally, we introduced a measure of pseudo-profound bullshit sensitivity by computing a difference score between profundity ratings for pseudo-profound bullshit and legitimately meaningful motivational quotations. This measure was related to analytic cognitive style and paranormal skepticism.
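The two measures described above reduce to simple arithmetic over per-item ratings. A minimal sketch, assuming ratings on the study’s 1–5 profundity scale; the function and variable names are mine, and I take the difference so that a higher score means better discrimination (the paper may define the sign the other way):

```python
def bsr(bullshit_ratings):
    """Bullshit receptivity (BSR): mean profundity rating a participant
    gave to the pseudo-profound items."""
    return sum(bullshit_ratings) / len(bullshit_ratings)

def bs_sensitivity(motivational_ratings, bullshit_ratings):
    """Difference score between ratings for genuinely meaningful
    motivational quotes and for pseudo-profound bullshit."""
    mean = lambda xs: sum(xs) / len(xs)
    return mean(motivational_ratings) - mean(bullshit_ratings)

# A skeptical rater: motivational quotes rated ~4, random buzzword
# statements rated ~1-2.
print(bsr([1, 2, 1, 2]))                           # 1.5
print(bs_sensitivity([4, 4, 5, 3], [1, 2, 1, 2]))  # 2.5
```

On this sketch, the “believer” pattern the post describes would show up as a high BSR and a sensitivity score near zero: profound-sounding emptiness and genuine insight rated about the same.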

I try to be especially careful when a study seems to support what I already believe, in this case that there are subsets of people whose habits of thought tend toward intuitive thinking and belief in magic, with another subset (skeptics) who tend to have more analytical thinking and are sensitive to detecting bullshit, with most people being somewhere in between. This is just one study, but consistent with research generally in this area.

There are some important caveats. It is possible that some people tend to rate vacuous statements as profound because they do not understand the statements but don’t want to appear stupid. This is the “emperor’s new clothes” phenomenon. They assume their lack of understanding means the statement must have hidden meaning, perhaps because they lack the confidence to base a critical judgement on their own inability to make sense of a statement.

Vacuous statements may also act as a type of Rorschach test – people project their own beliefs onto the statements. This follows an intuitive style of thinking. It is a similar phenomenon to thinking that astrological readings accurately describe oneself, or finding accuracy in a psychic’s cold reading. Rather than critically dissecting what the statement is actually saying (or not saying), they fill the empty vessel with their own “wisdom.”


I will let the authors have the final word:

Chopra is, of course, just one example among many. Using vagueness or ambiguity to mask a lack of meaningfulness is surely common in political rhetoric, marketing, and even academia (Sokal, 2008). Indeed, as intimated by Frankfurt (2005), bullshitting is something that we likely all engage in to some degree (p. 1): “One of the most salient features of our culture is that there is so much bullshit. Everyone knows this. Each of us contributes his share.” One benefit of gaining a better understanding of how we reject other’s bullshit is that it may teach us to be more cognizant of our own bullshit.
