On skepticism, pseudo-profundity, Deepak Chopra, and bullshit

Of all the slick woo peddlers out there, one of the most famous (and most annoying) is Deepak Chopra. Indeed, he first attracted a bit of not-so-Respectful Insolence a mere 10 months after this blog started, when Chopra produced the first of many rants against nasty “skeptics” like me that I’ve deconstructed over the years. Eventually, the nonsensical nature of his pseudo-profound blatherings inspired me to coin a term to describe it: Choprawoo. Unfortunately, far too many people find Deepak Chopra’s combination of mystical-sounding pseudo-profundity, invocations of “cosmic consciousness,” rejection of genetic determinism, and advocacy of “integrating” all manner of quackery into real medicine (a.k.a. “integrative medicine,” formerly “complementary and alternative medicine,” or CAM) compelling, so much so that he has persuaded actual legitimate medical school faculty to assist him with an actual clinical trial. He is, alas, one of the most influential woo peddlers out there. Worse, he was once a legitimate MD; now he’s a quack. Indeed, as I’ve described before, of all the quacks and cranks and purveyors of woo whom I’ve encountered over the years, Deepak Chopra is, without a doubt, one of the most arrogantly obstinate, if not the most arrogantly obstinate. Right now he’s pushing his latest book, Super Genes: Unlock the Astonishing Power of Your DNA for Optimum Health and Well-Being, which asserts that you can control the activity of your genes.

So it was greatly amusing to me to see Deepak Chopra and his pseudo-profound bullshit (and I use the term because the source I’m about to look at uses the term) featured so prominently in a new study by Pennycook et al. entitled “On the reception and detection of pseudo-profound bullshit.” The study was performed at the Department of Psychology, University of Waterloo, and the School of Humanities and Creativity, Sheridan College. Indeed, Deepak Chopra’s pseudo-profound bullshit is a key component of the study. I love the way the abstract starts, too:

Although bullshit is common in everyday life and has attracted attention from philosophers, its reception (critical or ingenuous) has not, to our knowledge, been subject to empirical investigation. Here we focus on pseudo-profound bullshit, which consists of seemingly impressive assertions that are presented as true and meaningful but are actually vacuous.

First, what do the authors mean by pseudo-profound bullshit? I might as well quote their definition in full, even at the risk of a large block of quoted text:

The Oxford English Dictionary defines bullshit as, simply, “rubbish” and “nonsense”, which unfortunately does not get to the core of bullshit. Consider the following statement:

a) Hidden meaning transforms unparalleled abstract beauty.

Although this statement may seem to convey some sort of potentially profound meaning, it is merely a collection of buzzwords put together randomly in a sentence that retains syntactic structure. The bullshit statement is not merely nonsense, as would also be true of the following, which is not bullshit:

b) Unparalleled transforms meaning beauty hidden abstract.

The syntactic structure of a), unlike b), implies that it was constructed to communicate something. Thus, bullshit, in contrast to mere nonsense, is something that implies but does not contain adequate meaning or truth. This sort of phenomenon is similar to what Buekens and Boudry (2015) referred to as obscurantism (p. 1): “[when] the speaker… [sets] up a game of verbal smoke and mirrors to suggest depth and insight where none exists.” Our focus, however, is somewhat different from what is found in the philosophy of bullshit and related phenomena (e.g., Black, 1983; Buekens & Boudry, 2015; Frankfurt, 2005). Whereas philosophers have been primarily concerned with the goals and intentions of the bullshitter, we are interested in the factors that predispose one to become or to resist becoming a bullshittee. Moreover, this sort of bullshit – which we refer to here as pseudo-profound bullshit – may be one of many different types. We focus on pseudo-profound bullshit because it represents a rather extreme point on what could be considered a spectrum of bullshit. We can say quite confidently that the above example (a) is bullshit, but one might also label an exaggerated story told over drinks to be bullshit. In future studies on bullshit, it will be important to define the type of bullshit under investigation (see Discussion for further comment on this issue).

This is about as fantastic an introduction to a scientific paper as I’ve ever seen. It also defines a form of BS at whose production Deepak Chopra is expert. But how does one measure the inherent “BS-ness” of a statement? The way the authors did this was absolutely hilarious. Some of you might be aware of a website, The Wisdom of Chopra, which is a random Deepak Chopra quote generator. As the generator tells us, each “quote” is generated from a list of words found in Deepak Chopra’s Twitter stream, randomly stuck together in a sentence. This was one source of raw material for the authors. The other was the New Age Bullshit Generator, which was also inspired by Deepak Chopra and works on similar principles but uses a list of profound-sounding words compiled by its creator, Seb Pearce. Examples include sentences like “Imagination is inside exponential space time events” and “We are in the midst of a self-aware blossoming of being that will align us with the nexus itself.” These sites were used to produce ten meaningless sentences.
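The principle behind both generators is simple enough to sketch in a few lines: slot random buzzwords into syntactically valid templates, yielding sentences that have grammatical structure but no meaning. Here’s a minimal illustration in Python; the word lists and templates below are my own invention, not the actual vocabularies used by either site:

```python
import random

# Hypothetical buzzword lists; the real sites draw on Chopra's Twitter
# stream or a hand-compiled New Age vocabulary.
NOUNS = ["consciousness", "intuition", "potentiality", "awareness", "healing"]
ADJECTIVES = ["infinite", "quantum", "self-aware", "unparalleled", "cosmic"]
VERBS = ["transforms", "unfolds into", "is the ground of", "transcends"]

# Templates that are syntactically well-formed but semantically empty.
TEMPLATES = [
    "{adj1} {noun1} {verb} {adj2} {noun2}.",
    "The {noun1} of {noun2} {verb} {adj1} truth.",
]

def pseudo_profound(rng: random.Random) -> str:
    """Fill a random template with random buzzwords: syntax without meaning."""
    sentence = rng.choice(TEMPLATES).format(
        adj1=rng.choice(ADJECTIVES),
        adj2=rng.choice(ADJECTIVES),
        noun1=rng.choice(NOUNS),
        noun2=rng.choice(NOUNS),
        verb=rng.choice(VERBS),
    )
    return sentence[0].upper() + sentence[1:]

print(pseudo_profound(random.Random(42)))
```

The output is always a grammatical English sentence, which is exactly what gives such statements their superficial air of profundity.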

Next, University of Waterloo undergraduate students were asked to rate the sentences using the following 5-point scale: 1 = not at all profound, 2 = somewhat profound, 3 = fairly profound, 4 = definitely profound, 5 = very profound. Before the study started, the same students answered demographic questions and completed five cognitive tasks intended to assess components of cognitive ability. They also answered questions designed to assess religious belief. These students rated the ten meaningless pseudo-profound statements. This first study was designed to assess the BS potential of the statements and to validate the internal consistency of the measures, specifically the new measure, dubbed the “Bullshit Receptivity” (BSR) scale, which had good internal consistency. Basically, the higher the BSR score produced by a subject’s ratings, the higher the, well, receptivity to BS demonstrated by that subject. The authors found that BSR was “strongly negatively correlated with each cognitive measure except for numeracy (which was nonetheless significant)” and that “both ontological confusions and religious belief were positively correlated with bullshit receptivity.”
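To be concrete about what the BSR scale measures: as I read the paper, a subject’s score is simply the mean of his or her profundity ratings across the meaningless items, and internal consistency is assessed in the standard way, via Cronbach’s alpha. A rough sketch of both computations (the ratings below are made-up toy data, not the study’s):

```python
def bsr_score(ratings):
    """Mean profundity rating (1-5 scale) across the BS items for one subject."""
    return sum(ratings) / len(ratings)

def cronbach_alpha(item_scores):
    """Cronbach's alpha for a subjects-by-items matrix of ratings."""
    n_items = len(item_scores[0])

    def variance(xs):  # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_vars = [variance([row[j] for row in item_scores]) for j in range(n_items)]
    total_var = variance([sum(row) for row in item_scores])
    return (n_items / (n_items - 1)) * (1 - sum(item_vars) / total_var)

# Toy data: four subjects rating three items on the 1-5 scale.
toy = [[1, 2, 1], [4, 5, 4], [3, 3, 2], [2, 2, 1]]
print([bsr_score(row) for row in toy])
print(round(cronbach_alpha(toy), 2))  # -> 0.98
```

Subjects whose ratings move together across items drive alpha toward 1, which is what “good internal consistency” means here.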

The next study looked at some real world examples. Participants were recruited for pay from Amazon’s Mechanical Turk. In addition to the ten meaningless statements used in the above study, ten novel items were generated by the two websites, and the authors also obtained ten items from Deepak Chopra’s Twitter feed.

Subjects were also assessed by additional instruments, such as the Paranormal Belief Scale and measures of wealth distribution and ideology. In contrast to the first study, participants evaluated the meaningless statements before completing the cognitive tasks, and the items from Chopra’s Twitter feed followed directly after the meaningless statements. This time around, Chopra’s Twitter items were rated as slightly more “profound” than the nonsense items, but the mean ratings for the two scales were highly correlated. It also turned out that the BSR scale significantly correlated with each variable tested, except for the Need for Cognition. Specifically, BSR was negatively correlated with performance on the heuristics and biases battery and positively correlated with Faith in Intuition. As in the first study, cognitive ability measures were negatively correlated with BSR.
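For readers rusty on what these findings amount to mechanically, they are Pearson correlations between subjects’ scale scores. A quick sketch (the numbers are toy data of my own; the negative coefficient merely mirrors the direction reported for cognitive ability, not the paper’s actual values):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length score lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Toy data: BSR scores vs. a cognitive-ability score for five subjects.
bsr = [4.2, 3.8, 2.1, 1.5, 3.0]
ability = [10, 13, 18, 21, 12]
print(round(pearson_r(bsr, ability), 2))  # -> -0.94
```

A coefficient near −1 means subjects who rated the nonsense as more profound tended to score lower on the ability measure; the study’s real correlations were, of course, far weaker than this contrived example.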

Finally, in the remaining two studies included in this paper, the authors wanted to test whether some people might be particularly sensitive to pseudo-profound BS because they are less capable of detecting conflict during reasoning. Basically, they wanted to try to get some insight into why some people are particularly prone to pseudo-profound BS and others are particularly resistant to it. To test this, they did more studies in which they created a scale using ten motivational quotations that are conventionally considered to be profound (e.g., “A river cuts through a rock, not because of its power but its persistence”) because they are written in plain language and don’t contain the vague buzzwords characteristic of statements in the first two studies. They also included mundane statements that had clear meaning but wouldn’t be considered “profound” (e.g., “Most people enjoy some sort of music”). They then re-examined the correlations found in the earlier studies.

They found that those more receptive to bullshit are “less reflective, lower in cognitive ability (i.e., verbal and fluid intelligence, numeracy), are more prone to ontological confusions and conspiratorial ideation, are more likely to hold religious and paranormal beliefs, and are more likely to endorse complementary and alternative medicine (CAM).” The authors also assessed the same correlations using a measure of sensitivity to pseudo-profound BS determined by computing a difference score between profundity ratings for pseudo-profound BS and legitimately meaningful motivational quotations. Thus, people who rated the truly profound statements a lot higher than the pseudo-profound BS will have higher scores in this measure, which the authors propose as an estimate of how sensitive an individual’s “bullshit detector” is. They found that BS sensitivity was associated with better performance on measures of analytic thinking and lower paranormal belief. It was not, however, correlated with increased conspiratorial ideation or acceptance of CAM, which surprised the authors, who noted:
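The sensitivity measure, as described, is nothing more than a difference of mean ratings, which is easy to sketch (function and variable names here are mine, not the authors’):

```python
def profundity_mean(ratings):
    """Mean profundity rating (1-5 scale) across a set of statements."""
    return sum(ratings) / len(ratings)

def bs_sensitivity(genuine_ratings, bs_ratings):
    """Difference score: mean rating of genuinely profound quotations minus
    mean rating of pseudo-profound bullshit. Higher = better BS detector."""
    return profundity_mean(genuine_ratings) - profundity_mean(bs_ratings)

# A subject who finds real quotations profound but sees through the BS:
print(bs_sensitivity([5, 4, 5, 4], [2, 1, 2, 1]))  # prints 3.0
```

A subject who rated everything as equally profound would score zero, no matter how high or low the ratings, which is what makes this a measure of discrimination rather than of mere skepticism.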

This was not predicted as all three forms of belief are considered “epistemically suspect” (e.g., Pennycook, et al., in press). One possible explanation for this divergence is that supernatural beliefs are a unique subclass because they entail a conflict between some immaterial claim and (presumably universal) intuitive folk concepts (Atran & Norenzayan, 2004). For example, the belief in ghosts conflicts with folk-mechanics – that is, the intuitive belief that objects cannot pass through solid objects (Boyer, 1994). Pennycook et al. (2014) found that degree of belief in supernatural religious claims (e.g., angels, demons) is negatively correlated with conflict detection effects in a reasoning paradigm. This result suggests that the particularly robust association between pseudo-profound bullshit receptivity and supernatural beliefs may be because both response bias and conflict detection (sensitivity) support both factors.

The authors make a point about different kinds of open-minded thinking, an uncritical open mind versus a more reflective open mind:

As a secondary point, it is worthwhile to distinguish uncritical or reflexive open-mindedness from thoughtful or reflective open-mindedness. Whereas reflexive open-mindedness results from an intuitive mindset that is very accepting of information without very much processing, reflective open-mindedness (or active open-mindedness; e.g., Baron, Scott, Fincher & Metz, 2014) results from a mindset that searches for information as a means to facilitate critical analysis and reflection. Thus, the former should cause one to be more receptive of bullshit whereas the latter, much like analytic cognitive style, should guard against it.

Overall, the authors have made a significant contribution by coming up with their Bullshit Receptivity scale and Bullshit Sensitivity scale, but the work is not without its limitations. For one thing, the authors focused on very brief statements, generally no longer than a tweet (at the time, 140 characters). It isn’t clear whether these results can be generalized to what the authors refer to as more “conversational” BS, which can be quite different from pseudo-profound BS. More importantly, this is preliminary work. The scales used contained relatively few items, and there was arguably way too much focus on one person’s work, or on pseudo-profound BS inspired by one person: Deepak Chopra. While it’s true that he is fantastically skilled at coming up with such seemingly profound but vacuous statements and is probably the most famous person doing it, Chopra is just one person. Surely there are many more examples that could have been mined.

Despite these limitations, I think this study is an interesting, albeit flawed, first step at elucidating what factors contribute to receptivity and resistance to BS. As the authors put it:

The construction of a reliable index of bullshit receptivity is an important first step toward gaining a better understanding of the underlying cognitive and social mechanisms that determine if and when bullshit is detected. Our bullshit receptivity scale was associated with a relatively wide range of important psychological factors. This is a valuable first step toward gaining a better understanding of the psychology of bullshit. The development of interventions and strategies that help individuals guard against bullshit is an important additional goal that requires considerable attention from cognitive and social psychologists. That people vary in their receptivity toward bullshit is perhaps less surprising than the fact that psychological scientists have heretofore neglected this issue. Accordingly, although this manuscript may not be truly profound, it is indeed meaningful.

I tell ya, social scientists are far more tolerant of self-deprecating humor than biomedical scientists are. There’s no way a statement like the last sentence would make it into a basic or clinical science paper.

Be that as it may, this study seems to confirm much that is instinctively known (or at least has been assumed): analytic thinking probably decreases susceptibility to BS; paranormal beliefs go hand-in-hand with such susceptibility. It also tells us that susceptibility to nonsense is quite widespread in the population; people tend to be far more easily persuaded by emotional, vague, seemingly “profound” appeals than by data, science, and evidence. The question that a study of this type always raises, of course, is whether correlation indicates causation in this case. Can deficiencies in analytic thinking and reasoning be remedied to decrease one’s susceptibility to BS, and if so, what is the best way to go about this?

These are the sorts of questions skeptics have been asking for a long time. They are questions with real world consequences, because BS is everywhere.