
Knowledge versus certainty in skepticism, medicine, and science


“I don’t want knowledge. I want certainty!”

–David Bowie, from Law (Earthlings on Fire)

If there’s one universal trait among humans, it seems to be an unquenchable thirst for certainty. This should come as no surprise to those committed to science and rational thinking, because there is a profound conflict between our human desire for certainty and the provisional nature of scientific knowledge: the conclusions of science are always subject to change based on new evidence. Although by no means the only reason, this craving for certainty is likely a major force driving people into the arms of religion, even radical religions with clearly irrational views, such as the idea that flying planes into large buildings and killing thousands of people is a one-way ticket to heaven. This craving for certainty isn’t limited to religion, however. As anyone who accepts science as the basis of medical therapy knows, much the same psychology is at work in medicine as well.

Although I’m not going to discuss this phenomenon primarily in the context of the unscientific and pseudoscientific quackery of the “alternative” medicine world, it’s instructive as an example. Much of quackery involves substituting the certainty of belief for the provisional nature of science. Examples abound. Perhaps my two favorite examples are Hulda Clark, who attributed all cancer and serious disease to a common liver fluke, and Robert O. Young, who believes that virtually all disease is due to “excess acid.” Time and time again, if you look carefully at “alt-med” concepts and the therapies that derive from them, you find simplicity tarted up in complicated-sounding jargon. Homeopathy, for instance, is at its heart nothing more than sympathetic magic: the principle of “like cures like” combined with the principle of contagion, by which water somehow has a “memory” of the therapeutic substances it’s come in contact with yet somehow manages, as Tim Minchin so hilariously put it, to forget all the poo it’s been in contact with. Reiki and other “energy healing” modalities can be summed up as “wishing makes it so,” with “intent” having the power to manipulate some fantastical life energy to heal people. It’s faith healing, pure and simple.

The simplicity of these concepts at their core makes them stubbornly resistant to evidence. Indeed, when scientific evidence meets a strong belief, the evidence usually loses. In some cases, it does more than just lose; it actually hardens the position of believers. We see this very commonly in the anti-vaccine movement, where the more evidence is presented against a vaccine-autism link, the more deeply anti-vaccine activists seem to dig in their heels, cherry picking and twisting evidence, launching ad hominem attacks on their foes, and moving the goalposts faster than science can kick the evidence through the uprights. The same is true for any number of pseudoscientific beliefs. We see it all the time in quackery, where even the failure of a tumor to shrink in response to treatment can lead patients to conclude that the tumor, although still there, just can’t hurt them. 9/11 Truthers, creationists, Holocaust deniers, moon hoaxers: they all engage in the same sort of desperate resistance to science.

They’re not alone, though. Even those who in general accept science-based medicine can be prone to the same tendency to dismiss evidence that conflicts with their beliefs. About three weeks ago, I saw an article by Christie Aschwanden discussing just this problem. The article was entitled Convincing the Public to Accept New Medical Guidelines, and I feel it could almost have been written by Orac, only minus Orac’s inimitable and obligatory “insolence” and snark. To set up its point that persuading people to accept the results of new medical science is exceedingly difficult, the article starts with the example of long distance runners who believe that taking ibuprofen (or “vitamin I”) before a long run reduces their pain and inflammation resulting from the run:

They call it “vitamin I.” Among runners of ultra-long-distance races, ibuprofen use is so common that when scientist David Nieman tried to study the drug’s use at the Western States Endurance Run in California’s Sierra Nevada mountains he could hardly find participants willing to run the grueling 100-mile race without it.

Nieman, director of the Human Performance Lab at Appalachian State University, eventually did recruit the subjects he needed for the study, comparing pain and inflammation in runners who took ibuprofen during the race with those who didn’t, and the results were unequivocal. Ibuprofen failed to reduce muscle pain or soreness, and blood tests revealed that ibuprofen takers actually experienced greater levels of inflammation than those who eschewed the drug. “There is absolutely no reason for runners to be using ibuprofen,” Nieman says.

The following year, Nieman returned to the Western States race and presented his findings to runners. Afterward, he asked whether his study results would change their habits. The answer was a resounding no. “They really, really think it’s helping,” Nieman says. “Even in the face of data showing that it doesn’t help, they still use it.”

As is pointed out, this is no anomaly. Aschwanden uses as another example a topic that’s become a favorite of mine over the last six months or so since the USPSTF released revised guidelines for mammographic screening. Take a look at what she says about the reaction:

This recommendation, along with the call for mammograms in women age 50 and older to be done every two years, rather than annually, seemed like a radical change to many observers. Oncologist Marisa C. Weiss, founder of Breastcancer.org, called the guidelines “a huge step backwards.” If the new guidelines are adopted, “Countless American women may die needlessly from breast cancer,” the American College of Radiology said.

“We got letters saying we have blood on our hands,” says Barbara Brenner, a breast cancer survivor and executive director of the San Francisco advocacy group Breast Cancer Action, which joined several other advocacy groups in backing the new recommendations. Brenner says the new guidelines strike a reasonable balance between mammography’s risks and benefits.

I discussed the guidelines and the reactions to them myself multiple times. Let’s put it this way: I’m in the business, so to speak, and even I was shocked at the vehement reaction from not just patients and patient advocacy groups but my very own colleagues. I was particularly disgusted by the reaction of the American College of Radiology, which was nothing more than blatant fear mongering intended to frighten women into thinking that the new guidelines would lead to their deaths from breast cancer. Radiologists dug foxholes from which to protect their turf. Even some of my colleagues were very resistant to the guidelines, and in fact I was in the distinct minority at my own institution in cautiously supporting them, with some misgivings. At least I managed to be a moderating force that kept the press release we ended up issuing from being too critical of the new guidelines. As much as we’d like to pretend otherwise, even science-based medical practitioners can fall prey to craving the certainty of known and accepted guidelines over the uncertainty of the new. And if it’s so hard to get physicians to accept new guidelines and new science, imagine how hard it is to get patients to accept them.

There is abundant evidence of how humans defend their views against evidence that would contradict them, and it’s not just the observational evidence that you or I see every day. Scientists often fall prey to what University of California, Berkeley social psychologist Robert J. MacCoun calls the “truth wins” assumption: the assumption that when the truth is correctly stated, it will be universally recognized. Those of us who make it one of our major activities to combat pseudoscience know, of course, that the truth doesn’t always win. Heck, I’m not even sure it wins a majority of the time, or even close to a majority of the time. The problem is that the “truth” often runs into a buzzsaw: a phenomenon philosophers call naive realism. Naive realism, boiled down to its essence, is the belief that whatever one believes, one believes simply because it’s true. In the service of naive realism, we all construct mental models that help us make sense of the world. When the “truth wins” assumption meets naive realism, guess which usually wins?

At the risk of misusing the word, I’ll just point out that the truth is that we all filter everything we learn through the structure of our own beliefs and the mental models we construct to support those beliefs. I like to think of science as a powerful means of penetrating the structure of those mental models, but that’s probably not the best analogy. For science to work at changing our preconceptions, the validity of science must already be strongly incorporated into the structure of our own mental models. If it’s not, then science is more likely to bounce harmlessly off the force field our beliefs create to repel it. (Yes, I’m a geek.) As a result, all other things being equal, when people see studies that confirm their beliefs, they tend to view them as unbiased and well designed, while if a study’s conclusions contradict their beliefs, they are likely to see the study as biased and/or poorly done. As MacCoun puts it, “If a researcher produces a finding that confirms what I already believe, then of course it’s correct. Conversely, when we encounter a finding we don’t like, we have a need to explain it away.”

There’s also another strategy that people use to dismiss science that doesn’t conform to their beliefs. I hadn’t thought of this one before, but it seems obvious in retrospect after I encountered a recent study that suggested it. That mechanism is to start to lose faith in science itself as a means of making sense of nature and the world. The study was by Geoffrey D. Munro of Towson University in Maryland and appeared in the Journal of Applied Social Psychology under the title of The Scientific Impotence Excuse: Discounting Belief-Threatening Scientific Abstracts.

The paper encompassed two studies testing two main hypotheses. The first hypothesis was that encountering evidence that conflicts with one’s belief system tends to move the subject toward the position that science cannot study the question under consideration at all, a response known as the “scientific impotence” excuse. In essence, science is dismissed as “impotent” to study the issue where belief conflicts with evidence. In addition:

The scientific impotence method of discounting scientific research that disconfirms a belief is certainly worrisome to scientists who tout the importance of objectivity. Even more worrisome, however, is the possibility that scientific impotence discounting might generalize beyond a specific topic to which a person has strong beliefs. In other words, once a person engages in the scientific impotence discounting process, does this erode the belief that scientific methods can answer any question? From the standpoint of the theory of cognitive dissonance (Festinger, 1957), the answer to this question could very well be “Yes.”

And:

Using the scientific impotence excuse for one and only one topic as a result of exposure to belief-disconfirming information about that topic might put the individual at risk for having to acknowledge that the system of beliefs is somewhat biased and possibly hypocritical. Thus, to avoid this negative self-view, the person might arrive at the more consistent–and seemingly less biased–argument that science is impotent to address a variety of topics, one of which happens to be the topic in question.

To test these hypotheses, Munro had a group of student subjects read various abstracts (created by the investigators) that confirmed or challenged their beliefs about whether homosexuality predisposes to mental illness. It turned out that those who read belief-challenging abstracts were more prone to use the scientific impotence excuse than those who read belief-confirming abstracts. Controls that substituted other terms for “homosexual” demonstrated that it was the belief-challenging nature of the abstracts that was associated with use of the scientific impotence excuse. A second study followed up with more subjects. The methodology was the same as in the first study, except that additional measures were included to see whether exposure to belief-disconfirming abstracts was associated with generalization of the belief in scientific impotence. In essence, Munro found that, relative to those reading belief-confirming evidence, participants reading belief-disconfirming evidence indicated more belief that the topic could not be studied scientifically and more belief that a series of other, unrelated topics also could not be studied scientifically. He concluded that being presented with belief-disconfirming scientific evidence may lead to an erosion of belief in the efficacy of scientific methods, also noting:

A number of scientific issues (e.g., global warming, evolution, stem-cell research) have extended beyond the scientific laboratories and academic journals and into the cultural consciousness. Because of their divisive and politicized nature, scientific conclusions that might inform these issues are often met with resistance by partisans on one side or the other. That is, when one has strong beliefs about such topics, scientific conclusions that are inconsistent with the beliefs may have no impact in altering those beliefs. In fact, scientific conclusions that are inconsistent with strong beliefs may even reduce one’s confidence in the scientific process more generally. Thus, in addition to the ongoing focus on creating and improving techniques that would improve understanding of the scientific process among schoolchildren, college students, and the general population, some attention should also be given to understanding how misconceptions about science are the result of belief-resistance processes and developing techniques that might short-circuit these processes.

On a strictly anecdotal level, I’ve seen this time and time again in the alt-med movement. A particularly good example is homeopathy. How many times have we seen homeopaths, when confronted with scientific evidence that their magic water is no more effective at anything than a placebo, claim that it can’t be evaluated by randomized, double-blind clinical trials (RCTs)? The excuses are legion: RCTs are too regimented; they don’t take into account the “individualization” of homeopathic treatment; unblinded “pragmatic” trials are better; or anecdotal evidence trumps RCT evidence. Believers in alt-med then often generalize this scientific impotence discounting to many other areas of woo, claiming, for example, that science can’t adequately measure that magical mystical life energy field known as qi or even, most incredibly, that subjecting their woo to science will guarantee that it fails. Unfortunately, when science is discounted this way, it allows believers in pseudoscience to dismiss science as “just another religion.” A good rule of thumb: when you see such a dismissal, you know you’re dealing with belief, not science.

Sadly, though, even physicians ostensibly dedicated to science-based medicine all too easily fall prey to this fallacy, although they usually don’t dismiss science as inadequate or unable to study the question at hand. Rather, they wield their preexisting belief systems and mental frameworks like a talisman to protect them from having to let disconfirming data force them to change their beliefs. Alternatively, they dismiss science itself as “just another belief.” Perhaps the most egregious example I’ve seen of this in a long time occurred, not surprisingly, during the mammogram debate six months ago, when Dr. John Lewin, a breast imaging specialist from Diversified Radiology of Colorado and medical director of the Rose Breast Center in Denver, so infamously said, “Just the way there are Democrats and Republicans, there are people who are against mammography. They aren’t evil people. They really believe that mammography is not as important.”

Despite the ‘nym, Orac is actually human. I get it. I get how hard it is to change one’s views. I even understand the tendency to dismiss disconfirming evidence. What I like to think distinguishes me from pseudoscientists is that I do change my mind on scientific issues as the evidence merits. Perhaps the best example of this is the aforementioned USPSTF mammography screening kerfuffle. For the longest time, I bought into the idea that screening was an almost universal good. Then, over the last two or three years, I’ve become increasingly aware of the problems of lead-time and length bias, the Will Rogers effect, and overdiagnosis. This has led me to adjust my views about screening mammography. I haven’t adjusted them all the way to the USPSTF recommendations, but I am much more open to the changes in the guidelines published late last year, even to the point that encountering the resistance of my colleagues led me to feel as though I were an anomaly.

Skepticism and science are hard in that they tend to go against some of the most deeply ingrained human traits there are, in particular the need for certainty and an intolerance of ambiguity. Also in play is our tendency to cling to our beliefs, no matter what, as though having to change them somehow devalues or dishonors us. Skepticism, critical thinking, and science can help us overcome these tendencies, but it’s difficult. In the end, though, we need to strive to live up to the immortal words of Tim Minchin when describing how he’d change his mind about even homeopathy if presented with adequate evidence (I know I cited this fairly recently, but it’s worth citing one more time):

Science adjusts its beliefs based on what’s observed
Faith is the denial of observation so that Belief can be preserved.
If you show me that, say, homeopathy works,
Then I will change my mind
I’ll spin on a fucking dime
I’ll be embarrassed as hell,
But I will run through the streets yelling
It’s a miracle! Take physics and bin it!
Water has memory!
And while its memory of a long lost drop of onion juice is Infinite
It somehow forgets all the poo it’s had in it!

You show me that it works and how it works
And when I’ve recovered from the shock
I will take a compass and carve “Fancy that!” on the side of my cock.

Actually, as I’ve said before, I’d probably leave out the genital self-mutilation. (Wait! Scratch the word “probably.” I’m rather attached to my male parts, and, like most men, I don’t like the idea of sharp objects being anywhere near them.) As much as I like his “nine minute beat poem,” Minchin may be a bit too flippant about the difficulty in changing one’s mind. Even so, show me, for example, strong evidence that vaccines are associated with autistic regression, and I might not spin on a dime, but eventually, if the evidence is of a quality and quantity to cast serious doubt on the existing scientific evidence that does not support a vaccine-autism link, I will adjust my views to fit the evidence and science.

That’s just what it takes. No one said it would be easy, but the rewards of living in reality make it worth the struggle against our own human nature. In the end, I want knowledge, and science is the best way to get it about the natural world. Certainty is nice, but I can live without it.

REFERENCE:

Munro, G.D. (2010). The scientific impotence excuse: Discounting belief-threatening scientific abstracts. Journal of Applied Social Psychology, 40(3), 579-600. DOI: 10.1111/j.1559-1816.2010.00588.x

By Orac

Orac is the nom de blog of a humble surgeon/scientist who has an ego just big enough to delude himself that someone, somewhere might actually give a rodent's posterior about his copious verbal meanderings, but just barely small enough to admit to himself that few probably will. That surgeon is otherwise known as David Gorski.

That this particular surgeon has chosen his nom de blog based on a rather cranky and arrogant computer shaped like a clear box of blinking lights that he originally encountered when he became a fan of a 35 year old British SF television show whose special effects were renowned for their BBC/Doctor Who-style low budget look, but whose stories nonetheless resulted in some of the best, most innovative science fiction ever televised, should tell you nearly all that you need to know about Orac. (That, and the length of the preceding sentence.)

DISCLAIMER: The various written meanderings here are the opinions of Orac and Orac alone, written on his own time. They should never be construed as representing the opinions of any other person or entity, especially Orac's cancer center, department of surgery, medical school, or university. Also note that Orac is nonpartisan; he is more than willing to criticize the statements of anyone, regardless of political leanings, if that anyone advocates pseudoscience or quackery. Finally, medical commentary is not to be construed in any way as medical advice.

To contact Orac: [email protected]

43 replies on “Knowledge versus certainty in skepticism, medicine, and science”

I believe that is the one thing that separates “us” from “them”. “They” will not change their mind about something no matter how much evidence to the contrary there exists. (I’m not even going to touch on “absence of evidence” because your other readers will tear me a new one.) I’ve said it before, and I’ll say it again, “truth does not negate truth”. But the truth does negate lies and myths. And I’m okay with that. “We” are okay with that. “They” are not.

Then again, sometimes knowledge is not enough, for even the Jedi know the difference between knowledge (pause for dramatic effect) and wisdom.

DISCLAIMER: My opinion is as much the opinion of those I work for (or with) as is my choice of toilet paper. That is, it’s not.

Yeah, the complete unwillingness of people to consider that they might be wrong has got to be the biggest hurdle in science, but it’s completely understandable. It’s not always an easy mindset to have.

Travelling back through the mists of time, our earliest home-grown scientific psychologist, laboratory founder, and philosopher, William James, poetically describes “The Tender-minded. Rationalistic (going by ‘principles’), Intellectualistic, Idealistic, Optimistic, Religious, Free-willist, Monistic, Dogmatical” and “The Tough-minded. Empiricist (going by ‘facts’), Sensationalistic, Materialistic, Pessimistic, Irreligious, Fatalistic, Pluralistic, Sceptical”…”The tough think of the tender as sentimentalists and soft-heads. The tender feel the tough to be unrefined, callous, and brutal. Their mutual reaction is very much like that that takes place when Bostonian tourists mingle with a population like that of Cripple Creek.” (from “Pragmatism”, 1907). Sounds familiar.

A strategy to introduce new guidelines may be a trickle stream of new data, released carefully without drawing conclusions, so people will come to the conclusion themselves, before being told what to think.

You can have all the supporting evidence in the world, but if that comes after a conclusion that is so alien to you, you don’t want it to be true, it loses much of its persuasive power.

I’ve been saying this all along–though far less eloquently and without benefit of confirming evidence. Question is: Is there anything to be done? I’ve suggested in past posts that only a complete revamping of science education at every level will be able to make a dent in the way that children develop their approach to science. For me, although I didn’t study a lot of hard science in college, I did get a big exposure to the HISTORY OF SCIENCE through my major in Anthropology which exposed me to what science has been up against throughout history. I went to college later than the traditional student, so was perhaps less distracted by the social aspects and so it turned out to be a major turning point for me. I used to fall in line with all kinds of woo and would try anything on a lark, but I became quite the crusader of science after college and continue to fight the battle.

Thanks for the thorough and informative post. It’s great to have your vague thoughts confirmed in this way. I’m going to print it and give a copy to the vet’s assistant who got very upset when I rejected her wooey advice on pet food. She kept on about “everyone’s entitled to her opinion” every time I said I was only interested in the science, not her personal perspective or experience. When I make it clear I only wanted feeding advice from the vet, she said I didn’t need to be RUDE to her just because she was trying to help me. She absolutely could NOT take in the idea that I was separating science from belief.

I’ve suggested in past posts that only a complete revamping of science education at every level will be able to make a dent in the way that children develop their approach to science

I disagree. I don’t think it needs to be “revamped.”

What needs to be taught better is how we fool ourselves. A better understanding of cognitive psychology would make the difference. More people need to understand that their experiences don’t mean everything, and can deceive them. Material like what you find in Shermer’s Why People Believe Weird Things or the more academic How We Know What Isn’t So by Tom Gilovich. These teach people a) that they make mistakes in conclusions, b) why they make those mistakes, and c) how to avoid those mistakes.

That is the part that is missed.

In your article, you write, “when people see studies that confirm their beliefs, they tend to view them as unbiased and well-designed, while if a study’s conclusions contradict a person’s beliefs that person is likely to see the study as biased or poorly done”.

Playing Devil’s advocate here, could this argument not be extended to the Pro-Vax movement? An anti-vaxxer could say “well, according to the above principle, the pro-vaxers are taking the ‘fourteen studies’ (or whatever other studies/science you want to use) and are not seeing the bias/omissions/etc?

A patient came in to see a doctor who asked: “what’s wrong?” patient replied: “I’m dead”. Doctor said: “I have been practicing medicine for twenty years, graduated top of my class in medical school and I assure you…you are not dead, you’re alive”. patient: “no doctor, I’m dead”. After a long discussion that fails to convince the patient the doctor asks: “do dead men bleed?” patient says: “absolutely NOT”. Doctor takes out a needle and pricks patient in the arm which begins to bleed. Doctor says: “SEE”. Patient responds: “Wow…dead men do bleed”.


Playing Devil’s advocate here, could this argument not be extended to the Pro-Vax movement? An anti-vaxxer could say “well, according to the above principle, the pro-vaxers are taking the ‘fourteen studies’ (or whatever other studies/science you want to use) and are not seeing the bias/omissions/etc?

Absolutely. I’ve gotten into trouble here in the past for suggesting that studies that should not have been given any weight were incorrectly being construed as pro-vax evidence. I didn’t even suggest that they supported an anti-vax view, just that the conclusion that they supported a pro-vax view went too far.

I’ve pretty much given up on trying to have any kind of discussion with people about any of their various beliefs in woo topics. I learned the hard way. I had finally gotten to the point of asking early on in a discussion, “Well, what kind of evidence can you imagine that would lead you to doubt your current conclusion?”. Typically the response was the “deer in the headlights” look, which meant, “there couldn’t be any such evidence because I already know it’s true”. End of discussion. I’m not very hopeful that any significant portion of the human race is going to overcome these tendencies any time soon. Sigh.

I’ve pretty much given up on trying to have any kind of discussion with people about any of their various beliefs in woo topics.

Things to avoid at the Thanksgiving dinner table: religion, politics, and woo.

Doesn’t leave much. Good thing there’s football on TV.

Since most people are smarter than average, why should they listen to someone who disagrees with them?

Things to avoid at the Thanksgiving dinner table: religion, politics, and woo.

It’s kind of hard to do when you say grace, complain about taxes, and your grandmother gives you her latest remedy for something.

DISCLAIMER: Standard disclaimer applies.

There’s an additional reason why truth doesn’t win out: communal reinforcement. Belief is often a requirement for membership in a tribe, and people don’t want to be kicked out of the social group. I think that’s extremely powerful in a lot of these cases.

I agree with Pablo at #7, with an addition: we should teach how we make errors when thinking, and how to avoid them… and then we would be teaching scientific thinking.

As to certainty, a point I like to make when talking to an altie is that our lives are not experiments. There are no control selves for us to compare against. We can’t even be sure whether the antibiotic given by our doctor is really working, or whether it’s our own immune system doing the job.

The point, of course, is that every study about how that antibiotic works, how useful it is as a treatment, and what its side effects are reduces our uncertainty about the outcome of taking it. And the same applies to CAM, though usually toward the prediction that it won’t or can’t work.

Thanks Orac for a wonderful post.

Juan Pablo – Note how I said, “These teach people a) that they make mistakes in conclusions, b) why they make those mistakes, and c) how to avoid those mistakes.”

But on other matters: when I saw the title for this, the first thing I thought about was religion, and the religious who claim they “know” God exists. That is clearly using “know” in a way that is not consistent with any technical use of the word.

Even more worrisome, however, is the possibility that scientific impotence discounting might generalize beyond a specific topic to which a person has strong beliefs. In other words, once a person engages in the scientific impotence discounting process, does this erode the belief that scientific methods can answer any question? From the standpoint of the theory of cognitive dissonance (Festinger, 1957), the answer to this question could very well be “Yes.”

Probably the largest area which is presumed to be immune to scientific or rational investigation is … religion. It puts forth hypotheses, implies predictions, makes truth claims, and yet, somehow, science is “impotent” before it. The facts are known to be true in some other way: personal experience, intuition, special revelation, the inner light of a guiding preference, convenience, comfort, and familiarity. The term “naive realism” applies here: other people’s faith is unreliable, but mine can be trusted because it’s true.

Alternative medicine — or woo in general — is simply religion in another form: vitalism, intentional force, mind/body dualism, and Nature as inherently moral. So the scientific impotence discounting process is easily co-opted from an already acceptable source. I mean, they already know that there are truths that science can’t touch. Important truths. Facts that matter, like souls living on after death, or gods come to earth doing miracles.

Over and over again, my questions and concerns about alternative medicine have been met with puzzled stares and some hesitant questions about my religious beliefs. “But … don’t you believe in God?” I haven’t drawn the connection — they do. If I answer honestly — no — they breathe sighs of relief, laugh, and dismiss my scientific points. Oh. Okay. I’m one of those people — the ones who haven’t figured out that science can’t tell us everything. There are truths science can’t get at. Like God. And reiki.

I think that the so-called “new atheism” is one of the allies of science-based medicine — and this is true regardless of whether scientists believe in God or not, and regardless of whether the new atheists are right. They’re trying to break down the assumption that science is impotent before the claims of religion. That common belief gets generalized to woo and alt-med, and then on to global warming and what have you. If a crack of humility is let into religion — less smug certainty about those “other ways of knowing” — it may have a trickle-down effect.

Hi Orac,

terrasig suggested you do a followup article on dichloroacetate (DCA) given the paper just published on the phase 1 trial in Edmonton.

Three years have passed, and countless cancer patients were denied this drug. Now, at the end of its first phase 1 trial, we know exactly what we already knew from the reports of people self-medicating in 2007, before the FDA forced it off the market: it shows great promise.

Explain to me again how these controlled trials are oh-so-much better than the ad hoc trial of self-organized self-medicants? Lay that old woo stick on me again, buddy.

…DaveScot

Actually, if you bothered to read Abel’s post, you’d realize that, at best, DCA shows modest promise against glioblastoma. I finally got hold of the paper today, thanks to a reader. (My institution doesn’t subscribe to the journal in which it appeared.) Unlike some, I don’t blog about something like this without reading the scientific paper first, and I certainly don’t rely on breathless news reports. Breathless news hype is what helped create the DCA hysteria that misled desperate cancer patients in the first place.

Perusing the paper, at the risk of spoiling Monday’s post, I’ll just say that I mostly agree with Abel’s assessment. DCA shows modest promise in glioblastoma, but just modest promise, which is pretty much what I said three years ago.

I ran into the “science is religion” argument while debating the merits of anthropogenic climate change with my own father. How does one argue with a viewpoint like that? First he called it “junk science.” I then asked him exactly why he called it junk science: was he taking issue with its testing methods, with how the data are correlated and studied, or with whether researchers have come to faulty conclusions when reviewing the data? What exactly in the science of climatology was his problem? No answer. He then responded by telling me that the science was disputed. I of course pointed out that science is always in dispute; that’s exactly why it’s science. Any scientific topic is open to discussion, revision, dispute, and updating or clarification if new evidence makes itself apparent. Saying science is in dispute is like saying water is wet; it’s part of the definition of what makes science science. It was at this point that he pulled the “global warming is religion” card. I tried pointing out the differences between knowledge gained in a scientific manner and knowledge gained in a religious or faith-based manner. I tried pointing out that there’s a difference between trusting the conclusions of an overwhelming number of scientists highly knowledgeable in the field they are drawing conclusions about, and trusting in priests, rabbis, or witch doctors.

All to no avail. Really, I wonder if there’s literally no reaching some people…

dogmatichaos said:

All to no avail. Really, I wonder if there’s literally no reaching some people…

As unfortunate as it may be the old cliche is often true…
“You can’t reason someone out of a position that they didn’t reason themselves into.”

Most such beliefs usually have a powerful emotional component tied to them that maintains the refusal to accept the science. There are of course situations where it may be possible to “reach” someone, but it’s usually not going to be possible in the short term even for a single individual, let alone a community.

Anti-vaccination has a long shared history with anti-vivisection.

In the UK we have an organisation called the Dr Hadwen Trust which funds non-animal methods of medical research/testing. This research funding provides them with a cloak of scientific respectability with which to pursue their primary antiviv agenda.

The eponymous doctor was an early-twentieth-century anti-vaccinationist and anti-vivisectionist who tried to trash the germ theory of disease.

The current issue of New Scientist has an interesting article on the subject of denialism.
http://www.newscientist.com/article/mg20627605.900-special-report-living-in-denial.html

All I know is that in order to get a chance at having a job I’ll have to undertake a Rorschach test among other crap on Monday.

Most psychiatrists still use it even though it’s bullshit on every objective and predictive level.

Doctors (MDs) are the worst scum on the earth when it comes to not changing their opinions, even when faced with huge amounts of contrary evidence.

The person who wrote this is just another one of those deluded fools as this text clearly shows.

“The opposite of faith is not doubt: It is certainty.”…

I’ve heard this said by a Jesuit priest who is also a member of the Vatican Observatory and an astronomer. He was just repeating it. And the author of that statement is closer in practice to what scientific research is trying to achieve. To quote Anne Lamott: “We’ve got people to feed, people who have run out of hope, and we have the Earth to save and a future to plant for our children. We are like the people in Jeremiah, in the First Testament, ‘people standing in the rubble of a once great community.’”

Although Krista Tippett annoys me with her ad hominem attacks on Dawkins, her interview with these two scientists was pretty awesome, in part due to these statements:
“Br. Consolmagno: Oh, well, exactly. I keep going back to this wonderful phrase that Anne Lamott came out with a few years ago: ‘The opposite of faith is not doubt. The opposite of faith is certainty.’ If you’re sure about something, then you don’t need faith.”

And

Fr. Coyne: But the last thing we want to do is make God a mathematician.

Ms. Tippett: OK.

Fr. Coyne: I mean, that’s even worse than making God an engineer, like the Intelligent Design movement does. Right?

Ms. Tippett: OK. Tell me what you mean by that.

Fr. Coyne: I mean, God is a god of love. Mathematics is not the language of love.

Br. Consolmagno: Well, depends if you’re a mathematician or not. I’ll put it in a different way. When I was a little kid, nine years old, I remember a rainy Sunday afternoon and you couldn’t go out to play and you were stuck in the house. And my mom came out with a deck of cards and dealt them out and we played rummy together. Now, my mom can beat me in cards because I’m nine years old. That wasn’t the point of the game. The point of the game was this was her way of telling me she loved me, in a way that she couldn’t just say, you know, “Son, I love you,” because I’m nine years old. I’m going to squirm and go, “Aw, Mom,” and run away. In a way, being able to do science and come to an intimate knowledge of creation is God’s way of playing with us. And it’s that kind of play that is one way that God tells us how he loves us. So is it invented? It’s as invented as the card game. But is it an act of love? It’s as much an act of love as the card game.

Ms. Tippett: OK.

Fr. Coyne: I like that. Playing games with God. Or God playing games with us. That’s true. Made a universe that has that fascinating attraction about it. Which, doing science to me is a search for God and I’ll never have the final answers because the universe participates in the mystery of God. If we knew it all, I’d sit under a palm tree with my gin and tonic and just let the world go by.

and

Br. Consolmagno: Well, certainly, whoever’s responsible for this universe has a great sense of humor because whenever you’re expecting something you get what you expect but from a very, very different angle than the way you were expecting it. You know, the center of all humor. We are constantly being surprised and delighted by the surprise. Also, a creator who loves beauty. It’s not enough that the universe makes sense and we can come up with equations for them, but the equations themselves are beautiful.

I can remember, I was teaching a few years ago at Fordham University. I was teaching Maxwell’s equations, and there’s a great moment where suddenly you can use Maxwell’s equations — Maxwell did this — to come up with the fact that electricity and magnetism act like a wave that moves at the speed of light.

Ms. Tippett: Right.

Br. Consolmagno: And it’s light waves and that’s what makes radio possible. And I remember getting to the point where I had just written that equation down when a student in the front of the class goes, “Oh, my god. It’s a wave.” And he also had gotten that sense of, “This is beautiful. This is wonderful.” It’s such a surprise.

Ms. Tippett: Father George, it seems to me that it feels important to you in your writing to stress that science and religion are separate pursuits and that science, in fact, is neutral. It doesn’t have theistic or atheistic implications in and of itself.

Fr. Coyne: Correct. My take on the relationship, my personal life, OK, is built upon the following: I’m a scientist. I try and understand the universe. My understanding of the universe does not need God.

Ms. Tippett: I think you’re also suggesting that to talk about God in that way, in some sense, is to diminish God and also to diminish the capacity of the human intelligence that drives science, which is connected with God in your mind.

Fr. Coyne: That’s very true. I think to drag God in when we find that our science is inadequate to understanding certain events that we observe in the universe, we tend to want to bring in God as a god of explanation, a god of the gaps, OK? And we constantly do that. Newton did it, you know? If we’re religious believers we’re constantly tempted to do that. And every time we do it, we’re diminishing God and we’re diminishing science. Every time we do it.

Br. Consolmagno: What you wind up doing is turning God into a pagan god, you know, god of thunder, god of lightning, god of crops. And the Romans thought the Christians were atheists because they refused to believe in that kind of god.

I believe in the honesty, beauty, and power of scientific research, and that God can be summed up as love. I have faith in both.

Much of quackery involves substituting the certainty of belief for the provisional nature of science.

Huh? When I suggest that climate science at the moment is nowhere near capable of making certain predictions, I am a “denier”. I cite the provisional nature of the current science, and am told it is “settled”.

Does anyone here actually believe we understand the climate at all well?

In which case, why is this particular area treated differently?

(I know the answer btw. Some things are wrong because they are politically wrong, not because they are scientifically wrong. Despite me being a life-long Labour voter, apparently I am a right-wing dupe.)

@ Mooloo:
Respectfully, some parts of climate science are provisional; others are settled. Just because we don’t know everything doesn’t mean that we don’t know some things, nor does it mean that we can’t make general predictions. For example, just because we don’t know exactly how a bacterium might adapt to a specific antibiotic doesn’t mean we can’t make a general prediction about what will happen if we’re not careful with the use of that antibiotic.

Even if you leave out the provisional parts of climate science the evidence is still overwhelmingly in favor of AGW being real. That is why the scientific consensus has come down clearly on the side that AGW is real and a potential problem.

The only really provisional part at this point (as far as AGW is concerned) is exactly how much of a problem it’s going to be, and what can (or can’t) be done to minimize the potential harm.

@Mooloo
All science is provisional. By this we mean that new evidence may necessitate modification of, or may even break, a theory which makes sense of old evidence. For example, Europe’s Large Hadron Collider may produce new evidence which could reinforce or break the current theory describing the nature of matter at the smallest scale – the “standard model” of particle physics.

“Settled” in this context doesn’t mean complete and certain in every detail for all eternity. It means general confidence amongst climate scientists that humans are causing climate change on a scale and in a manner which urgently needs to be addressed, because of the potential *uncertain* future consequences.

We may or may not be very far away from a possibly avoidable catastrophe. Either we act, or we cross our fingers and pray.

Just in case it wasn’t clear, my previous comment was meant as a reminder of the importance of countering the denial of the usefulness of animal research in advancing medical science.

I hate to say it but there is a money motivation to keep the screening rules as they are. Those tests cost money and the doctors who are accepting or rejecting the recommendations can see a drop in income if they essentially cut their patient visits in half.
