Science-based medicine and what patients want

I’ve frequently discussed the difference between what has come to be known as “evidence-based medicine” (EBM) and “science-based medicine” (SBM). Basically, SBM is EBM in which the prior probability and plausibility of proposed medical and surgical therapies are considered along with clinical trial evidence. I don’t plan on getting into that specific issue in detail right here. Rather, I bring it up because the best medicine is based on science and evidence. However, I’ve also pointed out that medicine, while it should be science-based, is not and can never be, strictly speaking, a science. (Look for a quack to quote mine that; it’s coming.) The reason is that we in medicine must take into account other things besides just scientific evidence when we decide what treatments to recommend to our patients. That’s why I found a recent survey about patient attitudes toward health care, which I learned about from the Health Affairs blog, quite fascinating. The report, entitled “Communicating with patients on health care evidence,” is described thusly on the blog:

But as we have learned in the years since, one person’s evidence-based guideline is another person’s cookbook. For some, a sound body of evidence is fundamental to sound medical decisions. After all, as Jack Wennberg and Dartmouth researchers have pointed out for decades, if the practice of medicine varies so widely from place to place in this country, everyone can’t be right. Yet for others, evidence connotes not just “cookie-cutter medicine,” it is only one step shy of a trip to the death panel. This heavy baggage influences the way evidence-based medicine is discussed from the doctor’s office to the clinic to Capitol Hill.

All of which is true, but incomplete. The problem is, of course, that a variety of nonscientific factors affect medical care, not only for each individual patient, whose values, other medical history, and desires differ from every other patient’s, but for patients in general. Yes, the practice of medicine does vary far more than it should from region to region, but there will never be an ironclad standard that reduces that variation to a very low level. This is all the more true given that for many medical conditions there are multiple science- and evidence-based treatments that are roughly equally effective, and these options often change as new scientific evidence comes in. Moreover, practices can change at different rates in different places depending on local circumstances. One example that I like to use is mastectomy rates for breast cancer. As hard as it is for those of us living in large urban areas to believe, there are huge swaths of this country where the nearest radiation oncology facility can be 50 or more miles away. Breast-conserving therapy (i.e., lumpectomy) requires radiation therapy to avoid an unacceptably high rate of recurrence in the region of the lumpectomy. If traveling to a radiation therapy facility every day for six or seven weeks (which is what typical radiation therapy regimens for breast cancer entail) is not feasible, then choosing a mastectomy becomes more likely. It’s an entirely rational decision, given that survival is the same either way.

None of this is to say that medicine shouldn’t be standardized as much as possible around the treatment modalities that have the highest degree of evidence; rather, it’s simply to say that there’s more than just science involved in the choice of medical care. That’s why we refer to the best medicine as “evidence-based” and “science-based” rather than trying to make medicine into a science. Science informs what we do and arguably should be its dominant determinant, but other issues can become quite important. That’s why this survey is interesting; it tells us what patients believe to be important. The results, not surprisingly, echo an old Queen song that I happen to like, “I Want It All,” as evidenced by this figure included in the report:

The figure was accompanied by this conclusion:

The research revealed the importance of seating more specific campaigns about medical evidence within the context of a clinical encounter that takes into account three vital—and equally important—elements: the expertise of the provider, the medical evidence, and the patient’s preferences (goals and concerns). These three aspects—which we depict as three separate but interlocking circles that, when combined, result in an informed medical decision—were posited to be the best framework for raising awareness about the role and importance of medical evidence for future communication and patient-engagement strategies.

And:

Key themes that emerged from the interviews and focus groups included that people want to be involved in treatment decisions, want their options to be clearly communicated, and expect the truth—the whole truth—about their diagnoses and treatments.

All of which is reasonable on the surface, but how do the pieces fit together?

The authors of the study surveyed 1,068 adults who had seen at least one health care provider in the last 12 months; most (88%) had seen a physician, and 68% were satisfied with their health care provider. Drilling down right to the “meat” of the survey, this is how the respondents rated the importance of the three main legs of the consent “stool”:

It’s good to learn that patients consider medical evidence to be important or very important and that they value it at least somewhat more than their own goals and expectations. Of course, what patients perceive as the “best evidence” and what doctors perceive as the best evidence are not necessarily the same thing; sometimes, as I like to put it, they might be related only by coincidence. This brings us to the second part of the study, namely what sorts of framing resonate the most with patients. The phrases that seemed to produce the most confidence in the people surveyed include (the percentage represents the level of confidence respondents expressed in each phrase):

  • What is proven to work best (79%)
  • The most up-to-date medical evidence, including information about the risks and benefits, about what works best (76%)
  • Best practices in the medical field (75%)
  • What medical science shows about each option’s benefits and risks (71%)
  • What the research shows (68%)
  • Guidelines developed by national medical experts about what works best (65%)

These are all useful phrases to keep in my back pocket when describing what scientific evidence supports as the standard of care. On the other hand, most physicians, particularly those practicing in academic settings, are wired to know and discuss the evidence with patients, at least in my experience, and they probably already use variants of these phrases when describing evidence. What would actually be more useful to me as a physician is knowing what sorts of phrases turn patients off from science-based treatments. I can probably guess to some extent. For instance, I doubt that adopting a paternalistic “doctor knows best” attitude would be well received by most patients. But when I’m trying to persuade patients to do what the evidence shows they should do, what would really help me is knowing how to avoid inadvertently making them less receptive to my message.

Steve Novella makes another point about this study with respect to “complementary and alternative medicine” (CAM):

Patients, for example, frequently ask me what I think about acupuncture, a particular supplement, or some other “alternative” treatment for their condition. I tell them – without waffling or watering down my opinion. I don’t editorialize or express judgmentalism because that is inappropriate in a therapeutic context, but I tell them my understanding of the published scientific evidence and the plausibility of the treatment, and then give them my bottom-line recommendation. I do this as if I were talking about any treatment option. Even when patients are starting from a very different opinion about the treatment, they appreciate the fact that I have taken the time to look into the research and to communicate that to them.

This is exactly why the “shruggie” approach to unconventional treatments is counterproductive and a huge disservice to patients. Saying, “I don’t know” when asked about an implausible and ineffective treatment, or even giving a dismissive response, is not going to be convincing to patients.

Indeed, I do think that this study suggests that the best way to deal with such inquiries is to be completely frank in as nonjudgmental a way as possible. I admit that there are times when I have failed at this, but in my defense I would counter that the patient was pushing me to tell her whether I thought that a certain “alternative” practitioner whom I’ve discussed on this blog before was a quack or not. Finally, I just blurted out that, yes, I think he is a quack. Not particularly “nonjudgmental,” but it was what the patient demanded. Be that as it may, my approach is similar to Steve’s in that I don’t pull any punches, but I try not to use the “q” word or any term or phrase that comes across as dismissive. Being “insolently” dismissive (respectfully or not) is fine on a blog meant for the education and entertainment of both my readers and myself, but it has no place in one-on-one doctor-patient encounters. Like Steve, I find that most patients who ask actually appreciate that I know about such things and can summarize the evidence.

Of course, the problem is that most doctors don’t know—and therefore can’t summarize—the data for various CAM modalities. For example, in my experience most physicians don’t even know what homeopathy actually is. Like most lay people, they tend to think it consists of just “herbal” or “natural” remedies and are often amazed when I explain the principles of homeopathy to them. The point is that if we are to communicate effectively to our patients what the best science-based therapy is, we need to know not just the evidence for what works but also the evidence for what doesn’t. More importantly, we need to be able to communicate that evidence.