Jul 03 2007
The interpretation of clinical trials is very complex, even for practicing physicians who have to base medical decisions on such evidence. For the public, who rely heavily on media-filtered information about clinical research, the task is almost impossible. The result is general confusion.
The story of Echinacea, and whether or not it is useful for the treatment of the common cold, is a good example of the complexities involved – especially when dealing with an unregulated product marketed directly to the consumer. Echinacea has been marketed for decades as a cold remedy. Early clinical trials showed mixed results, but proponents cherry-picked the positive studies to promote Echinacea to the public. Then a series of well-designed trials all showed no effect, and it seemed that Echinacea would be relegated forever to the fringes of CAM.
Now, a recent meta-analysis of 14 studies involving the use of Echinacea for the treatment of the common cold found that “Published evidence supports echinacea’s benefit in decreasing the incidence and duration of the common cold.” The press is promoting this as a new “study” that shows that Echinacea works.
The Echinacea story reflects the basic misrepresentation of clinical evidence to the public, and even to professionals. Here are some basic principles to keep in mind when evaluating such evidence.
Confusion of Basic and Clinical Research
Medical research is divided into basic science (test-tube and animal studies) and clinical research (studies on people). Often the purveyors of drugs, supplements, and all variety of medical devices and products will make clinical claims based upon basic science evidence. This is disingenuous and misleading.
For example, with Echinacea, there are studies showing that certain chemicals in certain preparations of certain parts of the Echinacea plant have effects on certain parts of the immune system, like T-cells and phagocytes. Echinacea promoters then extrapolate from these test-tube effects to the clinical claim that Echinacea “boosts immune system function” and then further from that to the claim that it helps the common cold.
But serious medical researchers know this to be utter folly. It is difficult, if not impossible, to extrapolate from chemical or cellular reactions in a test tube to a net medical effect on an entire organism. What we think should happen based upon basic science information rarely predicts what actually happens in people. Basic science should be used to guide clinical research, but it cannot replace, and should never trump, clinical research.
So rule #1 is to beware clinical claims based upon basic science alone.
Reliance on individual studies
Medical research is a complex, evolving beast, and to truly understand whether a treatment works one must look not only at plausibility but at all the clinical research to see what the overall pattern is. Individual studies are quirky, often biased or flawed, and never tell the whole story. It takes time – years – for a research question to work itself out to a reasonable degree of reliability.
Rule #2 – beware preliminary research, or conclusions based upon 1 or a few studies.
Confusing different types of “studies”
Often, the media will report anything published as a “study,” as they did in this case with the recent meta-analysis of Echinacea, as if it is new data or data that bears directly on efficacy. But a study could be a survey or it could be just a reanalysis of old data – as in this case. The meta-analysis did not present any new evidence for Echinacea. Calling it a “study” is misleading.
Rule #3 – Know what kind of study is being referred to.
A meta-analysis takes several previous studies and then analyzes the data lumped together. It is a very problematic and complex analysis to do, and is often misused. The greatest weakness of a meta-analysis is that it lumps together different studies with different methods and outcome measures. There is also great risk of selection bias in which studies to analyze.
A meta-analysis misses what I think is a critical aspect of an evolving research literature on a question – how does the quality of each study relate to the size of an effect, if any? Also, do the various results present a consistent picture, or do they contradict each other?
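The lumping a meta-analysis performs can be illustrated with a toy fixed-effect, inverse-variance pooling calculation. All the numbers below are invented for illustration – they are not drawn from the Echinacea literature – and this is only a minimal sketch of one common pooling method, not a full meta-analytic procedure:

```python
import math

# Hypothetical (effect_size, standard_error) pairs for five trials.
# Negative effects favor the treatment; the values are made up.
studies = [(-0.40, 0.15), (-0.35, 0.20), (0.05, 0.10),
           (0.02, 0.12), (-0.50, 0.30)]

def pooled_effect(studies):
    """Fixed-effect pooled estimate: each study is weighted by the
    inverse of its variance, so precise studies count for more."""
    weights = [1.0 / se**2 for _, se in studies]
    total_w = sum(weights)
    mean = sum(w * es for w, (es, _) in zip(weights, studies)) / total_w
    se = math.sqrt(1.0 / total_w)
    return mean, se

def cochran_q(studies):
    """Cochran's Q statistic: a standard check for heterogeneity,
    i.e. whether the studies disagree more than chance would allow."""
    mean, _ = pooled_effect(studies)
    return sum((es - mean)**2 / se**2 for es, se in studies)

mean, se = pooled_effect(studies)
print(f"pooled effect = {mean:.3f} +/- {se:.3f}")
print(f"Cochran's Q   = {cochran_q(studies):.1f} (df = {len(studies) - 1})")
```

Note how the single pooled number hides the fact that the hypothetical trials point in opposite directions; a large Q relative to its degrees of freedom flags exactly the inconsistency-masking problem described above.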
For a large and complex literature, a systematic review is a much better tool for analyzing the research. Systematic reviews look at all studies on a question and look to see what the overall pattern is – what do the best studies show, how consistent are they, what was the effect on outcome of fixing criticisms of the early research, etc.
Rule #4 – Beware the meta-analysis
Anecdotes and Testimonies
Do I even need to go into this again?
Rule #5 – Beware anecdotes and testimonies
Back to Echinacea
With all this in mind, let’s take another look at the Echinacea question. The plausibility of Echinacea is low but not zero. Unlike homeopathy, herbs contain actual chemicals that can have a pharmacological effect in the body. Herbs are drugs. But the prior probability of any single remedy effectively treating the common cold is low. The cold is caused by a large number of viruses that are constantly evolving. There is no silver bullet that makes the immune system function better than it evolved to function against viruses, at least not that anyone has credibly found so far.
The basic science findings of Echinacea are not very compelling. They are the types of non-specific changes that can be found with hundreds of compounds, and basically just mean that the immune system is reacting to a foreign agent.
So the basic science is unconvincing and the prior probability is low but not zero. What about the clinical evidence?
I think this Cochrane Review is a much better assessment of the evidence than any meta-analysis. It concludes that the data is inconsistent and there is insufficient evidence to conclude that Echinacea works for the common cold.
I do not think that this current meta-analysis adds anything to the body of evidence on Echinacea, but it will serve to confuse the public on the issue. The herbalist community has happily accepted the results and will contribute to the confusion.
Interestingly, on one herbal site I found a cogent description of the complexity of analyzing herbal remedies – and why I think they are dirty drugs that are not very useful. The site was using this, however, as a way of dismissing negative evidence, but the same arguments could be used to dismiss positive studies, and this meta-analysis.
“The assessment of echinacea’s effectiveness is complicated for several reasons: (1) 3 different species (Echinacea angustifolia, E. purpurea, and E. pallida) are used medicinally; (2) different parts of the plant (root, herb, flower, or whole plant) are used in various preparations; (3) various preparation methods are used in production (dried herb material, extraction, fresh-pressed juice, etc.); (4) some echinacea preparations contain combinations of species, plant parts and/or types of preparation methods; and (5) some echinacea-based products also contain other plant extracts or (in other countries) homeopathic components.”
Well, I don’t think the addition of homeopathic components makes any difference, because homeopathy is just water, but the other points are valid. The real question is, how is this information used? Promoters use these points to say that any study showing that Echinacea does not work simply studied the wrong species, or the wrong preparation of the wrong part of the plant. Such excuses can be endless.
This is precisely why herbal remedies are so problematic. We should be isolating the chemicals, studying their effects, purifying them, and then giving the purified substances in known doses to test subjects in clinical trials to assess safety and efficacy. Otherwise we are stuck on the endless herbal merry-go-round.
My assessment of all the evidence is that Echinacea probably does not work for the common cold, or for anything else for which it is currently claimed. It is quite possible, however, that it contains substances that, if isolated and purified, might have interesting and useful pharmacological applications. CAM proponents have been very successful, however, in using politics and propaganda to generate a tremendous amount of confusion and nonsense, the result of which is a great deal of wasted research effort asking the wrong questions or performing useless studies that do not have the potential to settle the debate.