Mar 15 2007

An Intercessory Prayer Hodge Podge

A newly published meta-analysis of 17 published studies looking at the efficacy of intercessory prayer claims that it demonstrates a positive effect. David R. Hodge, an assistant professor of social work in the College of Human Services at Arizona State University’s West campus, performed the meta-analysis and concluded, “Using this procedure, we find that prayer offered on behalf of another yields positive results.” The study is a good example of how not to use meta-analysis.

A meta-analysis is the process of combining the results of different studies and then performing statistical analysis on the combined data. The benefit of this procedure is that it affords greater statistical power – effects that are too small to detect in any single study can reach statistical significance when the data are pooled.
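To make that concrete, here is a minimal sketch of one common pooling method, fixed-effect inverse-variance weighting. The three studies and their numbers are invented for illustration (this is not Hodge's data or necessarily his method): no single study crosses the conventional significance threshold on its own, but the pooled estimate does.

```python
import math

def pool_fixed_effect(effects, std_errors):
    """Combine per-study effect estimates using inverse-variance weights."""
    weights = [1.0 / se ** 2 for se in std_errors]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))  # pooled SE shrinks as studies accumulate
    return pooled, pooled_se

# Three hypothetical small studies (standardized effect sizes and standard errors);
# each individual z-score (effect / SE) is below 1.96, i.e. not significant alone.
effects = [0.20, 0.15, 0.25]
std_errors = [0.15, 0.14, 0.16]

pooled, pooled_se = pool_fixed_effect(effects, std_errors)
print(round(pooled, 3), round(pooled / pooled_se, 2))  # pooled z exceeds 1.96
```

This is the legitimate appeal of meta-analysis: the combined sample behaves like one larger study with a smaller standard error.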

But meta-analysis is tricky business. First, it should be pointed out that it does not represent new data – it is just taking a fresh look at old data. It can be useful, but only when it is very carefully applied. For example, the studies that are lumped together should have very similar designs, should look at the same types of subjects, and should use similar outcome measures. The results of a meta-analysis are only meaningful if data from the different studies can be reasonably combined.

Also, a meta-analysis does nothing to address the quality of the studies being looked at. The old adage of “garbage in-garbage out” still applies. If you lump together 10 bad studies, you don’t get one good study, you get a useless meta-analysis. For these reasons, meta-analyses have a poor track record: their conclusions have been contradicted by subsequent definitive trials over a third of the time.

A meta-analysis is not always the best method for coming to an overall conclusion about an area of research. There are aspects of the pattern of results that are important to consider, and they are whitewashed in a meta-analysis. For example, what is the trend between study quality and size of the effect? If we see a real tendency for the better studies to have a smaller effect (which we do, in my opinion, in the intercessory prayer literature), that strongly suggests that the effect is not real. By combining these studies, however, these differences are erased. In effect, the good data are diluted by the bad.
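The dilution effect above can be shown with a back-of-the-envelope sketch. The numbers below are invented: suppose five small, lower-quality studies each report a benefit, while one large, rigorous trial finds no effect at all. Even with standard inverse-variance weighting, the pooled estimate can come out “significant” while the single best study sits squarely at zero.

```python
import math

def pool_fixed_effect(effects, std_errors):
    """Fixed-effect pooling with inverse-variance weights."""
    weights = [1.0 / se ** 2 for se in std_errors]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    return pooled, math.sqrt(1.0 / sum(weights))

# Five small, lower-quality studies, each reporting a benefit (effect, SE)...
small = [(0.50, 0.20)] * 5
# ...plus one large, well-designed trial finding no effect whatsoever.
large = [(0.00, 0.06)]

effects, ses = zip(*(small + large))
pooled, se = pool_fixed_effect(effects, ses)
print(round(pooled, 3), round(pooled / se, 2))  # pooled z-score is above 1.96
```

The pooled number obscures exactly the pattern that matters: the effect shrinks to nothing as study quality rises. Looking at the trend tells you more than the average does.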

We can further ask if the kinds of results are in agreement and make sense. There have been several studies of intercessory prayer in cardiac bypass patients, for example. The first few looked at multiple outcomes, such as length of stay in the cardiac critical care unit, complications, and overall health rating – with some but not others showing a benefit for prayer. However, the outcomes that were better in one study were different from those in another. So taken as a whole, the studies contradict and somewhat neutralize each other. A third, larger, better-designed, more definitive trial (the STEP trial) was done. This study was solidly negative. This is a good example of how clinical literature evolves – newer studies build on the lessons of older studies while trying to resolve enduring conflicts. Again, taken as a whole this progression of research shows no effect for intercessory prayer in cardiac bypass. It is completely inappropriate to combine the data in a meta-analysis and erase the more meaningful analysis that comes from seeing how the research evolves. Hodge, in fact, did not keep the separate outcome measures separate – he combined them into one total score for each study, eliminating the factor of consistency among studies.

Hodge also included at least one positive study that is extremely suspect – the Wirth in-vitro fertilization study. After publication it was revealed that researcher Daniel Wirth had previously been convicted of fraud, had used a pseudonym in the past to obtain a passport, and was the author of many prior studies claiming miraculous results. The study itself has been thoroughly discredited. This is a classic example of garbage in-garbage out. The inclusion of this data in the meta-analysis by itself invalidates the results.

As others have pointed out, the introduction of magic in general, and faith healing in particular, into modern scientific medicine should not be viewed as benign or harmless. It is often defended with the tired and lazy notion of “it couldn’t hurt, so why not give it a try.” But the introduction of supernatural forces into mainstream medicine can have an insidiously destructive effect, eroding the scientific and intellectual quality and integrity of the culture of health care.

This meta-analysis by Hodge is scientifically worthless and publicly misleading. I await the latest round of news stories saying that “A new study finds that prayer works,” even though it is not a new study (meaning there is no new data) and the entire approach has serious flaws. The later, more detailed analysis that is sure to come will be largely overlooked. Maybe I will be pleasantly surprised, but I doubt it. For those of us interested in scientific integrity, the best we can do is damage control.