Feb 14 2011

Reporting Preliminary Studies

A recent study, presented as a poster at the American Stroke Association International Stroke Conference, found a 61% increase in risk of stroke and cardiovascular disease among survey respondents who reported drinking diet soda compared to those who drank no soda. The study triggered a round of media reporting, and in turn I have received many questions about it.

Frequent readers of this blog should have no problem seeing the potential flaws in such a study. First – it is an observational study based upon self-reporting. At best such a study could show correlation, but by itself cannot build a convincing case for causation. Perhaps people who are at higher risk of cardiovascular disease and stroke, for whatever reason, are more likely to choose diet sodas because they are trying to avoid unnecessary calories. Questions that should immediately come to mind – what factors were controlled for and how was the information gathered? According to an ABC report:

The researchers used data obtained through the multi-ethnic, population-based Northern Manhattan Study to examine risk factors for stroke, heart attack and other vascular events such as blood clots in the limbs. While 901 participants reported drinking no soda at the start of the study, 163 said they drank one or more diet sodas per day.

The study also controlled for “smoking, physical activity, alcohol consumption and calories consumed per day.”

Some obvious factors were controlled for, but others were not. For example, there did not appear to be any control for BMI or any measure of body fat percentage. This is the most likely confounding factor – overweight people are more likely both to drink diet soda and to have vascular disease. They did later account for metabolic syndrome, but not for weight as an independent variable, nor for other eating habits. People who drink diet soda may also be doing so to offset otherwise less healthy eating habits. This in itself makes it impossible to interpret the study with any confidence.
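The confounding worry can be made concrete with a toy simulation. In the sketch below, all rates are invented for illustration and are not taken from the study: weight drives both diet-soda drinking and vascular disease, while diet soda itself has no causal effect. The crude comparison nonetheless shows diet-soda drinkers at markedly higher risk, and stratifying by the confounder makes the apparent "effect" largely vanish.

```python
import random

def simulate_cohort(n=50_000, seed=2):
    """Toy cohort in which weight drives both diet-soda use and disease;
    diet soda itself has no causal effect. All rates are invented."""
    rng = random.Random(seed)
    cohort = []
    for _ in range(n):
        overweight = rng.random() < 0.30
        # Overweight subjects are assumed more likely to choose diet soda...
        diet_soda = rng.random() < (0.60 if overweight else 0.20)
        # ...and, independently of soda intake, more likely to have disease.
        disease = rng.random() < (0.20 if overweight else 0.05)
        cohort.append((overweight, diet_soda, disease))
    return cohort

def disease_rate(cohort, *, soda=None, overweight=None):
    """Disease rate among subjects matching the given filters."""
    rows = [d for ow, s, d in cohort
            if (soda is None or s == soda)
            and (overweight is None or ow == overweight)]
    return sum(rows) / len(rows)

if __name__ == "__main__":
    cohort = simulate_cohort()
    # Crude comparison: diet-soda drinkers look like they're at higher risk,
    # even though soda does nothing in this model.
    print("diet soda:", round(disease_rate(cohort, soda=True), 3))
    print("no soda:  ", round(disease_rate(cohort, soda=False), 3))
    # Stratify by the confounder and the "effect" largely disappears.
    print("overweight stratum, soda vs no soda:",
          round(disease_rate(cohort, soda=True, overweight=True), 3),
          round(disease_rate(cohort, soda=False, overweight=True), 3))
```

This is exactly why an observational study that fails to control for weight cannot distinguish "diet soda causes vascular disease" from "weight causes both."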

Further, we have a one-time self-report, rather than reports of soda intake at several points in time. The data itself is not very reliable.

While this study has serious flaws that preclude any confident interpretation, it is a reasonable preliminary study – the kind of study that gets presented as a poster at a meeting, rather than published in a high-impact peer-reviewed journal. Such preliminary research is mostly an exercise in data dredging – combing through data sets for any interesting signals. The purpose of such preliminary research is to determine whether or not more definitive follow-up research is worth the time and effort. If there were no signal in this data, then there would be no reason to design and execute a tightly controlled, multi-year prospective trial.
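The data-dredging point can also be illustrated with a small simulation. The sketch below (all names and rates are invented, not drawn from the study) tests many random, causally unrelated "exposures" against an equally random outcome using an ordinary two-proportion z-test; at the conventional p < 0.05 threshold, roughly one in twenty unrelated factors will look "significant" by chance alone.

```python
import math
import random

def two_proportion_z(hits_a, n_a, hits_b, n_b):
    """Pooled z statistic for the difference between two proportions."""
    p = (hits_a + hits_b) / (n_a + n_b)
    se = math.sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))
    return 0.0 if se == 0 else (hits_a / n_a - hits_b / n_b) / se

def count_false_positives(n_subjects=1000, n_factors=200,
                          z_crit=1.96, seed=1):
    """Test many random 'exposures' against a random outcome and count
    how many clear the usual p < 0.05 bar purely by chance."""
    rng = random.Random(seed)
    # The outcome (say, "vascular event") is pure noise: ~10% base rate,
    # unrelated to every factor we will test against it.
    outcome = [rng.random() < 0.10 for _ in range(n_subjects)]
    total_hits = sum(outcome)
    false_positives = 0
    for _ in range(n_factors):
        exposed = [rng.random() < 0.50 for _ in range(n_subjects)]
        n_e = sum(exposed)
        n_u = n_subjects - n_e
        hits_e = sum(o for o, e in zip(outcome, exposed) if e)
        hits_u = total_hits - hits_e
        if abs(two_proportion_z(hits_e, n_e, hits_u, n_u)) > z_crit:
            false_positives += 1
    return false_positives

if __name__ == "__main__":
    fp = count_false_positives()
    print(f"{fp} of 200 unrelated factors look 'significant'")
```

With 200 comparisons one expects about ten spurious "significant" associations, which is why a single interesting signal pulled from a large survey data set warrants a confirmatory trial, not a headline.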

Medical science is full of these preliminary studies. They provide the raw material from which large and expensive trials are derived. We also know from reviewing the literature that most of these preliminary studies will turn out to be wrong. The scientific community understands this.

The problem is in the reporting of these studies. The mainstream media probably should just ignore any study that is deemed preliminary, especially if it’s just an isolated study. Perhaps in a thorough feature article it would be reasonable to give an overview of the state of the research into a question, including preliminary studies, because in a feature time can be taken to put the evidence into perspective. But reporting a single preliminary study as science news is a highly problematic approach.

On this item there was a range of reporting, from fear-mongering to reasonable. The ABC report, for example, was very reasonable and included appropriate background information and balanced quotes from critics of the study. But many people reading the report will come away with just the headline: “Diet Soda: Fewer Calories, Greater Stroke Risk?” (other headlines did not even include the question mark). Even readers who understand that the conclusions are preliminary and that many experts are skeptical are likely, three months from now, to remember only the association between diet soda and stroke risk, and not the fact that the association is probably not real.

Over-reporting of preliminary results also has the effect of confusing the public with lots of noisy information, most of which is not true. This causes people to distrust science in general, because they keep hearing conflicting information.

It is unlikely that the mainstream media will voluntarily forgo the reporting of sensationalistic news just because the information is preliminary and unreliable. It is too easy for them to convince themselves that including a bit of skepticism (or even well-balanced skepticism) is sufficient. While this is better than rank fear-mongering (which also happens), in the end the reporters still get their flashy headline and the public comes away with misconceptions.

While I will continue to advocate for higher standards of science news reporting (including using judgment in terms of what not to report), it seems this needs to be combined with educating the public about the nature of preliminary research, the difference between observational and experimental studies, and the need to view all science news through an informed, skeptical filter.


3 Responses to “Reporting Preliminary Studies”

  1. Diane on 14 Feb 2011 at 10:22 pm

    It’s also odd that they compared no soda to diet soda and said not a word about regular soda. That data must have been in their data set (because why would a survey ask about soda consumption and not include an option for regular soda?) and they must have looked at it (because why not?). The fact that they did not report anything suggests that they didn’t find anything, which makes the significant association they did find look even more like the chance result of data dredging.

  2. eiskrystal on 15 Feb 2011 at 3:59 am

    This would seem to suggest that scientists sticking up poor studies in public should know better.

  3. petrossa on 15 Feb 2011 at 11:30 am

    Reminds me of the red-meat-causes-cancer scare story. However, in that case it was science itself at fault; here it's more a case of bad journalism looking for a headline.

    But if one goes and compiles a list of such ‘studies’ one had better create a wiki entry – you’ll need the space.

    What strikes me, though, is that one sees these studies much less frequently from the exact sciences. Either they are real scientists or the ‘studies’ aren’t juicy enough.

    “Time shown to be a deciding factor in erosion!” Mm, no not juicy.
