May 29 2008

Bad Medical Journalism

Published under Uncategorized
Comments: 10

I often rant against the generally poor quality of science journalism. There are a few bright spots every now and then (like the recent article on vaccines in Time magazine) but overall reading science news in the mainstream media is a depressing and frustrating affair. At least it gives me material for this blog, which then serves as a means for (albeit modest) damage control.

Also, being a skeptic, I have to question my own observations. Perhaps I am falling victim to confirmation bias – noticing the bad science journalism that confirms my biases while missing good science journalism, or dismissing it as “the exception.” I assume that this is happening to some degree, and so I try to keep an open mind about the media. Of course, some objective, systematic information would help me assess the accuracy of my subjective perceptions.

Well, now I have some.

Gary Schwitzer from the University of Minnesota School of Journalism and Mass Communication runs a website called Health News Review, the purpose of which is to rate medical news stories in the mainstream media. He has now published an analysis of 500 medical news stories over the past two years. He found that:

In our evaluation of 500 US health news stories over 22 months, between 62%–77% of stories failed to adequately address costs, harms, benefits, the quality of the evidence, and the existence of other options when covering health care products and procedures. This high rate of inadequate reporting raises important questions about the quality of the information US consumers receive from the news media on these health news topics.

About two-thirds of all medical news stories (and these are from the top news outlets) had major flaws. In my opinion this establishes that there is a systematic problem with the quality of medical news journalism in the US. Other countries have similar problems, and I suspect that similarly poor quality is a problem across science news reporting and not isolated to medical news.

Here are the criteria that Schwitzer uses to assess news stories (I just give the categories here; you can read the full descriptions on his website):

Criterion #1 The availability of the treatment/test/product/procedure

Criterion #2 Whether/how costs are mentioned in the story

Criterion #3 If there is evidence of disease mongering in the story

Criterion #4 Does the story seem to grasp the quality of the evidence?

Criterion #5 How harms of the treatment/test/product/procedure are covered in the story

Criterion #6 Does the story establish the true novelty of the approach?

Criterion #7 How the benefits of the treatment/test/product/procedure are framed

Criterion #8 Whether the story appeared to rely solely or largely on a news release

Criterion #9 Is there an independent source, and were any possible conflicts of interest of sources disclosed in the article?

Criterion #10 Whether alternative treatment/test/product/procedure options are mentioned

He compiles the results of this analysis into a 1-5 star rating. Reading through a number of his analyses of specific articles, he seems to do a thorough job, and his criteria work well. I do have some constructive criticism, however. He has apparently fallen into the “evidence-based medicine” trap of failing to adequately consider prior probability or scientific plausibility. For example, his review of an article discussing a recent acupuncture study does criticize the article for not putting the data into more context (which is good), but his discussion of context does not specifically address plausibility. My reading of this study leads to a different bottom-line conclusion from that of the article being reviewed, and I would have rated the article lower for failing to put the study into proper scientific context.

In fact, I would add evaluation of scientific context/plausibility as a separate criterion worthy of its own rating and discussion.

Schwitzer also discusses what he thinks needs to be done to improve the situation – which is great because it is always a good idea to not just complain but to also offer constructive advice for improvement. He writes:

– Reporters and writers have been receptive to the feedback; editors and managers must be reached if change is to occur.
– Time (to research stories), space (in publications and broadcasts), and training of journalists can provide solutions to many of the journalistic shortcomings identified by the project.

I completely agree. Training is perhaps the most important thing – and right now we seem to be going in the opposite direction, toward journalists with less scientific/medical training.

My compliments to Gary Schwitzer for identifying the problem, for his efforts at Health News Review, and for bringing this problem to public attention. We definitely have common cause when it comes to the quality of science reporting. But if current trends continue, I should have no worries in the future about finding topics for this blog.

10 thoughts on “Bad Medical Journalism”

  1. Blair T says:
    I wonder if you think that science and medicine reporting is different from reporting of other news stories. Is it that medical news stories are treated differently than other news, or that they require more care in reporting than other news?

    As an aside – I have found that the Economist magazine has the most consistently good reporting around. Highly recommended.

  2. Blake Stacey says:

    Now, we just need to find somebody who’s willing to pay for a similar survey about non-medical science journalism!

  3. I think science journalism is very challenging and requires specialized knowledge. The problem appears to be primarily with generalists thinking they can tackle science news stories when they can’t (or being forced to by editors). Sometimes science news stories are treated as fluff.

    One classic mistake that I find very common is that naive reporters confuse the authority of a single scientist with the authority of the scientific community.

  4. Very interesting article! Sloppy journalism is found in all media beats lately (I haven’t been called by a fact-checker since 1999!). But in medical and science news, it can be life-threatening!

    I blogged about this from the PR perspective, with return links back to NeuroLogica.

    In fact, as you’ll see in my post, I have felt for quite a while that bloggers are going to become more associated with “old school” journalism than major market media.

    Thanks for a great read!
    Jennifer A. Jones (SpeakMediaBlog)

  5. Sastra says:

    Criterion #10 Whether alternative treatment/test/product/procedure options are mentioned

    You know, this can be read two different ways. I see from looking at the links (and noting your lack of reaction) that my initial interpretation was off.

  6. daedalus2u says:

    The most irritating thing for me in health and science articles in the general press is the lack of a link back to the actual journal article. Then I have to go hunting around to find it.

    If the journalist has spent the time to talk to the authors and obviously (I think?) has at least looked at the paper, to not have a link to it is not something I understand.

  7. superdave says:

    I wonder if they will single out individual writers.

  8. Yes, it is clear that “alternative options” means just that, not CAM.
