Oct 07 2013

A Problem with Open Access Journals

In a way the internet is a grand bargain, although one that simply emerged without a conscious decision on the part of anyone. It greatly increases communication, lowers the bar for content creation and distribution, and allows open access to vast and deep databases of information. On the other hand, the traditional barriers of quality control are reduced or even eliminated, leading to a “wild west” of information. As a result it is already a cliche to characterize our current times as the “age of misinformation.”

As a skeptic active on social media, I feel the cut of both edges quite deeply. With podcasts, blogs, YouTube videos, and other content, I can create a network of content creation and distribution that can compete with any big media outlet. I can use these outlets to correct misinformation, analyse claims, engage in debates, and debunk fraud and myths.

On the other hand, the fraud, myths, and misinformation are multiplying at frightening rates on the very same platforms. It is difficult to gauge the net effect – perhaps that’s a topic for another post.

For this post I will discuss one of the most disturbing trends emerging from the internet phenomenon – the proliferation of poor quality science journals, specifically open access journals. The extent of this problem was recently highlighted by a “sting” operation published by Science magazine.

According to the Directory of Open Access Journals (DOAJ):

We define open access journals as journals that use a funding model that does not charge readers or their institutions for access. From the BOAI definition of “open access”, we support the rights of users to “read, download, copy, distribute, print, search, or link to the full texts of these articles” as mandatory for a journal to be included in the directory.

This is great, and open access has many supporters, including me. But all new “funding models” have the potential of creating perverse incentives. With the traditional model of print publishing, money was made through advertising and subscription fees. Subscriptions are driven by quality and impact factor, creating an incentive for high quality peer review and overall quality.

Open access journals frequently make their money by charging a publication fee of the author. This creates an incentive to publish a lot of papers of any quality. In fact, if you could create a shell of a journal, with little staff, and publish many papers online with little cost, that could generate a nice revenue stream. Why not create hundreds of such journals, covering every niche scientific and academic area?

This, of course, is what has happened. We are still in the middle of the explosion of open access journals. At their worst they have been dubbed “predatory” journals for charging hidden fees, exploiting naive academics, and essentially being scams.

John Bohannon decided to run a sting operation to test the peer-review quality of open access journals. I encourage you to read his entire report, but here’s the summary.

He identified 304 open access journals that publish in English. He created a fake scientific paper with blatant fatal flaws that rendered the research uninterpretable and the paper unpublishable. He actually created 304 versions of this paper by simply inserting different variables into the same text, but keeping the science and the data the same. He then submitted a version of the paper to all 304 journals under different fake names from different fake universities (using African names to make it seem plausible that they were obscure).
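The variant-generation step described above amounts to simple template substitution: the flawed science and data stay identical while surface details are swapped. A minimal sketch of the idea (all names, values, and the template here are invented for illustration, not taken from Bohannon's actual materials):

```python
# Hypothetical sketch of generating many submissions from one flawed paper.
# The underlying claim stays fixed; only interchangeable variables change.
from itertools import product

# Invented template loosely echoing the sting's premise (a lichen-derived
# molecule inhibiting cancer cell growth); not the real text.
TEMPLATE = (
    "Molecule {molecule}, isolated from the lichen {lichen}, "
    "inhibits the growth of {cell_line} cancer cells. "
    "Submitted by {author} ({institution})."
)

# Invented example values; the real sting varied such terms systematically.
molecules = ["X-12", "Y-7"]
lichens = ["Parmelia sp.", "Usnea sp."]
cell_lines = ["HeLa", "A549"]

def make_variants(molecules, lichens, cell_lines):
    """Yield one abstract per combination of swapped-in variables."""
    for i, (m, l, c) in enumerate(product(molecules, lichens, cell_lines)):
        yield TEMPLATE.format(
            molecule=m, lichen=l, cell_line=c,
            author=f"Author {i}", institution=f"University {i}",
        )

# 2 * 2 * 2 = 8 distinct submissions from a single underlying paper;
# scale the variable lists up and hundreds of versions fall out for free.
variants = list(make_variants(molecules, lichens, cell_lines))
```

The point of the sketch is how cheap this is: once the template exists, producing 304 superficially different submissions costs essentially nothing, which is exactly what makes such a sting feasible.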

The result? – over half of the papers were accepted for publication. I think it’s fair to say that any journal that accepted such a paper for publication is fatally flawed and should be considered a bogus journal.

This, of course, is a huge problem. Such journals allow for the flooding of the peer-reviewed literature with poor quality papers that should never be published. This is happening at a time when academia itself is being infiltrated with “alternative” proponents and post-modernist concepts that are anathema to objective standards.

Combine this with the erosion of quality control in science journalism, also thanks to the internet. Much of what passes as science reporting is simply cutting and pasting press releases from journals, including poor-quality open access journals hoping for a little free advertising.

At least this creates plenty of work to keep skeptics busy.

What this means for everyone is that you should be highly wary of any published study, especially if it comes from an obscure journal. The problem highlighted by this sting is not unique to open-access journals. There are plenty of “throw-away” print journals as well. And even high impact print journals may be seduced into publishing a sexy article with dubious research. Michael Eisen reminds us about the arsenic DNA paper that Science itself published a few years ago.

Definitely you should look closely at the journal in which a paper is published. But also, do not accept the findings of any single paper. Reliable scientific results only emerge following replication and the building of consensus.

Perhaps the Science paper will serve as a sort-of Flexner report for open access journals. In 1910 the Flexner report exposed highly variable quality among US medical schools, resulting in more than half of them shutting down, and much tighter quality control on those that remained open. The Flexner report is often credited with bringing US medical education into the scientific era.

In order to tame the wild west, we need clearing houses that provide careful review and their stamp of approval for quality control. The DOAJ tries to do this, stating:

For a journal to be included it should exercise quality control on submitted papers through an editor, editorial board and/or a peer-review system.

Clearly such review needs to be more robust. The integrity of the published literature is a vital resource of human civilization. As we learn to deal with the consequences of open access, intended and unintended, we need to develop new institutions of quality control and science-based standards.

 
