I remain indebted to peer review. Sure, I’ve been called a dilettante. Had ideas dismissed as half-baked. Had the floor swept with me over the derivative nature of my work. Been chastised for treating data as singular. And then there are the self-inflicted wounds of my own careless errors. But having suffered from what appears only at first glance to be the slings and arrows of outrageous peer review, I stand by this process.
I will defend a career’s worth of the anonymous and thankless work of reviewers who have provided the concerted kind of attention that I undoubtedly needed. It has made me, such as I am, a scholar of the peer-reviewed article and book. Each kick I carry aids my own revising, as I vow that never again will they catch me out – okay, as it later turns out, maybe once more. But then, no more.
Now, some think peer review antiquated gatekeeping, arbitrarily or randomly assigned to two or more (overeager) gatekeepers. Some want to drag it out of the closet and thrust it into the daylight of crowd-sourced review, making it open and public.
Peer review was put to the test recently, when John Bohannon conducted a self-described “sting” operation targeting a set of 300 open access journals, including a good number of those already suspected of being in it only for the money. The sting caught out somewhat more than half of them, which failed to conduct a review, or at least any review that detected the fatal flaws built into the submitted sting paper. I don’t want to rehash the argument or the wave of critiques it provoked – Peter Suber’s review being among the best I encountered – in its own excellent set of post-facto and open peer reviews.
Instead, I want to say, as an advocate of open access and one who has worked for a decade and a half on open source journal publishing software that thousands of journals use, some unscrupulously, that I want every research paper to have the benefit of access to the literature and the privilege of being carefully reviewed. Let the right to know be met by the right to be read seriously.
To that end, I’d like someone to come up with a better idea for guiding researchers to open access journals that provide helpful, substantive reviews. That is, I don’t think the best idea is a self-serving sting operation run by a subscription journal, intended to expose the unscrupulous and call into question people’s right to knowledge through the model known as open access (Bohannon’s tag line: “A spoof paper concocted by Science reveals little or no scrutiny at many open-access journals”).
So as an incentive for coming up with a better method of driving out the bad, and of protecting colleagues who may be new to publishing or working in institutions where it is hard to garner advice beyond the unhelpful publish-in-the-top-journals, I offer an idea of my own, ready to be bested.
Someone, perhaps the Directory of Open Access Journals, or Jeffrey Beall, or myself, should set up a virtual space for authors to report, from first-hand experience, on a journal’s peer-reviewing practices. This would be a site where authors can identify journals and publishers that, on submission of an article, provided no evidence that a substantive peer review was conducted, at which point the authors withdrew the article (as advised by the site, which would provide email templates for (a) asking for proof – show me the reviews – and (b) withdrawing an article, with a warning against posting it). Perhaps authors would also be able to include information on the academic credentials of the editors and editorial boards, the transparency of the pricing, and other matters. Checks would need to be put in place to authenticate the researchers and the submission of the article. And there would need to be an appeal process for the journals.
This would provide an incentive for editors to conduct better reviews, and to develop better processes for sharing those reviews with authors. It would provide an incentive for authors not to accept an easy, review-less acceptance from a journal (knowing others will identify it as review-less). Peer review is not simply about gatekeeping, nor is it ever a guarantee of quality. It is, however, the very best method we have for improving the quality of the articles by which we live and learn.