Scientific peer review as it really is

This video, sent to me by a colleague, is a true depiction of the process of peer review as experienced by many in the LIS community. One is tempted to name names, but you can easily do that for yourself.

Of course, the seriousness of peer review and the pitfalls in many implementations of it should not be underestimated. Some of the ‘reviews’ I have received over the years from so-called top journals would give any impartial observer pause. I don’t mean rejections; I mean one-liner acceptances, admitted reviewer ignorance of methods, and in one case a paper that came back with a line from a reviewer saying, basically, “I have no idea how these statistical tests work but the conclusions seem justifiable”. Is it any wonder there is public cynicism about science? Reviewing takes real effort, though the process is largely thankless. The automated systems used by many publishers mean that as soon as you submit a review, you open yourself up to further requests for reviews, thereby creating the perfect disincentive to productivity. It has become fashionable to question the value of double-blind peer review, but one cannot divorce that discussion from a more systematic analysis of the whole process.

And as I write this, I have just received the results of a survey by Elsevier on authors’ perceptions of what is important when submitting papers for review. Most important: speed of review. Next: quality of review. The trade-off continues.

One Reply to “Scientific peer review as it really is”

  1. Thanks for putting up the video — not something we can do on a corporate blog!

     The topic of review — for journals and for conferences — is something that seems ripe for reassessment. One way that the system might be improved is to provide incentives for reviewers to write quality reviews, as I suggest in my post Reviewing the reviewers.
