Deep Impact: Unintended consequences of journal rank; is using journal rank as an assessment tool bad scientific practice?

See on Scoop.it: Dual impact of research; towards the impactelligent university


Most researchers acknowledge an intrinsic hierarchy in the scholarly journals (‘journal rank’) that they submit their work to, and adjust not only their submission but also their reading strategies accordingly. On the other hand, much has been written about the negative effects of institutionalizing journal rank as an impact measure. So far, contributions to the debate concerning the limitations of journal rank as a scientific impact assessment tool have either lacked data, or relied on only a few studies. In this review, we present the most recent and pertinent data on the consequences of our current scholarly communication system with respect to various measures of scientific quality (such as utility/citations, methodological soundness, expert ratings or retractions). These data corroborate previous hypotheses: using journal rank as an assessment tool is bad scientific practice. Moreover, the data lead us to argue that any journal rank (not only the currently-favored Impact Factor) would have this negative impact. Therefore, we suggest that abandoning journals altogether, in favor of a library-based scholarly communication system, will ultimately be necessary. This new system will use modern information technology to vastly improve the filter, sort and discovery functions of the current journal system.
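For context, the "currently-favored Impact Factor" the authors critique is a simple ratio: citations received in a given year to a journal's items from the two preceding years, divided by the number of citable items published in those two years. A minimal sketch of that calculation, using hypothetical counts purely for illustration:

```python
def impact_factor(citations_to_prev_two_years: int,
                  citable_items_prev_two_years: int) -> float:
    """Two-year Journal Impact Factor:
    citations in year Y to items published in Y-1 and Y-2,
    divided by citable items published in Y-1 and Y-2."""
    return citations_to_prev_two_years / citable_items_prev_two_years

# Hypothetical example: 2450 citations to 700 citable items.
print(impact_factor(2450, 700))  # 3.5
```

That the entire "rank" of a journal reduces to one such average, heavily skewed by a few highly cited papers, is part of the authors' argument against using it to assess individual work.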


The authors write:

“We would favor bringing scholarly communication back to the research institutions in an archival publication system in which both software, raw data and their text descriptions are archived and made accessible, after peer-review and with scientifically-tested metrics accruing reputation in a constantly improving reputation system (Eve, 2012). This reputation system would be subjected to the same standards of scientific scrutiny as are commonly applied to all scientific matters and evolve to minimize gaming and maximize the alignment of researchers’ interests with those of science (which are currently misaligned (Nosek et al., 2012)). Only an elaborate ecosystem of a multitude of metrics can provide the flexibility to capitalize on the small fraction of the multi-faceted scientific output that is actually quantifiable. Such an ecosystem would evolve such that the only evolutionary stable strategy is to try and do the best science one can.”



Citation: Brembs B, Button K and Munafò M (2013). Deep Impact: Unintended consequences of journal rank. Front. Hum. Neurosci. 7:291. doi: 10.3389/fnhum.2013.00291

Received: 25 Jan 2013; Accepted: 03 Jun 2013.



Keywords: Impact Factor, Journal Ranking, Statistics as Topic, misconduct, Fraud, Publishing, Open access, scholarly communication, Libraries, Library Services



About Wilfred Mijnhardt
RMIMR is my virtual playground, a place to reflect on issues from my professional context and my job as Policy Director at Rotterdam School of Management, Erasmus University (RSM). RSM is the international university-based business school at Erasmus University Rotterdam.

