Nine Criteria for a Measure of Scientific Output
Open Access
- 1 January 2011
- journal article
- review article
- Published by Frontiers Media SA in Frontiers in Computational Neuroscience
- Vol. 5, 12520
- https://doi.org/10.3389/fncom.2011.00048
Abstract
Scientific research produces new knowledge, technologies and clinical treatments that can lead to enormous returns. Often, the path from basic research to new paradigms and direct impact on society takes time. Precise quantification of scientific output in the short term is not an easy task but is critical for evaluating scientists, laboratories, departments and institutions. We argue, with others, that current methods are not ideal and suffer from solvable difficulties. Here we propose criteria for a metric to be considered a good index of scientific output. Specifically, we argue that such an index should be quantitative, based on robust data, rapidly updated and retrospective, presented with confidence intervals, normalized by number of contributors, career stage and discipline, impractical to manipulate, and focused on quality over quantity. It should be validated through computational and empirical testing. Given its influence on the efficiency of scientific research, we have a duty to reflect upon and implement novel and rigorous ways of evaluating scientific output. The criteria proposed here provide initial steps towards the systematic development and validation of a metric to evaluate scientific output.
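Two of the criteria above, reporting a metric with confidence intervals and normalizing by number of contributors, can be illustrated with a minimal sketch. The paper does not prescribe a specific formula; the fractional-credit scheme, the toy citation data, and the percentile bootstrap below are illustrative assumptions only.

```python
import random
import statistics

def normalized_citations(citations, n_authors):
    """Fractionally credit each paper's citations across its co-authors
    (one illustrative way to normalize by number of contributors)."""
    return [c / a for c, a in zip(citations, n_authors)]

def bootstrap_ci(values, n_boot=2000, alpha=0.05, seed=0):
    """Percentile-bootstrap confidence interval for the mean,
    so the metric can be reported with uncertainty attached."""
    rng = random.Random(seed)
    means = sorted(
        statistics.mean(rng.choices(values, k=len(values)))
        for _ in range(n_boot)
    )
    lo = means[int((alpha / 2) * n_boot)]
    hi = means[int((1 - alpha / 2) * n_boot) - 1]
    return statistics.mean(values), (lo, hi)

# Hypothetical publication record: citations per paper and co-author counts.
citations = [12, 3, 45, 7, 0, 22, 9, 15]
n_authors = [2, 1, 5, 3, 2, 4, 1, 3]

score, (lo, hi) = bootstrap_ci(normalized_citations(citations, n_authors))
print(f"normalized citation score: {score:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

The point is the reporting style: a single number like a mean citation count hides how much it depends on a few highly cited papers, whereas the interval makes that sensitivity visible.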