Abstract

Like it or not, attempts to evaluate and monitor the quality of academic research have become increasingly prevalent worldwide. Performance reviews range from the level of individuals, through research groups and departments, to entire universities. Many of these are informed by, or are functions of, simple scientometric indicators, and the results of such exercises impact on careers, funding and prestige. However, there is sometimes a failure to appreciate that scientometrics are, at best, very blunt instruments and that their incorrect usage can be misleading. Rather than accepting the rise and fall of individuals and institutions on the basis of such imprecise measures, calls have been made for indicators to be regularly scrutinised and for improvements to the evidence base in this area. It is thus incumbent upon the scientific community, and especially the physics, complexity-science and scientometrics communities, to scrutinise metric indicators. Here, we review recent attempts to do this and show that some metrics in widespread use cannot serve as reliable indicators of research quality.

Highlights

  • The field of scientometrics can be traced back to the work of the physicist Derek de Solla Price [2] and the linguist/businessman Eugene Garfield [3]

  • The precise role of metrics at REF2021 is yet to be decided, but indications are that peer review should remain the primary method of research assessment

  • Is there a better metric, perhaps? In [21], we demonstrated that the departmental h-index [10, 26] has a better correlation with the Research Assessment Exercise (RAE)-measured strength index s than has the normalised citation index (NCI), i (a minimal sketch of the departmental h-index follows this list)
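
A minimal sketch of the departmental h-index in Python, assuming the standard definition: the largest h such that the department has h papers each cited at least h times. The function name and the toy citation counts are illustrative, not data from the paper.

    def departmental_h_index(citations):
        """Largest h such that h papers each have at least h citations."""
        counts = sorted(citations, reverse=True)
        h = 0
        for rank, c in enumerate(counts, start=1):
            if c >= rank:
                h = rank
            else:
                break
        return h

    # Toy data: citation counts for a hypothetical department's outputs.
    papers = [50, 24, 18, 12, 9, 8, 7, 3, 1, 0]
    print(departmental_h_index(papers))  # -> 7 (seven papers cited >= 7 times)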

Introduction

The field of scientometrics can be traced back to the work of the physicist Derek de Solla Price [2] and the linguist/businessman Eugene Garfield [3]. The Research Excellence Framework (REF) was introduced as the successor to the RAE and was undertaken in 2014 to assess the research carried out between 2008 and 2013. This exercise was based on peer review by panels of subject experts. The three categories, outputs, environment and impact, were weighted at 65%, 15% and 20%, respectively. Another change was that, while the RAE categorised research into 67 academic disciplines, the 2014 REF used only 36 units of assessment.
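
To make the weighting concrete, here is a minimal Python sketch, under the assumption that the three sub-profile scores combine as a plain weighted average on a 0-4 quality scale (the actual REF aggregates graded quality profiles; the scores below are hypothetical):

    # Illustrative only: combine sub-profile scores with the REF2014 weights.
    WEIGHTS = {"outputs": 0.65, "environment": 0.15, "impact": 0.20}

    def overall_score(scores):
        """Weighted average of the three sub-profile scores."""
        return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)

    # Hypothetical department: outputs 3.2, environment 3.0, impact 3.5.
    print(overall_score({"outputs": 3.2, "environment": 3.0, "impact": 3.5}))
    # 0.65*3.2 + 0.15*3.0 + 0.20*3.5 = 3.23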

Should metrics be used in the research evaluation schemes?
The NCI
The departmental h-index
Findings
Discussion