Abstract

Despite growing demand for practicable methods of research evaluation, the use of bibliometric indicators remains controversial. This paper examines performance assessment practice in Europe, first identifying the most commonly used bibliometric methods and, second, identifying the actors who have defined widespread practices. The framework of this investigation is Abbott’s theory of professions, and I argue that indicator-based research assessment constitutes a potential jurisdiction for both individual experts and expert organizations. The investigation was conducted using a search methodology that yielded 138 evaluation studies from 21 EU countries, covering the period 2005 to 2019. Structured content analysis revealed the following findings: (1) Bibliometric research assessment is most frequently performed in the Nordic countries, the Netherlands, Italy, and the United Kingdom. (2) The Web of Science (WoS) is the dominant database used for public research assessment in Europe. (3) Expert organizations invest in the improvement of WoS citation data and set technical standards with regard to data quality. (4) Citation impact is most frequently assessed with reference to international scientific fields. (5) The WoS classification of science fields has retained its function as a de facto reference standard for research performance assessment. A detailed comparison of assessment practices between five dedicated organizations and individual bibliometric experts suggests that corporate ownership of, and limited access to, the most widely used citation databases have had a restraining effect on the development and diffusion of professional bibliometric methods during this period.
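Finding (4) concerns field-referenced (field-normalized) citation impact, in which a publication’s citation count is compared with the average citation count of publications from the same scientific field. As a minimal, purely illustrative sketch (not taken from any of the reviewed studies), the Python snippet below computes a mean normalized citation score for a hypothetical set of publications against hypothetical field baselines.

    # Illustrative sketch of a field-normalized citation indicator
    # (a mean normalized citation score). All data below are hypothetical;
    # real assessments derive baselines from a citation database such as WoS.
    from statistics import mean

    # Expected (average) citations per publication for each field and year,
    # as would be computed from the reference database.
    field_baselines = {
        ("Chemistry", 2015): 12.4,
        ("Sociology", 2015): 3.1,
    }

    # Publications of the evaluated unit: (field, year, citations received).
    publications = [
        ("Chemistry", 2015, 25),
        ("Chemistry", 2015, 6),
        ("Sociology", 2015, 4),
    ]

    # Each citation count is divided by its field/year baseline; the unit-level
    # score is the mean of these ratios (1.0 corresponds to the field average).
    normalized = [
        cites / field_baselines[(field, year)]
        for field, year, cites in publications
    ]
    print(f"Mean normalized citation score: {mean(normalized):.2f}")

In practice such baselines are computed per field, publication year, and document type from the underlying citation database, which is why the database’s field classification (finding 5) matters for assessment results.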

Highlights

  • Research organizations and research funding agencies have a growing demand for practicable methods of research evaluation, including metrics based on publication and citation data

  • In contrast to conventional meta-evaluations that assess how well a set of evaluation studies adheres to predefined methodological standards, the purpose of my meta-evaluation was to investigate whether professional de facto standards could be observed in bibliometric research assessment

  • First, what were the prevailing methods of bibliometric performance assessment in European evaluation practice during the period 2005–2019? Second, if methodological de facto standards existed, which actors were in a position to define them?


Summary

Introduction

Research organizations and research funding agencies have a growing demand for practicable methods of research evaluation, including metrics based on publication and citation data. The theoretical framework of this study is Abbott’s theory of professions, which I selected to investigate how particular methodological choices become socially established as professionally legitimate means of handling certain evaluation problems. This framework is used to address the issue of professional control in bibliometric assessment. One recent paper presents an empirical investigation of whether the academic research area of ‘evaluative citation analysis’ successfully defined scientific standards for bibliometric research evaluation during the period 1972–2016 [19]. Based on organizational network analysis and the theory of intellectual fields as reputational organizations, [19] concluded that the field of evaluative citation analysis has been characterized by low levels of reputational control, evidenced by high shares of outsider contributions and by new actors entering the field throughout the examined period. They argued that this lack of reputational control within the academic research area is consistent with observed difficulties in establishing scientific authority for bibliometric assessment practice. In the remainder of the paper, I present empirical findings and discuss the results in light of the theoretical framework.

Theoretical considerations
Data and methods
Results
Discussion
