Abstract

This article investigates the impact of the second Italian national research assessment (VQR 2004–2010), carried out in 2011 by the Italian National Agency for the Evaluation of Universities and Research Institutes (ANVUR), on the publication strategies of sociologists in Italy. We reconstructed all publications by Italian sociologists indexed in Scopus between 2006 and 2015, that is, five years before and five years after the assessment. We also tracked academic tenure and promotions during the assessment period. Our results showed the potentially distortive effect of institutional signals on publication behaviour: Italian sociologists published more in journals that were considered influential for the assessment, some of which, however, were of doubtful quality. Our findings suggest that the use of informed peer review and ad hoc journal rankings could stimulate adaptive responses based on strategic journal targeting to ensure publication.

Highlights

  • The dominant worldwide ‘publish or perish’ culture in academia is often associated with regular and pervasive quantitative research assessments of scientific outputs in terms of publications

  • It is worth noting that we focused only on the most productive subset of Italian sociologists, since those without a single publication indexed in Scopus were excluded from our dataset

  • Unlike previous studies, which suggested that the internationalization of scientific collaborations has recently increased (Akbaritabar et al 2018; Leydesdorff et al 2014), we found that the internationalization of this most productive subset of Italian sociologists decreased after ANVUR


Introduction

The dominant worldwide ‘publish or perish’ culture in academia is often associated with regular and pervasive quantitative research assessments of scientific outputs in terms of publications. In a thorough analysis of performance-based research funding systems in the EU28 member countries, Zacharewicz et al (2019) found important differences in terms of evaluation objectives (e.g. only teaching activities, or a combination of research and teaching) and the temporal dimension (e.g. ex ante or ex post assessment). They suggested that any institutional design of evaluation needs to consider contexts and specificities, at the expense of generalization and comparison. Sivertsen (2016) discussed the ‘Norwegian model’, followed by various countries such as Belgium (Flanders), Denmark, and Norway. This model has three main components: (1) a national database of all publications; (2) a single publication indicator with weights to accommodate certain field specificities and publishing traditions; and (3) a performance-based funding model that allocates resources according to each institution's share of total publications at the country level. As Aagaard and Schneider (2016) suggested, while concentrating on the specific case of Denmark, the evolving contexts of evaluation and the multiplicity of policies would not help reflection on the fundamental drivers of the scientific performance of either institutions or academics.
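
To make the third component of the Norwegian model concrete, the following is a minimal, hypothetical sketch (not taken from the article): publications earn weighted points, and funding is split in proportion to each institution's share of total national points. The field weights, institutions, publication types, and budget are illustrative assumptions only.

```python
# Hedged sketch: a Norwegian-model-style, share-based funding allocation.
# Weights, institutions, and publication types below are hypothetical.

FIELD_WEIGHTS = {            # assumed ad hoc weights per publication level
    "journal_level_1": 1.0,
    "journal_level_2": 3.0,
    "monograph": 5.0,
}

publications = [             # assumed entries from a national publication database
    {"institution": "Univ A", "type": "journal_level_2"},
    {"institution": "Univ A", "type": "journal_level_1"},
    {"institution": "Univ B", "type": "monograph"},
    {"institution": "Univ B", "type": "journal_level_1"},
]

def allocate_funding(pubs, weights, total_budget):
    """Return funding per institution, proportional to its weighted publication points."""
    points = {}
    for pub in pubs:
        w = weights[pub["type"]]
        points[pub["institution"]] = points.get(pub["institution"], 0.0) + w
    national_total = sum(points.values())
    return {inst: total_budget * p / national_total for inst, p in points.items()}

if __name__ == "__main__":
    print(allocate_funding(publications, FIELD_WEIGHTS, 1_000_000))
    # -> {'Univ A': 400000.0, 'Univ B': 600000.0}
```

The point of the sketch is only the mechanism: because allocation depends on an institution's share of weighted points, changing the ad hoc weights (or the journal ranking that feeds them) changes the incentives researchers face when choosing where to publish.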
