Abstract

New forms of evaluation are reconfiguring science in ways we are only beginning to understand. In this talk, I will address a key challenge in social-scientific research: how evaluations are implicated in scientific understandings of the world. I will examine how public sector transformations (including a stronger emphasis on national economic goals such as innovation and growth) are introducing highly particular notions of ‘good’ performance and uses of evaluative metrics. Drawing on empirical material and results from recent projects, the talk will reflect on how these trends are affecting knowledge production processes in different fields. I will also suggest ways to think more creatively and responsibly about the affordances of evaluation and indicators in academic settings.

Highlights

  • Both “quality/excellence” and “impact” have become crucial for success at all levels of the scientific system

  • “Nobody’s going to give you a grant if you have four papers in an impact factor 1 journal, but you may get a grant based on a paper that you published in an impact factor 12 journal or higher, right? And so at that time, we said, ‘We have to change the requirement for getting the PhD,’ and we set that bar at 15 impact points.”

  • Evaluation guidelines:
    - aimed at both researchers and evaluators
    - development of evidence-based arguments
    - expanded list of research outputs
    - establishing provenance
    - taxonomy of indicators: bibliometric, webometric, altmetric
    - guidance on the use of indicators
    - contextual considerations, such as stage of career, discipline, and country of residence
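The quoted “impact points” rule amounts to simple arithmetic: sum the impact factors of the journals a candidate has published in and compare the total against a threshold. A minimal sketch of that rule follows; the function names and the example impact-factor values are hypothetical, with only the threshold of 15 and the impact-factor-1 versus impact-factor-12 contrast taken from the quote.

```python
# Hypothetical sketch of the "impact points" PhD requirement described in the quote:
# each paper is scored by the impact factor (JIF) of the journal it appeared in,
# and the requirement is met when the summed points reach the bar.
PHD_THRESHOLD = 15.0  # "we set that bar at 15 impact points"

def impact_points(journal_impact_factors):
    """Sum the journal impact factors of a candidate's papers."""
    return sum(journal_impact_factors)

def meets_phd_bar(journal_impact_factors, threshold=PHD_THRESHOLD):
    """Return True if the candidate's summed impact points reach the threshold."""
    return impact_points(journal_impact_factors) >= threshold

# Four papers in impact-factor-1 journals fall well short of the bar...
print(meets_phd_bar([1.0, 1.0, 1.0, 1.0]))  # False (4 points < 15)

# ...while one impact-factor-12 paper plus a smaller one clears it.
print(meets_phd_bar([12.0, 3.2]))           # True (15.2 points >= 15)
```

The sketch makes the evaluative logic explicit: the rule rewards venue prestige rather than the number of papers, which is precisely the dynamic the talk interrogates.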

Summary

Some initial observations

Research has become a strategic enterprise in which permanent communication is crucial. The relative professional autonomy of science and scholarship has weakened considerably. Both “quality/excellence” and “impact” have become crucial for success at all levels of the scientific system. Peer and expert review and indicator-based assessment have become intimately intertwined and mutually shape each other.

  • The Evaluation Gap
  • Some conceptual problems with JIF
  • Thinking with indicators in the life sciences
  • Grading for novelty and quality
  • Space of problems / space of research
  • The Leiden Manifesto
  • Career narrative
  • Evaluation guidelines
