Abstract

The performative co-construction of academic life through myriad metrics is now a global phenomenon, as indicated by the plethora of university research and journal ranking systems and the publication of ‘league’ tables based on them. If these metrics are seen as actively constituting the social world, can an analysis of this ‘naturally occurring’ data reveal how these new technologies of value and measure are recursively defining the practices and subjects of university life? In the UK higher education sector, the otherwise mundane realities of academic life have come to be recursively lived through a succession of research assessment exercises (RAEs). They are lived through not only in the RAEs themselves, but also through the managed incremental changes to academic and organizational practices linked to the institutional imaginings of planning for, and anticipating the consequences of, the actual exercises. In the ‘planning for’ mode, an increasing proportion of former sociology submissions have shifted into ‘social policy’. This is one instance of how institutional ‘game-playing’ in relation to the RAE enacts the social in quite fundamental ways. Planning an RAE 2008 submission in Sociology required anticipation of how a panel of 16 peers would evaluate 39 institutions by the weighted, relative worth of: aggregated data from 1,267 individuals who, between them, cited a total of 3,729 ‘outputs’; detailed narrative and statistical data on the research environment; and a narrative account of academic ‘esteem’. These data provided such institutional variables as postgraduate student numbers, sources of student funding, and research income from various sources. To evaluate the ‘quality’ of outputs, various measures of the ‘impact’ and/or ‘influence’ of journals, as developed from the Thomson Reuters Journal Citation Reports, were linked to the data.
An exploratory modelling exercise using these variables to predict RAE 2008 outcomes revealed that, despite what we might like to think about the subtle nuances involved in peer review judgements, an astonishing 83 per cent of the variance in outcomes can be predicted by some fairly simple ‘shadow metrics’: the quality of journals in the submission, research income per capita, and the scale of research activity. We conclude that measuring the value of sociology involves multiple mutual constructions of reality within which ever more nuanced data assemblages are increasingly implicated, and that analysis of these data can make explicit some of the parameters of enactment within which we operate in the contemporary academy.
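The exploratory exercise described above can be illustrated with a minimal ordinary-least-squares sketch. Everything here is an assumption for illustration: the data are synthetic, and the variable names (journal quality, income per capita, scale of activity) merely echo the three ‘shadow metrics’ named in the abstract; this is not the authors' model or dataset.

```python
# Illustrative sketch only: regressing a synthetic RAE-style quality score
# on three synthetic 'shadow metric' predictors. No real RAE data are used.
import numpy as np

rng = np.random.default_rng(0)
n = 39  # the abstract notes 39 institutions were assessed in Sociology

# Synthetic, standardized predictors (names are illustrative assumptions).
journal_quality = rng.normal(0, 1, n)
income_per_capita = rng.normal(0, 1, n)
scale_of_activity = rng.normal(0, 1, n)

# Design matrix with an intercept column.
X = np.column_stack([np.ones(n), journal_quality,
                     income_per_capita, scale_of_activity])

# Synthetic outcome loosely driven by the predictors plus noise
# (coefficients are arbitrary, chosen only to generate data).
y = (0.5 * journal_quality + 0.3 * income_per_capita
     + 0.2 * scale_of_activity + rng.normal(0, 0.4, n))

# Fit by ordinary least squares and compute R-squared, the proportion
# of variance in outcomes 'explained' by the shadow metrics.
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
residuals = y - X @ beta
r_squared = 1 - (residuals @ residuals) / ((y - y.mean()) @ (y - y.mean()))
print(f"R-squared on synthetic data: {r_squared:.2f}")
```

With real submission data in place of the synthetic arrays, an R-squared of 0.83 would correspond to the 83 per cent of variance reported above.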
