Abstract
The Social Sciences have long been struggling with quantitative forms of research assessment – insufficient coverage in prominent citation indices and overall lower citation counts than in STM subject areas have led to a widespread wariness regarding bibliometric evaluations among social scientists. Fueled by the rise of the social web, new hope is often placed on alternative metrics that measure the attention scholarly publications receive online, in particular on social media. But almost a decade after the coining of the term altmetrics for this new group of indicators, the uptake of the concept in the Social Sciences still seems to be low. Just as with traditional bibliometric indicators, one central problem hindering the applicability of altmetrics for the Social Sciences is the low coverage of social science publications on the respective data sources – which, in the case of altmetrics, are the various social media platforms on which interactions with scientific outputs can be measured. Another reason is that social scientists have strong opinions about the usefulness of metrics for research evaluation, which may hinder broad acceptance of altmetrics as well. We conducted qualitative interviews and online surveys with researchers to identify the concerns that inhibit the use of social media and the utilization of metrics for research evaluation in the Social Sciences. By analyzing the response data from the interviews in conjunction with the response data from the surveys, we identify the key concerns that inhibit social scientists from (1) applying social media for professional purposes and (2) making use of the wide array of metrics available. Our findings show that concerns about time consumption, privacy, information overload, and prevalent styles of communication are the predominant factors inhibiting Social Science researchers from using social media platforms for their work.
Regarding indicators for research impact, we identify a widespread lack of knowledge about existing metrics, their methodologies, and their meanings as a major hindrance to their uptake by social scientists. The results have implications for future developments of scholarly online tools and show that researchers could benefit considerably from additional formal training in the correct application and interpretation of metrics.
Highlights
The first to introduce the idea of evaluating the importance of scientific work based on quantitative metrics, namely citation counts, were Gross and Gross in 1927 (Bornmann and Daniel, 2008)
We interviewed 9 researchers face-to-face in groups about their work-related usage of social media and their views on metrics used for research evaluation
Because our study is primarily interested in the perceptions of all kinds of social scientists, we considered only responses from researchers identifying themselves in the survey as primarily working in Social Sciences, Political Sciences, Sociology, Psychology, Demography, Human Geography, Economics, or Business Studies
Summary
The first to introduce the idea of evaluating the importance of scientific work based on quantitative metrics, namely citation counts, were Gross and Gross in 1927 (Bornmann and Daniel, 2008). Because scientific publications are to an increasing extent accessed as electronic documents online, the providers of publication outlets hosting those documents can easily record and display the attention that individual publications receive as usage metrics, i.e., as download or page-view counts. Another prevalent family of web-based metrics is called altmetrics, a term coined by Priem et al. (2010) to comprise various signals of the buzz scholarly products receive on social media. Altmetrics have been shown to circumvent several weaknesses of citations as indicators of scientific attention (Wouters and Costas, 2012): they can be collected for a large variety of scientific products, e.g., software, presentation slides, posters, and individual book chapters; they are available much faster than citation counts, as the major part of altmetric resonance toward a publication happens very shortly after its publication (see Thelwall et al., 2013); they show a broader spectrum of scientific impact than citations, as they are able to reflect resonance among non-scientific audiences; and most altmetrics are based on publicly available APIs which are open and free to use, unlike the commercial databases commonly used for citation analyses.