Abstract

Self-archiving in Institutional Repositories (IRs) plays a central role in the success of Open Access initiatives. Deposited documents are more visible and probably attract more downloads and citations, but making them freely available in a local repository is not enough. Social tools, whether public or academically oriented, networking or silo oriented, should be taken into account to reach larger audiences and to increase not only the scholarly but also the social impact. The paper explores the presence of IR contents in 28 social tools (Academia, Bibsonomy, CiteUlike, CrossRef, Datadryad, Facebook, Figshare, Google+, GitHub, Instagram, LinkedIn, Pinterest, Reddit, RenRen, ResearchGate, Scribd, SlideShare, Tumblr, Twitter, Vimeo, VKontakte, Weibo, Wikipedia All Languages, Wikipedia English, Wikia, Wikimedia, YouTube and Zenodo) using a webometric approach. We collected the link mentions of 2185 IRs in the cited tools during July 2017 from selected Google data centers. The results show that most IRs have no strong presence in the most specialized tools, and even for the most popular services the figures are not high. A candidate explanation for the low number of altmetric mentions is the lack of a strategy for promoting IR contents, together with certain bad practices, mostly regarding URL naming.

Highlights

  • Since the mid-nineties, webometrics has slowly been playing a role in the description and evaluation of scholarly communication (Thelwall, Vaughan & Björneborn, 2005; Orduña-Malea & Aguillo, 2014)

  • The population consists of 2185 Institutional Repositories (IRs), whose web addresses, converted into strings of characters, are checked for mentions in the cited social tools according to Google

  • Considering that the number of items deposited in the global IRs is in the order of several million, and even allowing for overlaps and duplicates, the number of link mentions is low for all of the social tools, even for the academic networks (RG and Academia, with averages below 300 mentions)
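The address-matching step described above can be sketched as follows. This is a minimal illustration, not the authors' exact procedure: the domain names and the `site:`/exact-phrase query syntax are assumptions about how a repository address might be searched for within each social tool via Google.

```python
def build_mention_queries(repository_url, tool_domains):
    """Return one Google query per social tool that searches that tool's
    site for literal mentions of the repository's web address."""
    # Drop the scheme so the query matches both http and https mentions,
    # and trim a trailing slash for a cleaner exact-phrase string.
    address = repository_url.split("://", 1)[-1].rstrip("/")
    return {
        tool: f'site:{domain} "{address}"'
        for tool, domain in tool_domains.items()
    }

# Hypothetical example: queries for two of the 28 tools studied,
# using an illustrative IR address.
queries = build_mention_queries(
    "http://digital.csic.es",
    {"Twitter": "twitter.com", "ResearchGate": "researchgate.net"},
)
# queries["Twitter"] → 'site:twitter.com "digital.csic.es"'
```

In practice the hit counts returned for such queries would still have to be collected per data center and deduplicated, as search-engine counts are known to fluctuate.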


Introduction

STI Conference 2018 · Leiden

Since the mid-nineties, webometrics has slowly been playing a role in the description and evaluation of scholarly communication (Thelwall, Vaughan & Björneborn, 2005; Orduña-Malea & Aguillo, 2014). The lack of reliable data sources for link analysis is still one of the main barriers to its full acceptance by the metrics community (Thelwall, 2010). Links are not the only web indicators that can be used for measuring science impact, and several alternatives have been proposed, such as mentions of the names of institutions/authors or of paper/monograph titles (Cronin et al., 1998; Kretschmer & Aguillo, 2004, 2005). The emergence of social tools in recent years provides new opportunities for metric analysis of scholarly impact. The most successful proposal for profiting from that opportunity, Jason Priem's 'altmetrics' (Priem et al., 2010), which stands for alternative metrics, consists of a large and very heterogeneous set of measures gathered under a rather unfortunate umbrella term

Methods
Results
Conclusion