Abstract

The use of quantitative metrics to gauge the impact of scholarly publications, authors, and disciplines is predicated on the availability of reliable usage and annotation data. Citation and download counts are widely available from digital libraries. However, current annotation systems rely on proprietary labels, refer to journals but not articles or authors, and are manually curated. To address these limitations, we propose a social framework based on crowdsourced annotations of scholars, designed to keep up with the rapidly evolving disciplinary and interdisciplinary landscape. We describe a system called Scholarometer, which provides a service to scholars by computing citation-based impact measures. This creates an incentive for users to provide disciplinary annotations of authors, which in turn can be used to compute disciplinary metrics. First, we present the system architecture and several heuristics to deal with noisy bibliographic and annotation data. We report on data sharing and interactive visualization services enabled by Scholarometer. Usage statistics, illustrating the data collected and shared through the framework, suggest that the proposed crowdsourcing approach can be successful. Second, we illustrate how the disciplinary bibliometric indicators elicited by Scholarometer allow us to implement for the first time a universal impact measure proposed in the literature. Our evaluation suggests that this metric provides an effective means for comparing scholarly impact across disciplinary boundaries.
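The abstract does not spell out the formulas behind the citation-based and discipline-normalized measures, so the sketch below is only an illustrative reading: the standard h-index, plus a discipline-rescaled variant that divides an author's h by the average h of authors annotated with the same discipline. The function names and the normalization choice are our assumptions, not the paper's stated definition.

```python
def h_index(citations):
    """h-index: the largest h such that the author has at least
    h papers with h or more citations each."""
    h = 0
    for rank, c in enumerate(sorted(citations, reverse=True), start=1):
        if c >= rank:
            h = rank
        else:
            break
    return h

def universal_h(citations, discipline_mean_h):
    """Illustrative discipline-rescaled ("universal") impact score:
    the author's h-index divided by the mean h-index of authors
    tagged with the same discipline. The paper's exact definition
    may differ; this only shows the general normalization idea."""
    return h_index(citations) / discipline_mean_h

# Example: an author with papers cited [12, 9, 7, 5, 3, 1] has h = 4;
# in a discipline whose authors average h = 6, the rescaled score is ~0.67.
print(universal_h([12, 9, 7, 5, 3, 1], discipline_mean_h=6.0))
```

Because the rescaled score is dimensionless, it supports the cross-disciplinary comparisons the abstract describes: two authors in fields with very different citation cultures can be compared on the same scale.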

Highlights

  • Many disciplinary communities have sought to address the need to organize, categorize, and retrieve the articles that populate their respective online libraries and repositories

  • An initial step towards a solution comes in the form of journal indices, such as those supported by Thomson Reuters as part of its Journal Citation Reports (JCR) and Web of Science (WoS) commercial products

  • As disciplines evolve through novel discoveries and interdisciplinary collaborations, the semantic predicates associated with these ontologies may become increasingly vague and less informative; they will fail to capture interdisciplinary work at the granularity of individual articles, or the new areas that emerge at disciplinary boundaries

Introduction

Rather than attempting to match new scientific production to predefined categories, it would be useful to facilitate semantic evolution by empowering scholars to annotate each other’s work. This bottom-up approach has already been adopted in popular systems such as Bibsonomy.org [1], Mendeley [2], and many others [3,4]. The extraction of bibliographic information from online repositories is not new: bibliographic management tools such as BibDesk offer robust search of online resources and digital libraries like PubMed [14]; users can import objects into Connotea using Digital Object Identifiers (DOIs), as illustrated in the sketch below [3]; and Zotero can spider through webpages and collect bibliographic information from them [15]. We emphasize the use of scholarly disciplines in the annotations, as discussed in the User Interface section.
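As a concrete illustration of the DOI-based import mentioned above (in the spirit of Connotea, not Scholarometer's actual code), the sketch below resolves a DOI to bibliographic metadata through the public doi.org content-negotiation service and pairs the record with user-supplied discipline tags. The function name, placeholder DOI, and tag vocabulary are hypothetical.

```python
import requests

def fetch_doi_metadata(doi):
    """Resolve a DOI to CSL-JSON bibliographic metadata via doi.org
    content negotiation (supported for Crossref/DataCite-registered DOIs)."""
    resp = requests.get(
        f"https://doi.org/{doi}",
        headers={"Accept": "application/vnd.citationstyles.csl+json"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()

# Hypothetical usage: import a record by DOI, then attach the kind of
# crowdsourced discipline annotation that Scholarometer elicits from users.
record = fetch_doi_metadata("10.1234/example.doi")  # placeholder DOI
annotation = {
    "title": record.get("title"),
    "authors": [f"{a.get('given', '')} {a.get('family', '')}".strip()
                for a in record.get("author", [])],
    "disciplines": ["informatics", "bibliometrics"],  # user-supplied tags
}
```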
