Abstract

Evaluation of scientific research is becoming increasingly reliant on publication-based bibliometric indicators, which may result in the devaluation of other scientific activities, such as data curation, that do not necessarily result in the production of scientific publications. This issue may undermine the movement to openly share and cite data sets in scientific publications, because researchers are unlikely to devote the effort necessary to curate their research data if they are unlikely to receive credit for doing so. This analysis attempts to demonstrate the bibliometric impact of properly curated and openly accessible data sets by generating citation counts for three data sets archived at the National Oceanographic Data Center. My findings suggest that all three data sets are highly cited, with estimated citation counts in most cases higher than those of 99% of the journal articles published in Oceanography during the same years. I also find that methods of citing and referring to these data sets in scientific publications are highly inconsistent, despite the fact that a formal citation format is suggested for each data set. These findings have important implications for developing a data citation format, for encouraging researchers to properly curate their research data, and for evaluating the bibliometric impact of individuals and institutions.

Highlights

  • In recent years there has been increasing interest in, and use of, bibliometric indicators for the evaluation and ranking of research institutions

  • My phase 1 results suggest that, if they were counted as journal articles in WoS, both the World Ocean Atlas and World Ocean Database (WOA/WOD) and the Pathfinder Sea Surface Temperature (PSST) data sets would have citation counts higher than those of 99% of all articles in Oceanography in WoS from any single publication year from 1995 to the present (see the percentile-rank sketch after this list)

  • I attempted to generate citation counts for three oceanographic data sets curated by the National Oceanographic Data Center (NODC) by searching WoS, publishers’ websites, and Google Scholar for mentions of these data sets in the bibliographic information or full text of scientific articles (an illustrative query sketch follows this list)
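
The searches themselves were done by hand in Web of Science, publishers’ full-text search tools, and Google Scholar, none of which offer a simple open API. Purely as an illustration of the general idea, the sketch below runs an analogous keyword query against the public Crossref REST API, a stand-in chosen for this example rather than a service used in the study; Crossref matches bibliographic metadata only, so it is at best a partial proxy for the full-text searches described above. The data set name in the query is taken from the highlights; everything else here is assumed for the example.

    # Illustrative only: query the public Crossref REST API for works whose
    # bibliographic metadata mentions a data set name. This is a stand-in for
    # the manual WoS / publisher / Google Scholar searches described above.
    import requests

    def search_mentions(dataset_name, rows=20):
        """Return (total_hits, sample_records) for works matching the query."""
        resp = requests.get(
            "https://api.crossref.org/works",
            params={"query": dataset_name, "rows": rows},
            timeout=30,
        )
        resp.raise_for_status()
        message = resp.json()["message"]
        records = [
            {
                "doi": item.get("DOI"),
                "title": (item.get("title") or [""])[0],
                "journal": (item.get("container-title") or [""])[0],
            }
            for item in message.get("items", [])
        ]
        return message.get("total-results", 0), records

    if __name__ == "__main__":
        total, sample = search_mentions("World Ocean Atlas")
        print(f"Crossref records matching the query: {total}")
        for rec in sample[:5]:
            print(f'  {rec["doi"]}: {rec["title"]} ({rec["journal"]})')

Each hit would still need manual checking to confirm that it actually uses the data set, which is why the study relied on reading the matched articles rather than on raw hit counts.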
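
The percentile comparison in the first results highlight amounts to ranking a data set’s estimated citation count within the distribution of citation counts for Oceanography journal articles from the same publication year. The sketch below shows that calculation with entirely hypothetical numbers; it is not the study’s data, just a minimal illustration of the percentile-rank arithmetic.

    # Minimal sketch with hypothetical numbers: estimate the percentile rank of
    # a data set's citation count within the citation distribution of journal
    # articles published in the same year.

    def percentile_rank(value, distribution):
        """Share of observations in `distribution` that `value` meets or exceeds."""
        at_or_below = sum(1 for x in distribution if x <= value)
        return 100.0 * at_or_below / len(distribution)

    # Hypothetical citation counts for Oceanography articles from one year.
    article_citations = [0, 1, 1, 2, 3, 3, 5, 8, 12, 14, 20, 27, 35, 48,
                         60, 75, 90, 120, 150, 400]

    # Hypothetical estimated citation count for an archived data set.
    dataset_citations = 380

    print(f"Data set percentile: {percentile_rank(dataset_citations, article_citations):.1f}")
    # A value above 99 would correspond to the paper's claim that a data set is
    # cited more than 99% of the articles published in the field that year.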

Introduction

In recent years there has been increasing interest in, and use of, bibliometric indicators for the evaluation and ranking of research institutions. Bibliometric indicators feature prominently in global mixed-method ranking schemes such as the Academic Ranking of World Universities [1] and the Times Higher Education ranking [2], and in national mixed-method research assessment exercises in the UK, Belgium, Italy, and Australia. Because these indicators are based almost entirely on publications, activities such as data curation that do not produce publications go largely uncredited, which has prompted calls for formal methods of citing data sets in the scientific literature. Reasons for developing a citation format for data sets include verification of published results, reuse of data sets for additional research purposes, and attribution to data collectors and archivists. Such suggestions have been made in bioinformatics [13,14,15,16], genetics [17], climate sciences [18], geochemistry [19,20,21], oceanography [22,23], earth sciences [24,25], and multidisciplinary sciences [26,27,28], among others.

