
Three months ago, the open-access journal eLife published biophysicist Matthew Ferguson’s paper on the kinetics of RNA synthesis. The day it appeared, Ferguson added the paper’s web link to his profiles on Facebook, LinkedIn, and the academic social network ResearchGate. An assistant professor at Boise State University in Idaho, Ferguson says he’s been posting his papers to such sites since 2011, when he was a postdoc. “My hope is that they will generate more views, more citations, more downloads, more collaborations.”

Journal citations, which figure heavily in hiring, tenure, and grant-funding decisions, have traditionally been the most valued measure of an article’s impact. But as scientists increasingly discover and share the fruits of their research online, alternative assessment metrics, or altmetrics, have been developed to quantify views, downloads, and social media mentions. Commonly tracked altmetrics include the number of tweets and retweets on Twitter, likes on Facebook, and bookmarks in Mendeley, a social scholarly reference library. Similar metrics are also being tracked for other electronically accessible research outputs, such as books, book chapters, patents, policy documents, datasets, figures, audio and video files, and computer code.

In 2010 four information scientists posted the “altmetrics manifesto” (http://www.altmetrics.org/manifesto). It introduced the term “altmetrics” and faulted citation-based indicators, such as the h-index and the journal impact factor, for being slow, narrow, or lacking in context. An article, for example, can be viewed by thousands and generate several tweets within days, but it may not be cited in another journal article for several years. The manifesto also argued that altmetrics would “track impact outside the academy, impact of influential but uncited work, and impact from sources that aren’t peer reviewed.”

Efforts are under way to formalize the burgeoning bibliometric practice, despite concerns about how to interpret the data. This spring, for instance, the Higher Education Funding Council for England (HEFCE) will unveil recommendations from its yearlong review of the general role of metrics in research assessment and management. Among the 153 individuals and organizations responding to HEFCE’s request for input, 15 UK higher education institutions singled out altmetrics as a potential research assessment tool, but 8 others argued that online metrics are “not reliable enough to be used as a measure of research quality.”

Aggregating social impact

Loosely defined, altmetrics measure any online activity occurring around scientific papers and other research output. That can include HTML views, PDF downloads, and citation counts extracted from online databases such as Scopus and CrossRef. A narrower definition limits altmetrics to social media activity: mentions and shares in blogs and on social networks like Twitter and Reddit, or saves in social bookmarking sites like CiteULike and Delicious. For nonarticle research output, statistics are gathered from repositories such as FigShare and the coding platform GitHub.

As diverse as the internet itself are the types of research output—and the activity they generate—that can be measured by altmetrics platforms. This illustration reflects the categories tracked by altmetrics aggregator Plum Analytics, which also tracks citation counts from online databases. (Adapted from Plum Analytics.)
Three startups have emerged as the leading aggregators of altmetric data. London-based Altmetric LLP grew out of software that won company founder Euan Adie the $15 000 grand prize in the 2011 Elsevier Apps for Science competition. In 2013 the startup also received an undisclosed investment from Digital Science, a branch of publishing giant Macmillan.

Altmetric provides a free online tool for individuals and sells subscriptions to publishers and libraries and to research groups and institutions that want to track and analyze their own altmetric activity. Besides displaying the raw data, Altmetric assigns a weighted score to individual research articles; mentions in news outlets carry the highest weight, and mentions on social networks the lowest. The 2014 physical science paper with the highest Altmetric score was Stephen Hawking’s “Information Preservation and Weather Forecasting for Black Holes,” which was posted on the preprint repository arXiv.org. So far it has been picked up by 70 mainstream news outlets, mentioned by 33 blogs, and tweeted 935 times.
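Altmetric’s actual weighting scheme is not spelled out here; the description above gives only the ordering, with news coverage counting most and social-network mentions least. The Python below is a minimal sketch of how such a source-weighted score could be computed, using invented weights that merely respect that ordering and the mention counts reported for the Hawking preprint.

```python
# Toy sketch of a source-weighted attention score. The weights are
# invented for illustration and are NOT Altmetric's actual values;
# they only preserve the ordering described in the article
# (news outlets weighted highest, social networks lowest).

WEIGHTS = {"news": 8.0, "blog": 5.0, "tweet": 1.0}  # hypothetical weights

def weighted_attention_score(mentions: dict) -> float:
    """Sum each source's mention count multiplied by that source's weight."""
    return sum(WEIGHTS.get(source, 0.0) * count
               for source, count in mentions.items())

# Mention counts the article reports for Hawking's 2014 arXiv preprint.
hawking = {"news": 70, "blog": 33, "tweet": 935}
print(weighted_attention_score(hawking))  # 1660.0 with these toy weights
```

A composite number like this is compact, but it hides which sources actually generated the attention, which is why some aggregators quoted below prefer to present the raw counts.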
Wiley, Elsevier, the National Academy of Sciences, and the UK-based Institute of Physics are among publishers that now display the Altmetric score on some or all of their journal websites. The American Physical Society plans to add Altmetric’s data to its online articles early this year, says Mark Doyle, director of APS’s Journal Information Systems. “Alternative metrics provide a new window on the reach and impact—if not the importance—of individual articles that we believe will be useful to our authors and readers,” he says. The American Institute of Physics, which publishes Physics Today, is also considering the implementation of altmetrics in its Scitation publication platform, says product manager Doreen Hall. “It’s one of many things we’re looking at as we consult with librarians and researchers to find out how they value these metrics and to what degree.”

Plum Analytics, which has offices in Massachusetts and Washington State, collects altmetrics for various scholarly works, including articles, videos, books, and computer code. Entrepreneur Andrea Michalek and librarian Mike Buschman founded the company in 2012; it was acquired by EBSCO Information Services in 2013 for an undisclosed sum. Customers of the company’s paid aggregation tool include the University of Pittsburgh, the Sanford–Burnham Medical Research Institute, and the research-funding organization Autism Speaks.

Michalek says that Plum Analytics shies away from calculating aggregated scores or indicators, which often end up as “vanity metrics” that rely too heavily on social media or on too small a data set. “Telling the stories and answering the questions about research output—which is Plum Analytics’ goal—requires a more nuanced view than a raw score can provide,” she says.

The stated goal of ImpactStory is “to uncover and share every impact from every research product of every scientist.” Founded in 2012 by information scientist Heather Piwowar and altmetrics manifesto coauthor Jason Priem, the company collects varied altmetrics, including tweets, Wikipedia citations, and YouTube views. The only nonprofit of the three aggregators, ImpactStory has received more than $925 000 in grants from NSF and the Alfred P. Sloan Foundation. For additional revenue, it sells subscriptions to individuals for $10 a month or $60 a year; paid subscribers receive a profile page that displays links to their work and a sidebar of cumulative altmetric data.

The open-access publisher Public Library of Science has been tracking article metrics for years, says biophysicist and PLOS advocacy director Cameron Neylon. Metrics displayed next to PLOS articles include article views, citation counts, online bookmarks, and discussions or mentions on Wikipedia and Twitter. “Where possible, we don’t just provide numbers but links to the underlying events, which are often much more interesting,” says Neylon. “Who is tweeting about an article can be much more informative than just the number of tweets.”

Cornell University physicist and arXiv founder Paul Ginsparg says there are no immediate plans to display altmetrics on the repository. However, he says, “It’s been evident for over two decades that the electronic format provides many new opportunities for measuring interest and importance.” In that period, he adds, internal statistics show which articles on arXiv attract the general public’s interest: “The numbers of downloads are far higher than those coming from within the research community.” The data also show “instant spikes for articles with immediate research interest.”

An h-index based on tweets?

The increasing popularity of open-access journals has been central to the rise of altmetrics, says Ben Wagner, a chemistry and physics librarian at the University at Buffalo. “Some studies show that an open-access article is downloaded twice as much as ones that aren’t—and things that are open are easier to share.”

Several open-access publishers are among the more than 540 scientific organizations and 12 000 individuals that have endorsed the 2012 San Francisco Declaration on Research Assessment, which calls for the elimination of journal-based hiring, promotion, and funding decisions. The declaration also advocates for the inclusion of nonarticle research output and “a broad range of impact measures” in research assessments.

But altmetrics data taken out of context run the risk of being “dangerous and inaccurate,” says Rodrigo Costas Comesaña, an information scientist at Leiden University in the Netherlands. “You can calculate an h-index or impact factor based on tweets, but what would it mean?” At this point, he says, such metrics “may be useful for data mining, but probably not for evaluations.”
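For reference, a researcher’s h-index is the largest number h such that h of his or her papers have each been cited at least h times. The sketch below, using made-up per-paper tweet counts, shows how mechanically the same calculation could be run on tweets instead of citations; the ease of the arithmetic is exactly why Costas questions what the resulting number would mean.

```python
# Sketch of the calculation Costas alludes to: an "h-index" computed
# from tweet counts rather than citation counts. The per-paper numbers
# below are hypothetical, chosen only to illustrate the mechanics.

def h_index(counts):
    """Largest h such that at least h items each have a count of h or more."""
    ranked = sorted(counts, reverse=True)
    h = 0
    for rank, count in enumerate(ranked, start=1):
        if count >= rank:
            h = rank
        else:
            break
    return h

tweets_per_paper = [120, 45, 9, 7, 6, 3, 1, 0]  # hypothetical data
print(h_index(tweets_per_paper))  # 5: five papers with at least 5 tweets each
```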
Nevertheless, some scientists have begun to add altmetric data to their résumés and professional websites, which has heightened concerns about the potential for altmetrics to be intentionally inflated. As is the case with citation counts, says Adie, gaming altmetrics is going to happen, especially when incentives are attached. A safer approach, he notes, would be for researchers to use “a bucket of metrics, rather than just one metric from just one source.”

Further research is needed to avoid the danger of altmetrics being misinterpreted or misused, says Stefanie Haustein, a postdoctoral information scientist at the University of Montreal. “A lot of events are being collected because they are easy to collect; our research will focus on giving these counts meaning.” For example, she and University of Montreal professor Vincent Larivière have found a moderate positive correlation between bookmark counts on Mendeley and citation counts reported by Scopus.

Unmapped territory

Efforts to establish standards and best practices for altmetrics are in the works. Last June the Baltimore-based National Information Standards Organization (NISO) released a report that contained 25 potential action items to address concerns about raw-data transparency and reproducibility, among other things. Working groups composed of representatives from the publishing industry, academia, and other relevant sectors are expected to complete their reports in time for NISO to publish its final recommendations next fall.

Attendees gather for a breakout session at the 1st Altmetrics Conference: London, held in September 2014. Organized by Altmetric LLP and the UK funding agency Wellcome Trust, the conference drew speakers and sponsorship from publishing giants Elsevier and Wiley and citation-database provider Thomson Reuters, among others. (Photo: Wellcome Trust)

“We’re pretty close to having an idea of best practices,” says William Gunn, head of academic outreach at Mendeley and a participant in the NISO altmetrics project. There’s consensus, he says, that the sources generating altmetric activity should be open and transparent, that raw data are preferable to a single score or indicator, and that the numbers should be put in context—for example, by comparing an article’s tweet count to those of similar articles. One action item under discussion seeks agreement on the proper use of the term “altmetrics,” or on using a different term. Haustein suggests “social media metrics” as one substitute. “[These metrics] are supplementary, not alternatives to citations,” she says. “They won’t replace citations.”

In an October 2014 blog post titled “Altmetrics: What are they good for?” PLOS’s Neylon wrote that altmetric data are “proxies of things we don’t truly understand … and signals of the flow of information down paths that we haven’t mapped.” Patterns are emerging that “may let us determine not so much whether one piece of work is ‘better’ than another but what kind of work it is—who is finding it useful, what kinds of pathways is the information flowing down?”

“There is more to research evaluation than just citations,” says University of Cambridge biochemistry postdoc Pietro Gatti-Lafranconi, an ImpactStory user and volunteer adviser. He says he likes ImpactStory because it allows him to combine “pure citation metrics with social media impact, in something that gets closer to the visibility of research.”

© 2015 American Institute of Physics.
