Abstract

Within the Open Science discussions, the current call for “reproducibility” stems from the growing awareness that results presented in research papers are not as easily reproducible as expected, and in some reproduction efforts have even been contradicted. In this context, transparency and openness are seen as key components for facilitating good scientific practice as well as scientific discovery. As a result, many funding agencies now require the deposit of research data sets, institutions are improving training in the application of statistical methods, and journals are beginning to mandate a high level of detail on the methods and materials used.

How can researchers be supported and encouraged to provide that level of transparency? An important component is the underlying research data, which is currently often only partly available within the article. At Elsevier we have therefore been working on journal data guidelines that clearly explain to researchers when and how they are expected to make their research data available. At the same time, we have developed the corresponding infrastructure to make it as easy as possible for researchers to share their data in a way that is appropriate to their field. To ensure researchers get credit for the work they do in managing and sharing data, all our journals support data citation in line with the FORCE11 data citation principles, a key step towards addressing the lack of credit and incentives identified in the Open Data analysis (Open Data - the Researcher Perspective, https://www.elsevier.com/about/open-science/research-data/open-data-report) recently carried out by Elsevier together with CWTS.

Finally, the presentation also touches upon a number of initiatives to ensure the reproducibility of software, protocols and methods. With STAR Methods, for instance, methods are submitted in a Structured, Transparent, Accessible Reporting format; this approach promotes rigor and robustness, and makes reporting easier for the author and replication easier for the reader.

Highlights

  • One study reported that, while all data sets were available two years after publication, the odds of obtaining the underlying data dropped by 17 per cent per year after that (a small illustrative calculation follows these highlights)

  • Phil Bourne published a challenge to reproduce this in 2011; Yolanda Gil took up the challenge in 2012. She spent 280 hours interviewing researchers, rerunning scripts and fixing broken code. The experiment could be reproduced, but only after that substantial effort

  • Answer: this is possible if the researchers had done their work within the same workflow, using an open ecosystem of interconnected tools
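As a rough illustration of how a 17 per cent annual decline in the odds compounds over time, the short Python sketch below prints the implied odds (and the corresponding probability) of still obtaining the data in later years. The starting odds of 1.0 and the purely multiplicative decline are assumptions made for this illustration; they are not figures taken from the study referenced above.

    # Illustrative only: how a 17% per-year drop in the odds of obtaining
    # the underlying data compounds over time. The starting odds (1.0, two
    # years after publication) and the multiplicative model are assumptions
    # for this sketch, not figures taken from the cited study.

    ANNUAL_DROP = 0.17        # 17% decline per year, from the highlight above
    starting_odds = 1.0       # assumed baseline odds two years after publication

    for years_after in range(11):
        odds = starting_odds * (1 - ANNUAL_DROP) ** years_after
        probability = odds / (1 + odds)   # convert odds to a probability
        print(f"{years_after:2d} years later: odds {odds:.2f} "
              f"(~{probability:.0%} chance of obtaining the data)")

Under these assumptions the odds after ten years are about 0.16, i.e. roughly a 13 per cent chance of obtaining the data.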


Summary

Ways to support reproducibility

The aim is to reduce bias against negative results, to avoid changing study parameters after data collection (including the prevention of p-hacking), to improve reproducibility, and to reward great ideas.

  • Old vs New Ways of Reproducing Research
  • Mendeley Data Monitor
  • Providing a statement when data cannot be shared
  • Publishing data articles
  • Publishing open peer review reports
  • Implementing the data guidelines
  • Who is responsible for acting on data management plans?
  • Findings
  • Fully documented datasets shared in Mendeley
