The media have been reporting on what many scientists consider to be a crisis in reproducibility, a pillar of the scientific method and the basis for scientists’ accepting or rejecting one another’s findings. University of Miami breast cancer biologist Elizabeth Iorns is one of those concerned scientists. She leads the Science Exchange researcher-matching portal and founded the Reproducibility Initiative, a new alliance of journals and scientists that promotes reproducibility. Iorns describes reproducibility as “the ability to repeat experiments using the same experiment’s protocols and produce the same results.” The 19 October 2013 issue of The Economist carried two articles on reproducibility, one of which declared, “Modern scientists are doing too much trusting and not enough verifying—to the detriment of the whole of science, and of humanity.” In a Scientific American blog entry that same month, the lead investigator of a bacteriology study explained her team’s widely noted public retraction after they failed to reproduce their own findings. She added, “If errors are ignored, they perpetuate in the literature or in the media, which can slow scientific progress and sometimes directly harm human health.”

Recently a group of scientists from five US universities and the Florida Museum of Natural History examined reproducibility in phylogenetics, the branch of science that describes evolutionary relationships. They published their findings in the September 2013 issue of PLOS Biology and offered a general approach that might help other branches of science as well. In the article “Lost branches on the tree of life,” the team noted the lack of data sharing among scientists and emphasized that the complex and expanding world of data (often referred to as big data) offers both challenges and opportunities for reproducibility. They are not alone in their findings.
Earlier in 2013, another team looked at research on population genetics in The Journal of the Federation of American Societies for Experimental Biology (FASEB Journal) and came up with similar observations on scientists’ data sharing. Tim Vines, a University of British Columbia biologist, Molecular Ecology’s managing editor, and an author of the FASEB Journal piece, says that the increasing pressure to publish “has led people to publish papers they probably would not publish 30 years ago because they were not absolutely sure of the results.”

The authors of the PLOS Biology study surveyed a dozen years of journal publications, examining more than 7500 peer-reviewed papers and contacting more than 1000 authors; approximately 350 authors responded. They discovered that researchers have inconsistent standards for data sharing, that they do not adequately archive their research, and that many are dismissive of or nonresponsive to requests about their findings. But without the requested data, the PLOS Biology team notes, other researchers cannot replicate the studies.

One essential type of data for phylogenetics is alignments: software-compared sets of DNA sequences that establish genetic and evolutionary connections and are then used for constructing phylogenetic trees. Clark University postdoc biologist Romina Gazis, one of 10 coauthors of the PLOS Biology paper, says that alignments provide vital information about an experiment’s methodology, revealing what algorithms were used to process the data and how the data were manipulated. The paper’s authors found researchers especially resistant to sharing alignments. A few journals require that authors make alignments and trees available in public repositories. The PLOS Biology authors propose that all journals mandate that alignments, trees, and other essential data be made public. Moreover, they suggest that funding agencies require that the data be made public.
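In practice, an alignment is simply a set of equal-length DNA sequences into which alignment software has inserted gap characters. The short Python sketch below is purely illustrative (it is not from the PLOS Biology paper; the species names and sequences are made up) and shows the form such archived data takes, along with one simple downstream computation, pairwise sequence identity:

```python
# A toy multiple sequence alignment: equal-length sequences with "-" marking
# gaps (insertions or deletions) introduced by the alignment software.
alignment = {
    "species_A": "ATGCT-GACGT",
    "species_B": "ATGCTAGACGT",
    "species_C": "ATGGT-GA-GT",
}

def pairwise_identity(seq1, seq2):
    """Fraction of aligned positions where the two sequences share the same
    base, ignoring columns in which either sequence has a gap."""
    matches = compared = 0
    for a, b in zip(seq1, seq2):
        if a == "-" or b == "-":
            continue  # skip gapped columns
        compared += 1
        if a == b:
            matches += 1
    return matches / compared if compared else 0.0

print(round(pairwise_identity(alignment["species_A"], alignment["species_C"]), 2))
```

Without the archived alignment, a later researcher cannot recompute even this trivial statistic, let alone rebuild the published tree, which is why the authors single out alignments as essential data.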
“We are proposing that the data responsible for the conclusion of a paper be deposited in a public data repository,” says University of Florida evolutionary biologist Bryan Drew, the article’s contact author. “If you don’t have the data, you can’t reproduce a study.”

The PLOS Biology study points out why researchers might resist sharing data. Noting that not all repositories of genetic data have easy-to-use download systems, the authors propose user-friendly methods. They also note that some scientists might resent providing data for which they are not given proper credit. That resentment could be ameliorated, they suggest, by developing and standardizing a data-deposition metric that would indicate when and where a researcher’s data are used; this information could be listed on biographical documents, such as curricula vitae.

The authors’ recommendations are echoed by other scientists. “I think we need to change the paradigm as to how papers are evaluated,” Vines says. “If authors are not sufficiently confident in their results that they are withholding data, then those papers should be deemed less important by journals. ... The [paper] that makes data available is of more use to future research.”