Abstract

Image data are universal in life sciences research. Their proper handling is not. A significant proportion of image data in research papers show signs of mishandling that undermine their interpretation. We propose that a precise description of the image processing and analysis applied is required to address this problem. A new norm for reporting reproducible image analyses will diminish mishandling, as it will alert co‐authors, referees, and journals to aberrant image data processing or, if published nonetheless, it will document it to the reader. To promote this norm, we discuss the effectiveness of this approach and give some step‐by‐step instructions for publishing reproducible image data processing and analysis workflows.

Highlights

  • We propose that a precise description of the image processing and analysis applied is required to address this problem

  • Among 99 biomedical papers labeled with an “Editorial expression of concern”, 40% had issues with image data (Vaught et al, 2017). 760,000 papers sampled from PubMed Open Access

Summary

Methods reproducibility

This is the original meaning of reproducibility (Claerbout & Karrenbach, 1992): the ability to obtain the exact same results by implementing procedures using the same data and tools.

Panels D and E are again 8-bit images, created by letting ImageJ/Fiji automatically determine their individual display ranges and then making the conversion, mimicking the procedure used when two original images are opened and converted independently of each other. This result is similar, but not generally identical, to what we would find if we had applied auto-contrast or histogram normalization to the two cropped images independently of each other (Fig 1); a minimal code sketch of this effect follows the table below.

We explain practical steps that can be taken to publish reproducible image analysis workflows. These recommendations are what we currently think are best in terms of openness, popularity, and accessibility, but we welcome criticism, suggestions, and advice through discussions in the online forum.

Degree to which analysis tools support reproducibility:

A: Highest degree of reproducibility supported
Description (fulfills all or most): widespread; free; GUI workflows recordable; full scripting capability in one or more common programming languages
Examples: Fiji/ImageJ (Schindelin et al, 2012); ICY (de Chaumont et al, 2012); CellProfiler (McQuin et al, 2018); ilastik (Berg et al, 2019); QuPath (Bankhead et al, 2017); Python; R (Ripley, 2001); fully documented and self-contained code in a public repository

C: Not or almost not supportive of reproducibility
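
The effect described above can be made concrete with a short, self-contained Python sketch (NumPy only). The ramp image, crop positions, and pixel coordinates are illustrative assumptions, not the data of Fig 1; the point is only that converting two crops to 8-bit with individually determined display ranges maps the same raw intensity to different gray values, whereas a common display range keeps them comparable.

```python
import numpy as np

def to_8bit(img, lo=None, hi=None):
    """Scale an image to 8-bit over the display range [lo, hi].
    If lo/hi are omitted, they are taken from the image itself,
    mimicking an automatic, per-image display-range choice."""
    lo = float(img.min()) if lo is None else float(lo)
    hi = float(img.max()) if hi is None else float(hi)
    scaled = (img.astype(np.float64) - lo) / (hi - lo)
    return (np.clip(scaled, 0.0, 1.0) * 255).round().astype(np.uint8)

# Synthetic 16-bit "original": an intensity ramp, so that different regions
# of the image cover different intensity ranges (illustrative stand-in only).
yy, xx = np.mgrid[0:256, 0:256]
original = ((xx + yy) * 6 + 100).astype(np.uint16)

# Two overlapping crops that both contain the raw value 6*300 + 100 = 1900,
# at global positions (150, 150) and (50, 250).
crop_a = original[:200, :200]   # contains (150, 150)
crop_b = original[50:, 50:]     # contains (50, 250) -> local index (0, 200)

# Independent conversion: each crop gets its own, automatically chosen range.
val_a_indep = to_8bit(crop_a)[150, 150]
val_b_indep = to_8bit(crop_b)[0, 200]

# Common conversion: both crops share the display range of the full image.
lo, hi = original.min(), original.max()
val_a_common = to_8bit(crop_a, lo, hi)[150, 150]
val_b_common = to_8bit(crop_b, lo, hi)[0, 200]

print("raw value:", int(original[150, 150]), "and", int(original[50, 250]))
print("independent ranges ->", int(val_a_indep), "vs", int(val_b_indep))
print("common range       ->", int(val_a_common), "and", int(val_b_common))
```

The same consideration applies whether the conversion uses ImageJ/Fiji's automatic display range, auto-contrast, or histogram normalization: any per-image choice of range changes how gray values compare across images unless it is documented and applied consistently.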
Use Web Services
Use Notebooks
Outline of the workflow
Instructions for reproducing the workflow
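
To illustrate the kind of “fully documented and self-contained code” listed in the table above, here is a minimal Python sketch of an analysis that could live in a notebook or a script in a public repository: it records the package versions it was run with, generates its own test image so it can be rerun anywhere, and states every parameter explicitly in the code. The synthetic image, the Otsu threshold choice, and the measured quantities are assumptions for demonstration only, not the authors' workflow.

```python
"""Minimal, self-contained segmentation example (illustrative only).

To reproduce: run this script or notebook cell top to bottom with the
package versions printed below; no external data are needed because the
input image is generated inside the script.
"""
import numpy as np
import skimage
from skimage.filters import gaussian, threshold_otsu
from skimage.measure import label, regionprops

# Record the exact software versions used (methods reproducibility).
print("numpy", np.__version__, "| scikit-image", skimage.__version__)

# Fixed seed: the synthetic image, and hence every number printed below,
# is identical on every run.
rng = np.random.default_rng(42)

# Synthetic "fluorescence" image: three bright blobs on a noisy background.
image = rng.normal(100.0, 10.0, size=(256, 256))
yy, xx = np.mgrid[0:256, 0:256]
for cy, cx in [(60, 60), (60, 190), (190, 120)]:
    image += 150.0 * np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / (2 * 15.0 ** 2))

# Every processing parameter is stated explicitly in the code.
SMOOTH_SIGMA = 2.0
smoothed = gaussian(image, sigma=SMOOTH_SIGMA, preserve_range=True)
threshold = threshold_otsu(smoothed)
objects = label(smoothed > threshold)

areas = [region.area for region in regionprops(objects)]
print(f"threshold = {threshold:.1f}, objects = {len(areas)}, "
      f"mean area = {np.mean(areas):.1f} px")
```

Publishing such a script alongside the raw data and the exact version information lets readers and referees rerun the analysis unchanged, which is the methods reproducibility defined above.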
Conclusion