Abstract

Medical physics is a scientific discipline strongly tied to the quality assurance of medical devices, the optimization of diagnostic imaging, and absorbed dose delivery in external or internal radiotherapy. These tasks share a common paradigm of data collection and treatment in a computation-heavy environment. Moreover, they are often performed within a working group, where information sharing and data reporting play a key role in the quality of the results (knowledge transmission, cross-validation, …). Replication of experimental results is a cornerstone of every so-called experimental scientific discipline. Nonetheless, experimental replication is sometimes impossible (climatology, prohibitive costs, …), and reproducibility of the results from the acquired data and their treatment is then a valuable alternative. Recently, in several scientific domains, published results from one team could not be reproduced by others; in some cases, the authors were forced to retract their publications after errors were found that led to opposite conclusions [[1] Prinz, F., et al. (2011). Believe it or not: how much can we rely on published data on potential drug targets? Nat Rev Drug Discov, 10, 712]. The reasons are multiple: spreadsheets used for computational data treatment, and manual data modifications, are among the most common sources of error. Indeed, in the 'all-computer' era, keeping a laboratory notebook is unfortunately a vanishing self-discipline, replaced by diverse binary documents (text, spreadsheets, images, …) spread over different locations on one or several hard drives. This situation rapidly becomes a headache in a collaborative environment, where keeping track of versions is a particular issue. Several researchers, aware of these problems, recently published methodological approaches for reducing the footprint of this almost exclusive (ab)use of computing tools [[2] Kitzes, J., Turek, D., & Deniz, F. (Eds.). (2018). The practice of reproducible research: Case studies and lessons from the data-intensive sciences. Oakland, CA: University of California Press]. Paradoxically, they emphasize the use of computing tools borrowed from software engineering. This presentation aims, on the one hand, to present this 'reproducible science' approach in broad lines, with a few examples for illustration, and, on the other hand, to point to valuable resources for implementing it.
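
As a minimal illustration of the kind of software-engineering habit the abstract alludes to (this sketch is not taken from the presentation itself; the file names, column name, and unit conversion are hypothetical), a data treatment step can be written as a small script instead of a manual spreadsheet edit, with input and output checksums recorded so that any rerun by a collaborator can be checked for reproducibility:

    # Minimal sketch, assuming a CSV of measurements with a "dose_gy"
    # column; names and the Gy -> cGy step are illustrative only.
    import hashlib
    import json
    from pathlib import Path

    import pandas as pd

    def sha256(path: Path) -> str:
        # Hash a file so the exact input/output versions are on record.
        return hashlib.sha256(path.read_bytes()).hexdigest()

    def treat(raw_csv: Path, out_csv: Path, log_json: Path) -> None:
        # The whole treatment lives in code, not in ad-hoc cell edits.
        df = pd.read_csv(raw_csv)
        df["dose_cgy"] = df["dose_gy"] * 100  # hypothetical conversion
        df.to_csv(out_csv, index=False)
        # Provenance record: anyone rerunning the script can compare
        # these hashes to detect divergent inputs or results.
        log_json.write_text(json.dumps({
            "input": {"file": raw_csv.name, "sha256": sha256(raw_csv)},
            "output": {"file": out_csv.name, "sha256": sha256(out_csv)},
        }, indent=2))

    if __name__ == "__main__":
        treat(Path("measurements.csv"), Path("treated.csv"),
              Path("provenance.json"))

Kept under version control alongside the provenance record, such a script documents every treatment step and eases the version-tracking headache mentioned above.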
