Abstract

Background

Large multi-center, longitudinal neuroimaging projects aiming to understand neurodegenerative disorders, such as Alzheimer disease (AD), frequently generate large datasets for use by external researchers. However, maintaining a replicable data-processing workflow in an ongoing study can be difficult due to changes in personnel, computer or software upgrades, and disparate computing resources across collaborating institutions. Replicating workflows from prior studies is also important for validating and building scientific knowledge, but it can present similar challenges. Containerization is an increasingly popular solution to these issues: users package their code, software, and any dependencies together, ensuring stable and consistent output across computing platforms. The main goal of this project is to guide researchers through the steps of developing custom containers for their own needs.

Method

We demonstrate the containerization process using a neuroimaging workflow that quantifies edge density (ED), an emerging metric of white matter integrity. To calculate ED, structural and diffusion magnetic resonance images were processed using standard FreeSurfer (5.3) and FSL (6.0) procedures. Whole-brain probabilistic tractography, ED metrics, and diffusion tensor indices were computed. A template for building a container image for neuroimaging use was developed. Scripts used for the analyses were incorporated into the template and compiled as reusable container images using Docker and Apptainer. Additional scripts and documentation were released to facilitate custom implementation.

Result

Containerized tractography, ED metrics, and tensor indices were identical across multiple computing systems and in line with prior research showing that white matter integrity begins to decline before outward cognitive symptoms appear. The provided code and documentation outline the containerization process for our workflow (Figure 1) and describe how to run the resulting container image, using data from the recently updated Open Access Series of Imaging Studies 3 (OASIS-3) dataset as an example.

Conclusion

Containerizing a neuroimaging processing workflow ensures stable results across the lifespan of a study and facilitates replication of prior workflows in new datasets. The containerized ED pipeline provided here enables other researchers to replicate or extend our analyses in this cohort. The provided documentation and templates can also be adapted for use with other datasets that use common neuroimaging file formats.
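The build step described in the Method can be sketched as a minimal Dockerfile. This is a hypothetical illustration of the general approach, not the authors' released template: the base image, package names, paths, and script names below are assumptions, and FreeSurfer is omitted because its installation is subject to license terms.

```dockerfile
# Hypothetical sketch of a neuroimaging container image. Pinning the base
# image and software versions is what keeps output identical across hosts.
FROM neurodebian:bullseye

# Install FSL from the NeuroDebian repositories; FreeSurfer would be added
# similarly, subject to its license. Cleaning apt caches keeps the image small.
RUN apt-get update \
    && apt-get install -y --no-install-recommends fsl-complete \
    && rm -rf /var/lib/apt/lists/*

# FSL environment setup (paths are assumptions for this sketch)
ENV FSLDIR=/usr/share/fsl/6.0
ENV PATH="$FSLDIR/bin:$PATH"

# Copy the (hypothetical) analysis scripts into the image and run them by
# default when the container starts
COPY scripts/ /opt/ed-pipeline/
ENTRYPOINT ["/opt/ed-pipeline/run_ed_pipeline.sh"]
```

A typical use would be to build the image once with `docker build -t ed-pipeline .`, then convert it for HPC clusters that lack Docker with `apptainer build ed-pipeline.sif docker-daemon://ed-pipeline:latest`, so the same frozen environment runs on both systems.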
