Abstract

The processing of brain diffusion tensor imaging (DTI) data for large cohort studies requires fully automatic pipelines that perform quality control (QC) and artifact/outlier removal on the raw DTI data prior to calculation of diffusion parameters. In this study, three automatic DTI processing pipelines, each complying with the general ENIGMA framework, were designed by uniquely combining multiple image processing software tools. Different QC procedures based on the RESTORE algorithm, the DTIPrep protocol, and a combination of both methods were compared using simulated ground truth and artifact-containing DTI datasets modeling eddy-current-induced distortions, various levels of motion artifacts, and thermal noise. Variability was also examined in 20 DTI datasets acquired in subjects with vascular cognitive impairment (VCI) from the multi-site Ontario Neurodegenerative Disease Research Initiative (ONDRI). The mean fractional anisotropy (FA), mean diffusivity (MD), axial diffusivity (AD), and radial diffusivity (RD) were calculated in global brain grey matter (GM) and white matter (WM) regions. For the simulated DTI datasets, pipeline performance was evaluated as the normalized difference between the mean DTI metrics measured in GM and WM regions and the corresponding ground truth DTI values. The performance of the proposed pipelines was very similar, particularly for FA measurements. However, the pipeline based on the RESTORE algorithm was the most accurate when analyzing the artifact-containing DTI datasets. The pipeline that combined the DTIPrep protocol and the RESTORE algorithm produced the lowest standard deviation in FA measurements in normal-appearing WM across subjects. We concluded that this pipeline was the most robust and is preferred for automated analysis of multi-site brain DTI data.
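
As a point of reference for the metrics and the evaluation measure described above, the following is a minimal NumPy sketch. The scalar metric definitions (FA, MD, AD, and RD from the sorted tensor eigenvalues) are standard; the normalized-difference function assumes a (measured − ground truth) / ground truth form, which may differ in detail from the exact normalization used in the study.

```python
import numpy as np

def dti_scalars(l1, l2, l3):
    """Standard scalar DTI metrics from the sorted tensor eigenvalues (l1 >= l2 >= l3)."""
    md = (l1 + l2 + l3) / 3.0                      # mean diffusivity
    ad = l1                                        # axial diffusivity
    rd = (l2 + l3) / 2.0                           # radial diffusivity
    fa = np.sqrt(1.5 * ((l1 - md) ** 2 + (l2 - md) ** 2 + (l3 - md) ** 2)
                 / (l1 ** 2 + l2 ** 2 + l3 ** 2))  # fractional anisotropy
    return fa, md, ad, rd

def normalized_difference(measured_mean, ground_truth_mean):
    """Normalized difference between a pipeline's mean regional DTI metric and the
    corresponding ground truth value (assumed form; the study may normalize differently)."""
    return (measured_mean - ground_truth_mean) / ground_truth_mean
```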

Highlights

  • Diffusion tensor imaging (DTI) is a well-established magnetic resonance imaging (MRI) technique, sensitive to the microstructural organization of cerebral tissue constituents [1,2]

  • We present, validate, and compare three fully automatic ENIGMA-based pipelines for processing brain DTI data by effectively connecting multiple image processing tools

  • All three pipelines successfully produced parametric maps of fractional anisotropy (FA), mean diffusivity (MD), axial diffusivity (AD), and radial diffusivity (RD) from the raw data without noise and artifacts; these maps were comparable to the ground truth DTI metric maps obtained directly from ground truth data through tensor fitting without any further processing (Fig 2)
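
As an illustration of the RESTORE-based tensor fitting referred to in the abstract, the sketch below uses DIPY's TensorModel with fit_method="RESTORE" to produce FA, MD, AD, and RD maps. The actual pipelines chain several software tools, so this is only one plausible implementation; the file names and the noise level sigma are placeholders.

```python
from dipy.core.gradients import gradient_table
from dipy.io.gradients import read_bvals_bvecs
from dipy.io.image import load_nifti
from dipy.reconst.dti import TensorModel

# Placeholder inputs -- substitute the study's preprocessed DWI and gradient files.
data, affine = load_nifti("dwi.nii.gz")
bvals, bvecs = read_bvals_bvecs("dwi.bval", "dwi.bvec")
gtab = gradient_table(bvals, bvecs)

# RESTORE rejects outlier measurements during fitting and needs a noise estimate;
# a single placeholder sigma is used here for simplicity.
model = TensorModel(gtab, fit_method="RESTORE", sigma=20.0)
fit = model.fit(data)

fa_map, md_map, ad_map, rd_map = fit.fa, fit.md, fit.ad, fit.rd  # parametric maps
```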

Introduction

Diffusion tensor imaging (DTI) is a well-established magnetic resonance imaging (MRI) technique, sensitive to the microstructural organization of cerebral tissue constituents [1,2]. Quality control (QC) procedures to remove outliers (e.g. distorted gradient volumes) from the analysis, modify voxel-wise diffusion tensor components, and harmonize data have been identified as necessary steps in DTI analysis pipelines [5,8,9,10,12,13,14,15,16]. The latter step is particularly relevant when analyzing large multi-site and/or multi-scanner DTI datasets [9,15,16,17], including those from the UK Biobank project [5,18,19,20] and the Ontario Neurodegenerative Disease Research Initiative (ONDRI) [21].
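
The first of these QC steps, removing flagged gradient volumes, amounts to dropping the corresponding entries from the DWI series and gradient table before tensor fitting. A minimal sketch is given below; the indices of rejected volumes are assumed to come from a QC tool such as DTIPrep, whose flagging step is not shown.

```python
import numpy as np

def drop_rejected_volumes(data, bvals, bvecs, rejected):
    """Remove gradient volumes flagged as outliers (e.g. by DTIPrep) prior to tensor
    fitting. data: 4D DWI array (x, y, z, volume); bvals: (N,); bvecs: (N, 3);
    rejected: iterable of volume indices to discard."""
    keep = np.setdiff1d(np.arange(data.shape[-1]), np.asarray(list(rejected)))
    return data[..., keep], bvals[keep], bvecs[keep]
```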
