Abstract
Recently proposed tractography and connectomics approaches often require a very large number of streamlines, on the order of millions. Generating, storing and interacting with these datasets is currently difficult, since they require substantial memory and processing time. Compression is a common way to reduce data size; one such approach, proposed recently, removes collinear points from the streamlines. However, removing points from streamlines produces files that cannot be robustly post-processed or interacted with by existing tools, which are for the most part point-based. The aim of this work is to improve visualization, interaction and tractometry algorithms so that they robustly handle compressed tractography datasets. Our proposed improvements are threefold: (i) an efficient loading procedure to improve visualization (reducing memory usage by up to 95% for a 0.2 mm step size); (ii) interaction techniques robust to compressed tractograms; (iii) tractometry techniques robust to compressed tractograms, eliminating biases in tract-based statistics. The present work demonstrates the need to handle compressed streamlines correctly in order to avoid biases in future tractometry and connectomics studies.
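The compression approach mentioned above replaces runs of quasi-collinear points with their endpoints. As a rough illustration of the idea only (the actual algorithm of Presseau et al., 2015 may differ in its details, and the function names and greedy strategy below are assumptions), the following Python sketch drops every point that stays within a maximum perpendicular distance, such as the 0.2 mm linearization threshold used in this work, of the chord joining the last kept point to the current candidate:

    import numpy as np

    def _point_segment_dist(p, a, b):
        """Perpendicular distance from point p to the segment [a, b]."""
        ab = b - a
        denom = float(np.dot(ab, ab))
        if denom == 0.0:
            return float(np.linalg.norm(p - a))
        t = np.clip(np.dot(p - a, ab) / denom, 0.0, 1.0)
        return float(np.linalg.norm(p - (a + t * ab)))

    def compress_streamline(points, max_dist=0.2):
        """Greedily drop points that stay within max_dist (mm) of the
        chord joining the last kept point and the current candidate."""
        points = np.asarray(points, dtype=np.float32)
        if len(points) < 3:
            return points
        kept = [0]
        anchor = 0
        for j in range(2, len(points)):
            # Test every intermediate point against the chord anchor -> j.
            chord_ok = all(
                _point_segment_dist(points[i], points[anchor], points[j]) <= max_dist
                for i in range(anchor + 1, j)
            )
            if not chord_ok:
                kept.append(j - 1)   # chord broke: keep the previous point
                anchor = j - 1
        kept.append(len(points) - 1)  # endpoints are always preserved
        return points[kept]

Run on a straight segment, this routine keeps only the two endpoints; any bend sharper than the threshold forces an extra point to be retained. The check is quadratic in the worst case, so a production implementation would be more careful, but the output illustrates why compressed streamlines have irregular, non-constant step sizes, which is precisely what breaks point-based tools.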
Highlights
Diffusion magnetic resonance imaging tractography has greatly helped to advance the understanding of brain structural connectivity (Clayden, 2013), as it is the only non-invasive technique used to reconstruct white matter pathways (Catani and de Schotten, 2012).
For an analysis offering a fair comparison to deterministic tracking, a 0.2 mm linearization threshold was employed for probabilistic streamlines.
Since the maximum linearization distance has only a small effect on the compression rate, the same value can be used for deterministic and probabilistic tractograms.
Summary
Diffusion magnetic resonance imaging (dMRI) tractography has greatly helped to advance the understanding of brain structural connectivity (Clayden, 2013), as it is the only non-invasive technique used to reconstruct white matter pathways (Catani and de Schotten, 2012). Tractography reconstruction algorithms produce outputs of hundreds of thousands to tens of millions of streamlines. The term streamline designates the contiguous set of 3D points produced by a tractography algorithm, and the large set of streamlines generated from a single subject is called a tractogram. A typical dataset, generated using high angular resolution diffusion imaging (HARDI) tractography with a 0.5 mm step size and containing 700 k streamlines, is ∼1 gigabyte (GB) on disk. When loaded into random access memory (RAM) by a visualization tool, RAM usage can reach up to 7 GB, depending on the datatype and internal data structures (Presseau et al., 2015).
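To see why RAM usage can exceed the on-disk size by such a margin, a back-of-the-envelope estimate is useful. The short Python sketch below counts only the raw coordinates, assuming a mean streamline length of 100 mm (an illustrative value, not a figure from the paper); per-streamline objects, indexing structures and a wider datatype account for the remaining overhead:

    # Back-of-the-envelope memory estimate for a 700 k streamline tractogram.
    n_streamlines = 700_000
    step_size_mm = 0.5
    mean_length_mm = 100.0                               # assumed value
    pts_per_streamline = mean_length_mm / step_size_mm   # 200 points
    coords, f32, f64 = 3, 4, 8                           # xyz; bytes per float

    raw_f32 = n_streamlines * pts_per_streamline * coords * f32
    raw_f64 = n_streamlines * pts_per_streamline * coords * f64
    print(f"float32 coordinates alone: {raw_f32 / 1e9:.2f} GB")  # ~1.68 GB
    print(f"float64 coordinates alone: {raw_f64 / 1e9:.2f} GB")  # ~3.36 GB

Storing coordinates in double precision already more than triples the on-disk footprint, and per-streamline container overhead in a point-based data structure can push the total toward the 7 GB reported above.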