Abstract
Fiber tractography aims to reconstruct white matter (WM) connections in the brain. Challenges in these reconstructions include estimating fiber orientations in regions with multiple fiber populations, and quantifying the uncertainty in those orientations caused by noise. In this work, we use a range of multi-tensor models to cope with crossing fibers. The uncertainty in fiber orientation is captured using the Cramér-Rao lower bound. Furthermore, model selection is performed based on model complexity and goodness of fit. The performance of the framework on the FiberCup phantom and on human data was compared to the open-source diffusion MRI toolkit Camino over a range of SNRs. Performance was quantified using the Tractometer measures on the FiberCup phantom and by comparing streamline counts of the lateral projections of the corpus callosum (CC) in the human data. On the phantom data, the comparison showed that our method performs similarly to Camino in crossing-fiber regions, while performing better in a region with kissing fibers (median angular error of 0.73° vs. 2.7°; valid connections of 57% vs. 21% when seeding in the corresponding region of interest). Furthermore, the number of streamlines in the lateral projections was higher with our method (a 19–89% increase, depending on the subject). Altogether, our method outperforms the reference method on both phantom and human data, allowing for in vivo probabilistic multi-fiber tractography with an objective model selection procedure.
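To illustrate the two ingredients named in the abstract, the sketch below shows, under simplifying assumptions, how a Cramér-Rao lower bound on tensor-model parameters and an information-criterion-based model selection could be computed. This is not the paper's implementation: the single-tensor signal model, the Gaussian-noise assumption, and all function names (signal_single_tensor, crlb, bic) are hypothetical, and the paper's model selection may use a different complexity/goodness-of-fit criterion.

```python
# Illustrative sketch only, not the authors' method.
# Assumes additive Gaussian noise and an axially symmetric single-tensor model.
import numpy as np

def signal_single_tensor(theta, bvals, bvecs):
    """Mono-exponential signal S = S0 * exp(-b * g^T D g) for an axially
    symmetric tensor parameterized as theta = (S0, d_par, d_perp, phi, psi)."""
    S0, d_par, d_perp, phi, psi = theta
    # Principal fiber direction from spherical angles (phi, psi).
    mu = np.array([np.sin(phi) * np.cos(psi),
                   np.sin(phi) * np.sin(psi),
                   np.cos(phi)])
    proj = bvecs @ mu                        # cosine between gradient and fiber
    adc = d_perp + (d_par - d_perp) * proj**2
    return S0 * np.exp(-bvals * adc)

def crlb(theta, bvals, bvecs, sigma, model=signal_single_tensor, eps=1e-6):
    """Cramér-Rao lower bound on the variance of each parameter, assuming
    i.i.d. Gaussian noise with standard deviation sigma."""
    theta = np.asarray(theta, dtype=float)
    p = theta.size
    J = np.zeros((len(bvals), p))
    for i in range(p):                       # numerical Jacobian of the model
        step = np.zeros(p); step[i] = eps
        J[:, i] = (model(theta + step, bvals, bvecs)
                   - model(theta - step, bvals, bvecs)) / (2 * eps)
    fisher = J.T @ J / sigma**2              # Fisher information (Gaussian noise)
    return np.diag(np.linalg.inv(fisher))    # lower bounds on parameter variances

def bic(residuals, n_params):
    """Bayesian information criterion for a least-squares fit; lower is better.
    Comparing BIC of single- vs. multi-tensor fits trades goodness of fit
    against model complexity."""
    n = residuals.size
    rss = np.sum(residuals**2)
    return n * np.log(rss / n) + n_params * np.log(n)
```

In such a scheme, a voxel would be fitted with one-, two-, and three-tensor models, the model with the lowest criterion value retained, and the CRLB of its orientation parameters used to set the spread of the sampled fiber directions during probabilistic tracking.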