Abstract

Background: There is a growing need for analyzing medical data such as brain connectomes. However, the unavailability of large-scale training samples increases the risk of model over-fitting. Recently, deep learning (DL) architectures have quickly gained momentum in synthesizing medical data. However, such frameworks are primarily designed for Euclidean data (e.g., images), overlooking geometric data (e.g., brain connectomes). The few existing geometric DL works that aimed to predict a target brain connectome from a source one primarily focused on domain alignment and were agnostic to preserving the connectome topology.

New method: To address the above limitations, we first adapt the graph translation generative adversarial network (GT GAN) architecture to brain connectomic data. Second, we extend the baseline GT GAN to a cyclic graph translation (CGT) GAN, allowing bidirectional brain network translation between the source and target views. Finally, to preserve the topological strength of brain regions of interest (ROIs), we impose a topological strength constraint on the CGT GAN learning, thereby introducing the CGTS GAN architecture.

Comparison with existing methods: We compared CGTS GAN with graph translation methods and with its ablated versions.

Results: Our deep graph network outperformed the baseline comparison method and its ablated versions in mean squared error (MSE) on a multi-view autism spectrum disorder connectomic dataset.

Conclusion: We designed a topology-aware bidirectional brain connectome synthesis framework rooted in geometric deep learning, which can be used for data augmentation in clinical diagnosis.
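The topological strength constraint mentioned above can be illustrated with a minimal sketch. This assumes the common definition of an ROI's strength as the sum of its connection weights in the weighted connectivity matrix, and an L2 penalty between real and synthesized strength profiles; the names `node_strength` and `topological_strength_loss` are hypothetical, and the paper's exact loss formulation may differ.

```python
import numpy as np

def node_strength(conn):
    """Topological strength of each ROI: the sum of the weights of its
    connections (row sums of the symmetric connectivity matrix)."""
    return conn.sum(axis=1)

def topological_strength_loss(real_conn, synth_conn):
    """Hypothetical topology-preservation penalty: mean squared difference
    between the ROI strength profiles of the real and synthesized
    connectomes (illustrative, not the paper's exact formulation)."""
    diff = node_strength(real_conn) - node_strength(synth_conn)
    return float(np.mean(diff ** 2))

# Example: a toy 3-ROI weighted connectome and a perturbed copy.
real = np.array([[0.0, 0.8, 0.2],
                 [0.8, 0.0, 0.5],
                 [0.2, 0.5, 0.0]])
synth = real * 0.9  # a synthesized connectome with slightly weaker edges
print(node_strength(real))                       # strength of each ROI
print(topological_strength_loss(real, synth))    # 0 only if strengths match
```

A penalty of this form would be added to the adversarial and cycle-consistency losses during training, encouraging the generator to reproduce each ROI's overall connectivity strength rather than only the individual edge weights.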
