Abstract

Autism Spectrum Disorder (ASD) is a widely prevalent neurodevelopmental disorder characterized by impaired social interaction and communication and by restricted, repetitive behavior. In recent years, there has been growing interest in distinguishing individuals with ASD from typically developing (TD) individuals on the basis of brain imaging data such as MRI. Although both traditional machine learning and recent deep learning methods have achieved promising performance, classification accuracy remains far from satisfactory due to large individual differences and/or heterogeneity among data from different sites. To help address this problem, we proposed a novel Attention-based Node-Edge Graph Convolutional Network (ANEGCN) to distinguish individuals with ASD from TD individuals. Specifically, it simultaneously models the features of nodes and edges in the graphs and combines multi-modal MRI data, including structural MRI and resting-state functional MRI, in order to exploit both structural and functional information for feature extraction and classification. Moreover, an adversarial learning strategy was used to enhance the model's generalizability, and a gradient-based interpretability method was applied to identify the brain regions and connections contributing to the classification. On the worldwide Autism Brain Imaging Data Exchange I (ABIDE I) dataset with 1007 subjects from 17 sites, the proposed ANEGCN achieved superior classification accuracy (72.7%) and better generalizability compared with other state-of-the-art models. This study provides a powerful tool for ASD identification.
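The abstract does not spell out how node and edge features are jointly modeled, so the following is only a minimal, hypothetical sketch of one possible node-edge graph convolution layer with edge-conditioned attention, written in PyTorch. The class name NodeEdgeConv, the layer dimensions, and the update rules are illustrative assumptions, not the authors' ANEGCN architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class NodeEdgeConv(nn.Module):
    """Hypothetical node-edge graph convolution with edge-based attention."""

    def __init__(self, node_dim, edge_dim):
        super().__init__()
        self.node_lin = nn.Linear(node_dim, node_dim)
        self.edge_lin = nn.Linear(2 * node_dim + edge_dim, edge_dim)
        self.attn = nn.Linear(edge_dim, 1)

    def forward(self, x, e, adj):
        # x:   (N, node_dim)      node features (e.g., regional sMRI measures)
        # e:   (N, N, edge_dim)   edge features (e.g., functional connectivity)
        # adj: (N, N)             binary adjacency, e.g., thresholded connectivity
        n = x.size(0)
        # Update each edge from its two endpoint nodes and its current state.
        xi = x.unsqueeze(1).expand(n, n, -1)   # sender node features
        xj = x.unsqueeze(0).expand(n, n, -1)   # receiver node features
        e_new = F.relu(self.edge_lin(torch.cat([xi, xj, e], dim=-1)))
        # Attention weights over existing connections, derived from edge features.
        scores = self.attn(e_new).squeeze(-1)                 # (N, N)
        scores = scores.masked_fill(adj == 0, float('-inf'))
        alpha = torch.softmax(scores, dim=-1)
        alpha = torch.nan_to_num(alpha)                       # rows with no neighbors
        # Attention-weighted aggregation of neighbor node features.
        x_new = F.relu(self.node_lin(alpha @ x))
        return x_new, e_new
```

In this sketch, stacking such layers and pooling the node and edge representations into a subject-level classifier would yield an ASD/TD prediction; the adversarial site-invariance objective and the gradient-based interpretability step mentioned in the abstract are separate components not shown here.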
