Abstract
Decoding brain states under different task conditions from task functional magnetic resonance imaging (tfMRI) data has attracted increasing attention in neuroimaging studies. Although various methods have been developed, existing approaches do not fully consider the temporal dependencies between adjacent fMRI data points, which limits model performance. In this paper, we propose a novel group deep bidirectional recurrent neural network (Group-DBRNN) model for decoding task sub-type states from individual fMRI volume data points. Specifically, we employ a bidirectional recurrent neural network layer to effectively characterize temporal dependency features from both directions. We further develop a multi-task interaction layer (MTIL) to capture the latent temporal dependencies of brain sub-type states under different tasks. In addition, we modify the training strategy to train the classification model on group data for the individual task; the basic idea is that related tfMRI data may provide external information for brain decoding. The proposed Group-DBRNN model has been tested on the task fMRI datasets of the HCP 900-subject release: the average classification accuracy over 24 sub-type brain states reaches 91.34%, and the average seven-task classification accuracy is 95.55%, significantly higher than that of other state-of-the-art methods. Extensive experimental results demonstrate the superiority of the proposed Group-DBRNN model in automatically learning discriminative representation features and effectively distinguishing brain sub-type states across different task fMRI datasets.

Keywords: Brain decoding; Brain states; RNN; Deep learning; Functional magnetic resonance imaging
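The bidirectional recurrent layer described above processes the fMRI time series in both temporal directions and combines the two hidden-state sequences, so the feature for each volume reflects both preceding and following context. The abstract does not give implementation details, so the following is only a minimal NumPy sketch of that idea under assumed dimensions (sequence length, input size, and hidden size are illustrative, not from the paper):

```python
import numpy as np

def rnn_pass(x, W_in, W_h, h0):
    """Run a simple tanh RNN over a sequence; return one hidden state per step."""
    h = h0
    states = []
    for t in range(x.shape[0]):
        h = np.tanh(x[t] @ W_in + h @ W_h)
        states.append(h)
    return np.stack(states)

def bidirectional_features(x, params_fwd, params_bwd):
    """Concatenate forward and backward hidden states at each time step,
    so every volume's feature vector sees both past and future context."""
    fwd = rnn_pass(x, *params_fwd)
    bwd = rnn_pass(x[::-1], *params_bwd)[::-1]  # reverse, run, re-reverse
    return np.concatenate([fwd, bwd], axis=1)

# Illustrative shapes only: T volumes, d_in features per volume, d_h hidden units.
rng = np.random.default_rng(0)
T, d_in, d_h = 5, 8, 4
x = rng.standard_normal((T, d_in))
params_fwd = (rng.standard_normal((d_in, d_h)) * 0.1,
              rng.standard_normal((d_h, d_h)) * 0.1,
              np.zeros(d_h))
params_bwd = (rng.standard_normal((d_in, d_h)) * 0.1,
              rng.standard_normal((d_h, d_h)) * 0.1,
              np.zeros(d_h))
feats = bidirectional_features(x, params_fwd, params_bwd)
print(feats.shape)  # (5, 8): forward and backward states concatenated per volume
```

A per-volume classifier (as in the sub-type state decoding task) would then map each concatenated feature vector to a state label; the paper's actual model additionally stacks such layers and adds the MTIL, which this sketch omits.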