Abstract

Alzheimer's Disease (AD) is an incurable neurodegenerative disorder that affects millions of older people worldwide. Neuroimaging modalities combined with machine learning techniques detect the onset of AD more reliably than conventional methods, and multimodal classification has been shown to provide better accuracy than single-modal classification. However, exploiting the synergy between multiple neuroimaging modalities remains challenging due to the scarcity of effective fusion techniques. The proposed Deep Multimodal Squeeze and Excitation network (DMSENet) uses the ResNet Squeeze and Excitation (SE) block to extract relevant features from MRI and PET images. The extracted features are combined with a hierarchical fusion method; the ResNet SE block then retrieves additional higher-level and lower-level features from the fused Feature Map (FM). The hierarchical fusion methodology ensures the efficiency of Multimodal Fusion (MMF), while the Attention Model (AM) assigns the fusion ratio automatically by prioritizing the multimodal data. Moreover, the depth and efficiency of the attention network are ensured by combining identity mapping with Residual Blocks. By employing both early and late fusion, the DMSENet can utilize higher-level and lower-level features simultaneously. The proposed framework is evaluated on the ADNI dataset and achieves greater precision than state-of-the-art methods.
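To make the building blocks named in the abstract concrete, the sketch below shows a residual SE block with an identity shortcut and an attention-weighted fusion of MRI and PET feature maps. It is a minimal illustration assuming a PyTorch implementation; the layer sizes, the gating rule, and all class and variable names are illustrative assumptions, not the authors' architecture.

```python
# Minimal sketch of a residual SE block and attention-weighted MRI/PET fusion.
# Hypothetical layer sizes and names; not the DMSENet reference implementation.
import torch
import torch.nn as nn


class SEResidualBlock(nn.Module):
    """Residual block with identity mapping and channel-wise SE recalibration."""
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU(inplace=True)
        # Squeeze: global average pooling; Excitation: bottleneck + sigmoid gate.
        self.se = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),
            nn.Sigmoid(),
        )

    def forward(self, x):
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        out = out * self.se(out)      # re-weight channels by learned importance
        return self.relu(out + x)     # identity shortcut keeps the block residual


class AttentionFusion(nn.Module):
    """Learns a per-channel fusion ratio between MRI and PET feature maps."""
    def __init__(self, channels: int):
        super().__init__()
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(2 * channels, channels, 1),
            nn.Sigmoid(),
        )

    def forward(self, f_mri, f_pet):
        alpha = self.gate(torch.cat([f_mri, f_pet], dim=1))  # fusion ratio in [0, 1]
        return alpha * f_mri + (1.0 - alpha) * f_pet


if __name__ == "__main__":
    mri = torch.randn(2, 64, 32, 32)   # toy MRI feature map
    pet = torch.randn(2, 64, 32, 32)   # toy PET feature map
    block = SEResidualBlock(64)
    fuse = AttentionFusion(64)
    fused = fuse(block(mri), block(pet))
    print(fused.shape)                 # torch.Size([2, 64, 32, 32])
```

In this reading, the sigmoid gate plays the role of the Attention Model that assigns the fusion ratio automatically, and the fused map could be passed through further SE residual blocks to extract higher-level features, as the abstract describes for the hierarchical fusion stage.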
