Abstract

The early stage of Alzheimer's disease (AD) is called mild cognitive impairment (MCI). According to whether the disease eventually progresses, MCI can be divided into progressive MCI (pMCI) and stable MCI (sMCI). Clinical experience shows that treatment at the MCI stage can effectively delay disease progression and even lead to complete recovery. Therefore, accurate diagnosis of AD and its early stage plays an important guiding role in treating AD and delaying its progression. Previous studies have shown that data from different modalities reflect different pathological changes of AD; consequently, combining multi-modality data yields higher AD classification accuracy than using a single modality. Based on this, an AD diagnosis method is proposed using 3D discrete wavelet transform (3D-DWT) and 3D moment invariant (3D-MI) features extracted from multi-modality images. First, the Automated Anatomical Labeling (AAL) atlas is used to identify brain regions of interest (ROIs) in magnetic resonance imaging (MRI) and positron emission tomography (PET) scans. Next, 3D-DWT and 3D-MI features are extracted from each ROI. Finally, a deep neural network with stacked autoencoders (SAE) is trained to detect AD. Experimental results show that, compared with state-of-the-art AD/MCI classification algorithms based on multi-modality features, this method significantly improves performance on the pMCI versus sMCI classification task, which plays an important guiding role in the early diagnosis of AD.
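
The following is a minimal sketch of the per-ROI feature-extraction step described above, not the authors' implementation: for one ROI of a co-registered MRI or PET volume, a single-level 3D DWT is computed and each subband is summarized. The ROI masking, the `db1` wavelet, the subband-energy summary, and the toy data are illustrative assumptions.

```python
import numpy as np
import pywt  # PyWavelets provides the n-dimensional DWT used here


def roi_dwt_features(volume: np.ndarray, roi_mask: np.ndarray, wavelet: str = "db1") -> np.ndarray:
    """Return per-subband energies of a single-level 3D DWT over one ROI (illustrative summary)."""
    # Crop to the ROI bounding box and zero out voxels outside the ROI.
    coords = np.argwhere(roi_mask)
    lo, hi = coords.min(axis=0), coords.max(axis=0) + 1
    patch = np.where(roi_mask, volume, 0.0)[lo[0]:hi[0], lo[1]:hi[1], lo[2]:hi[2]]

    # Single-level 3D DWT: eight subbands keyed 'aaa', 'aad', ..., 'ddd'.
    coeffs = pywt.dwtn(patch, wavelet)
    # Summarize each subband by its energy (an assumed summary statistic).
    return np.array([np.sum(c.astype(np.float64) ** 2) for _, c in sorted(coeffs.items())])


# Toy usage: random arrays stand in for a registered brain volume and an AAL ROI mask.
rng = np.random.default_rng(0)
vol = rng.normal(size=(32, 32, 32))
mask = np.zeros(vol.shape, dtype=bool)
mask[8:24, 8:24, 8:24] = True
print(roi_dwt_features(vol, mask).shape)  # (8,): one energy per subband
```

In the pipeline described in the abstract, such per-ROI features (together with the 3D-MI features) from both MRI and PET would be concatenated and fed to the SAE-based classifier.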
