Abstract
Detecting Alzheimer's disease (AD) accurately at an early stage is critical for planning and implementing disease-modifying treatments that can help prevent progression to severe stages of the disease. In the existing literature, diagnostic test scores and clinical status are reported only at specific time points, which makes predicting disease progression a significant challenge. Moreover, few studies use longitudinal data to build deep-learning models for AD detection, and the existing models are not stable enough to be relied upon in real medical settings because they lack adaptive training and testing. We aim to predict an individual's diagnostic status for the next six years in an adaptive manner, where prediction performance improves with the number of patient visits. This study presents a Sequence-Length Adaptive Encoder-Decoder Long Short-Term Memory (SLA-ED LSTM) deep-learning model on longitudinal data obtained from the Alzheimer's Disease Neuroimaging Initiative archive. In the suggested approach, the decoder LSTM dynamically adjusts to variations in training sequence length and inference length rather than being constrained to a fixed length. We evaluated model performance across sequence lengths and found that, for inference length one, sequence length nine gives the highest average test accuracy and area under the receiver operating characteristic curve, 0.920 and 0.982, respectively. This insight suggests that data from nine visits effectively captures meaningful cognitive status changes and is adequate for accurate model training. We conducted a comparative analysis of the proposed model against state-of-the-art methods, revealing a significant improvement in disease progression prediction over previous methods.
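To make the encoder-decoder idea concrete, the following is a minimal sketch of a sequence-length-adaptive encoder-decoder LSTM, loosely following the abstract's description. It is not the authors' implementation: the layer sizes, feature count, class count, and feedback scheme are illustrative assumptions.

```python
# Hypothetical sketch of a sequence-length-adaptive encoder-decoder LSTM.
# Hyperparameters (n_features, hidden_size, n_classes) are illustrative,
# not the values used in the paper.
import torch
import torch.nn as nn


class SLAEncoderDecoderLSTM(nn.Module):
    def __init__(self, n_features=10, hidden_size=64, n_classes=3):
        super().__init__()
        self.encoder = nn.LSTM(n_features, hidden_size, batch_first=True)
        self.decoder = nn.LSTM(n_classes, hidden_size, batch_first=True)
        self.classifier = nn.Linear(hidden_size, n_classes)
        self.n_classes = n_classes

    def forward(self, visits, inference_len=1):
        # visits: (batch, seq_len, n_features); seq_len may vary between
        # batches, so the model is not tied to a fixed number of visits.
        _, (h, c) = self.encoder(visits)

        # Decode one future visit at a time, feeding each predicted
        # diagnostic distribution back in, for as many steps as requested.
        batch = visits.size(0)
        step = torch.zeros(batch, 1, self.n_classes, device=visits.device)
        outputs = []
        for _ in range(inference_len):
            out, (h, c) = self.decoder(step, (h, c))
            logits = self.classifier(out)          # (batch, 1, n_classes)
            outputs.append(logits)
            step = torch.softmax(logits, dim=-1)
        return torch.cat(outputs, dim=1)           # (batch, inference_len, n_classes)


# Example: nine past visits, predict the diagnostic status at the next visit.
model = SLAEncoderDecoderLSTM()
history = torch.randn(4, 9, 10)                    # batch of 4 patients
predictions = model(history, inference_len=1)
print(predictions.shape)                           # torch.Size([4, 1, 3])
```

The key design point mirrored here is that both the number of past visits consumed by the encoder and the number of future visits produced by the decoder are runtime arguments rather than fixed architectural constants.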