Abstract

This work proposes a decision-aid tool for detecting Alzheimer’s disease (AD) at an early stage, based on the Archimedes spiral task executed on a Wacom digitizer. We assess the potential of the task as a dynamic gesture and define the most pertinent methodology for exploiting transfer learning to compensate for sparse data. Kinematic time functions are embedded directly in spiral trajectory images, and transfer learning is used to perform automatic feature extraction on such images. Experiments on 30 AD patients and 45 healthy controls (HC) show that the extracted features yield a significant improvement in sensitivity and accuracy compared to raw images. We study at which level of the deep network the features have the highest discriminative power; results show that intermediate-level features are the best for our specific task. Decision fusion of experts trained on such descriptors outperforms low-level fusion of hybrid images. When fusing the decisions of classifiers trained on the best features from pressure, altitude, and velocity images, we obtain 84% sensitivity and 81.5% accuracy, an absolute improvement of 22% in sensitivity and 7% in accuracy. We demonstrate the potential of the spiral task for AD detection and give a complete methodology based on off-the-shelf features.
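As an illustrative sketch of the pipeline described above, the following Python snippet extracts features from an intermediate layer of a pretrained CNN applied to images encoding kinematic time functions, trains one expert per modality (pressure, altitude, velocity), and fuses their decisions by majority vote. The choice of VGG16, the layer name "block3_pool", the linear SVM experts, and the voting rule are assumptions made for illustration, not necessarily the exact configuration used in this work.

# Minimal sketch: intermediate-layer feature extraction from a pretrained CNN
# and late (decision-level) fusion of per-modality experts.
# VGG16, "block3_pool", the linear SVMs and the majority vote are assumptions.

import numpy as np
from tensorflow.keras.applications import VGG16
from tensorflow.keras.applications.vgg16 import preprocess_input
from tensorflow.keras.models import Model
from sklearn.svm import SVC

def build_extractor(layer_name="block3_pool"):
    """Return a model whose output is an intermediate VGG16 layer (assumed layer name)."""
    base = VGG16(weights="imagenet", include_top=False, input_shape=(224, 224, 3))
    return Model(inputs=base.input, outputs=base.get_layer(layer_name).output)

def extract_features(extractor, images):
    """images: array (n, 224, 224, 3) encoding one kinematic time function
    (e.g. pressure, altitude or velocity) along the spiral trajectory."""
    feats = extractor.predict(preprocess_input(images.astype("float32")))
    return feats.reshape(len(images), -1)  # flatten spatial feature maps into one vector

def late_fusion_predict(classifiers, feature_sets):
    """Majority vote over the per-modality experts; 1 = AD, 0 = HC."""
    votes = np.stack([clf.predict(X) for clf, X in zip(classifiers, feature_sets)])
    return (votes.mean(axis=0) >= 0.5).astype(int)

# Usage sketch (X_pressure, X_altitude, X_velocity and labels y are placeholders):
# extractor = build_extractor()
# feats = [extract_features(extractor, X) for X in (X_pressure, X_altitude, X_velocity)]
# experts = [SVC(kernel="linear").fit(F, y) for F in feats]
# y_pred = late_fusion_predict(experts, feats)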
