Abstract

Background

As the world's elderly population increases, health monitoring technologies that can automatically detect subtle changes resulting from Alzheimer's disease (AD) have become increasingly important. In this respect, speech is one of the most promising clues because of the spread of voice-based interaction systems such as smartphones and tablets. Indeed, previous studies have succeeded in quantifying language dysfunction and identifying AD and mild cognitive impairment (MCI) from speech data collected during neuropsychological tests conducted by clinicians. Conducting such assessments in an automated fashion using computer devices would extend opportunities for assessment and help with the early detection of AD. In particular, if language dysfunction related to AD can be detected from various types of speech data (e.g., question answering and daily conversations), it would extend the scope of application and help improve the current worldwide low diagnosis coverage.

Method

In this study, we developed a tablet-based application and collected spontaneous speech data during interview tasks from 106 Japanese seniors: 48 healthy controls (HC), 33 with MCI, and 25 with AD. Participants answered nine questions relating to their current condition, yesterday's dinner, games played as a child, and future travel plans. For comparison, we also collected speech data during neuropsychological tests (e.g., verbal fluency and picture description tasks) using the tablet. We extracted vocal and prosodic features from both sets of speech data and then built binary classification models for differentiating MCI or AD from HC using a support vector machine with a feature selection method. We evaluated the models by leave-one-subject-out cross-validation.

Result

The models using speech data from the neuropsychological tests achieved accuracies of 86.3% for HC vs. AD and 80.2% for HC vs. MCI. The models using spontaneous speech data from the interview tasks achieved comparable accuracies: 87.7% for HC vs. AD and 81.6% for HC vs. MCI.

Conclusion

We demonstrated the possibility of using tablet-based automatic assessments for detecting patients with both MCI and AD. In addition, our results suggest that speech data not only from neuropsychological tasks but also from answering everyday questions might contain useful information to help early detection of AD.
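The evaluation pipeline described above (a support vector machine with a feature selection step, scored by leave-one-subject-out cross-validation) can be sketched as follows. This is a minimal illustration, not the authors' implementation: scikit-learn, the linear kernel, the univariate `SelectKBest` filter, and the synthetic feature values are all assumptions; only the cohort sizes (48 HC, 25 AD) come from the abstract. With one feature vector per subject, `LeaveOneOut` is equivalent to leave-one-subject-out.

```python
# Hedged sketch of SVM + feature selection evaluated by leave-one-subject-out CV.
# All feature values are synthetic; the real vocal/prosodic features and the
# specific feature selection method are not given in the abstract.
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.svm import SVC
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(0)
n_hc, n_ad, n_features = 48, 25, 20  # HC and AD cohort sizes from the abstract
# Toy data: AD subjects get a small mean shift so the classes are separable.
X = np.vstack([
    rng.normal(0.0, 1.0, (n_hc, n_features)),
    rng.normal(0.8, 1.0, (n_ad, n_features)),
])
y = np.array([0] * n_hc + [1] * n_ad)  # 0 = HC, 1 = AD

# Feature selection is fit inside each CV fold (via the pipeline) to avoid
# leaking held-out-subject information into the selection step.
model = Pipeline([
    ("scale", StandardScaler()),
    ("select", SelectKBest(f_classif, k=10)),  # a simple filter-type selector
    ("svm", SVC(kernel="linear")),
])

# Each fold holds out exactly one subject, matching the paper's evaluation scheme.
scores = cross_val_score(model, X, y, cv=LeaveOneOut())
accuracy = scores.mean()
print(f"Leave-one-subject-out accuracy: {accuracy:.3f}")
```

Nesting the selector inside the `Pipeline` matters: selecting features on the full dataset before cross-validating would let information from each held-out subject influence training, inflating the reported accuracy.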

