Background
Eye movements can reflect cognition and provide information on neurodegenerative diseases such as Alzheimer's disease (AD). However, the high cost and limited accessibility of eye-movement recordings have hindered their use in clinics.

Aims
We aim to develop an AI-driven eye-tracking tool for assessing AD using mobile devices with embedded cameras.

Methods
166 AD patients and 107 normal controls (NC) were enrolled. The subjects completed eye-movement tasks on a tablet. We compared the demographics and clinical features of the two groups. Eye-movement features were selected using the least absolute shrinkage and selection operator (LASSO). A logistic regression (LR) model was trained to classify AD and NC, and its performance was evaluated. A nomogram was established to predict AD.

Results
In the training set, the model showed a good area under the curve (AUC) of 0.85 for identifying AD from NC, with a sensitivity of 71%, specificity of 84%, positive predictive value of 0.87, and negative predictive value of 0.65. In the validation set, the model also showed favorable discriminatory ability for identifying AD patients from NC, with an AUC of 0.91, sensitivity of 82%, specificity of 91%, positive predictive value of 0.93, and negative predictive value of 0.77.

Discussion and Conclusions
This novel AI-driven eye-tracking technology has the potential to reliably identify eye-movement abnormalities in AD. The model shows excellent diagnostic performance in identifying AD based on the data collected to date. The use of mobile devices makes it feasible for AD patients to complete the tasks in primary clinical settings or during follow-up at home.
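The abstract does not include implementation details; the following is a minimal sketch, assuming a scikit-learn-style workflow, of how the analysis described in Methods (LASSO feature selection, a logistic regression classifier, and evaluation by AUC, sensitivity, specificity, positive and negative predictive values) could be wired together. The synthetic data, feature counts, thresholds, and parameter choices are illustrative placeholders, not the study's actual eye-movement features or settings.

```python
# Illustrative sketch of a LASSO + logistic regression classification pipeline
# with AUC/sensitivity/specificity/PPV/NPV evaluation. Data are synthetic
# stand-ins for eye-movement features (rows = subjects; 1 = AD, 0 = NC).
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectFromModel
from sklearn.linear_model import LassoCV, LogisticRegression
from sklearn.metrics import confusion_matrix, roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

# Placeholder cohort size matching the abstract (166 AD + 107 NC = 273 subjects).
X, y = make_classification(n_samples=273, n_features=40, n_informative=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0
)

model = Pipeline([
    ("scale", StandardScaler()),
    # LASSO (L1-penalized) regression keeps features with non-zero coefficients.
    ("select", SelectFromModel(LassoCV(cv=5, random_state=0))),
    ("clf", LogisticRegression(max_iter=1000)),
])
model.fit(X_train, y_train)

# Evaluate on the held-out split.
prob = model.predict_proba(X_test)[:, 1]
pred = (prob >= 0.5).astype(int)
tn, fp, fn, tp = confusion_matrix(y_test, pred).ravel()
print(f"AUC:         {roc_auc_score(y_test, prob):.2f}")
print(f"Sensitivity: {tp / (tp + fn):.2f}")
print(f"Specificity: {tn / (tn + fp):.2f}")
print(f"PPV:         {tp / (tp + fp):.2f}")
print(f"NPV:         {tn / (tn + fn):.2f}")
```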