Studies using eye-tracking methodology have made important contributions to the study of language disorders such as aphasia. Nevertheless, in clinical groups especially, eye-tracking studies often include small sample sizes, limiting the generalizability of reported findings. Online, webcam-based tracking offers a potential solution to this issue, but web-based tracking has not been compared with in-lab tracking in past studies and has never been attempted in groups with language impairments. Patients with post-stroke aphasia (n=16) and age-matched controls (n=16) completed identical sentence-picture matching tasks in the lab (using an EyeLink system) and on the web (using WebGazer.js), with the order of sessions counterbalanced. We examined whether web-based eye tracking is as sensitive as in-lab eye tracking in detecting group differences in sentence processing. Patients responded less accurately and more slowly than controls across all sentence types. Proportions of gazes to the target and foil pictures were computed in 100 ms increments, and the two modes of tracking proved comparably sensitive to overall group differences across sentence types. In most analyses, web tracking showed fluctuations in gaze proportions to target pictures comparable to those in lab tracking, although web data lagged lab data by approximately 500-800 ms. Web-based eye tracking is thus feasible for studying impaired language processing in aphasia and is sensitive enough to detect most group differences between controls and patients. Given that validation of webcam-based tracking is in its infancy, and given how transformative this method could be for several disciplines, much more testing is warranted.
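The 100 ms binning analysis mentioned above could be sketched as follows. This is an illustrative reconstruction only, not the authors' actual pipeline: the sample data, region labels, and function name are all invented for the example.

```python
# Illustrative sketch (not the authors' code): group gaze samples,
# given as (time_ms, region) pairs with region in {'target', 'foil', 'other'},
# into 100 ms bins and compute per-bin proportions of looks to each picture.

BIN_MS = 100  # bin width in milliseconds, as described in the abstract

def gaze_proportions(samples, n_bins):
    """Return, for each 100 ms bin, the proportion of gaze samples
    falling on the target and on the foil picture."""
    bins = [{"target": 0, "foil": 0, "total": 0} for _ in range(n_bins)]
    for t_ms, region in samples:
        i = int(t_ms // BIN_MS)
        if 0 <= i < n_bins:
            bins[i]["total"] += 1
            if region in ("target", "foil"):
                bins[i][region] += 1
    return [
        {
            "target": b["target"] / b["total"] if b["total"] else 0.0,
            "foil": b["foil"] / b["total"] if b["total"] else 0.0,
        }
        for b in bins
    ]

# Toy example: 50 Hz samples (one every 20 ms) over 250 ms; the gaze
# switches from the foil to the target picture at 100 ms.
samples = [(t, "target" if t >= 100 else "foil") for t in range(0, 250, 20)]
props = gaze_proportions(samples, n_bins=3)
print(props[0]["foil"], props[1]["target"])  # 1.0 1.0
```

A curve of these per-bin target proportions over time is what lets the delayed (roughly 500-800 ms) but otherwise comparable pattern in the web data be seen against the lab data.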