Abstract

Depression is challenging to diagnose reliably, and the current gold standard for clinical trials is a DSM-5 diagnosis established by agreement between two or more medical specialists. Research studies aiming to predict depression objectively have typically used brain scanning. Less expensive methods from cognitive neuroscience may allow quicker and more reliable diagnoses and help reduce the costs of managing the condition. In the current study we aimed to develop a novel, inexpensive system for detecting elevated symptoms of depression based on tracking face and eye movements during the performance of cognitive tasks. In total, 75 participants performed two novel cognitive tasks with verbal affective distraction elements while their face and eye movements were recorded with inexpensive cameras. Data from 48 participants (mean age 25.5 years, standard deviation 6.1 years, 25 with elevated symptoms of depression) passed quality control and were included in a case-control classification analysis using machine learning. Classification accuracy under cross-validation (within-study replication) reached 79% (sensitivity 76%, specificity 82%) when face and eye movement measures were combined. Symptomatic participants were characterised by less intense mouth and eyelid movements during different stages of the two tasks, and by differences in the frequencies and durations of fixations on affectively salient distraction words. Elevated symptoms of depression can therefore be detected with face and eye movement tracking during cognitive task performance, with close to clinically relevant accuracy (~80%). Future studies should validate these results in larger samples and in clinical populations.
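The abstract reports cross-validated case-control classification metrics (79% accuracy, 76% sensitivity, 82% specificity). The sketch below illustrates, purely as an assumption-laden example, how such cross-validated metrics are commonly computed; the synthetic features, the logistic-regression classifier, and the 5-fold scheme are not taken from the study, which does not specify its pipeline in the abstract.

```python
# Illustrative sketch only: synthetic stand-in data and an assumed classifier,
# not the study's actual feature set or model.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold, cross_val_predict
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(0)

# Hypothetical data: 48 participants x 10 face/eye-movement features;
# labels: 1 = elevated depressive symptoms (n=25), 0 = control (n=23).
X = rng.normal(size=(48, 10))
y = np.array([1] * 25 + [0] * 23)

# Cross-validated ("within-study replication") predictions.
clf = LogisticRegression(max_iter=1000)
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
y_pred = cross_val_predict(clf, X, y, cv=cv)

tn, fp, fn, tp = confusion_matrix(y, y_pred).ravel()
accuracy = (tp + tn) / (tp + tn + fp + fn)
sensitivity = tp / (tp + fn)   # proportion of symptomatic participants detected
specificity = tn / (tn + fp)   # proportion of controls correctly classified
print(f"accuracy={accuracy:.2f} sensitivity={sensitivity:.2f} specificity={specificity:.2f}")
```

On synthetic data such as this the metrics will hover around chance; the point is only to show where accuracy, sensitivity, and specificity come from in a cross-validated case-control analysis.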
