Abstract

Tourette syndrome (TS) tics are typically quantified using "paper and pencil" rating scales that are susceptible to factors that adversely impact validity. Video-based methods to more objectively quantify tics have been developed but are challenged by reliance on human raters and procedures that are resource intensive. Computer vision approaches that automate detection of atypical movements may be useful to apply to tic quantification. The current proof-of-concept study applied a computer vision approach to train a supervised deep learning algorithm to detect eye tics in video, the most common tic type in patients with TS. Videos (N = 54) of 11 adolescent patients with TS were rigorously coded by trained human raters to identify 1.5-second clips depicting "eye tic events" (N = 1775) and "non-tic events" (N = 3680). Clips were encoded into three-dimensional facial landmarks. Supervised deep learning was applied to processed data using random split and disjoint split regimens to simulate model validity under different conditions. Area under receiver operating characteristic curve was 0.89 for the random split regimen, indicating high accuracy in the algorithm's ability to properly classify eye tic vs. non-eye tic movements. Area under receiver operating characteristic curve was 0.74 for the disjoint split regimen, suggesting that algorithm generalizability is more limited when trained on a small patient sample. The algorithm was successful in detecting eye tics in unseen validation sets. Automated tic detection from video is a promising approach for tic quantification that may have future utility in TS screening, diagnostics, and treatment outcome measurement. © 2023 The Authors. Movement Disorders published by Wiley Periodicals LLC on behalf of International Parkinson and Movement Disorder Society.
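The sketch below illustrates, in spirit, the two evaluation regimens and the AUROC metric described in the abstract. It is an assumption-laden illustration, not the authors' pipeline: random numbers stand in for the three-dimensional facial landmark encodings, a logistic regression stands in for the deep network, and the feature dimension and patient assignment are invented for the example.

```python
# Minimal sketch (not the authors' code) of the two split regimens:
# "random split" mixes clips from all patients across train/test sets,
# while "disjoint split" holds out whole patients so the model is
# evaluated only on unseen subjects. Placeholder features stand in for
# the 3D facial landmark encodings of each 1.5-second clip.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import GroupShuffleSplit, train_test_split

rng = np.random.default_rng(0)
n_clips, n_features = 5455, 204                  # 1775 tic + 3680 non-tic clips; feature size assumed
X = rng.normal(size=(n_clips, n_features))       # placeholder landmark features per clip
y = rng.integers(0, 2, size=n_clips)             # 1 = eye tic event, 0 = non-tic event
patient = rng.integers(0, 11, size=n_clips)      # clip-to-patient assignment (11 patients)

def auroc_for_split(train_idx, test_idx):
    """Fit the stand-in classifier and report AUROC on held-out clips."""
    clf = LogisticRegression(max_iter=1000).fit(X[train_idx], y[train_idx])
    return roc_auc_score(y[test_idx], clf.predict_proba(X[test_idx])[:, 1])

# Random split regimen: the same patient may contribute clips to both sets.
tr, te = train_test_split(np.arange(n_clips), test_size=0.2, random_state=0)
print("random split AUROC:", auroc_for_split(tr, te))

# Disjoint split regimen: entire patients are withheld from training.
tr_d, te_d = next(GroupShuffleSplit(n_splits=1, test_size=0.2, random_state=0)
                  .split(X, groups=patient))
print("disjoint split AUROC:", auroc_for_split(tr_d, te_d))
```

With synthetic random labels both regimens score near chance; the point of the sketch is the split logic, which explains why subject-disjoint evaluation typically yields a lower (more conservative) AUROC than a clip-level random split.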
