Abstract

Background
Recent technological advances in brain recording and machine learning algorithms are enabling the study of neural activity underlying spontaneous human behaviors, beyond the confines of cued, repeated trials. However, analyzing such unstructured data, which lacks an a priori experimental design, remains a significant challenge, especially when the data are multi-modal and long-term.

New method
Here we describe an automated, behavior-first approach for analyzing simultaneously recorded long-term, naturalistic electrocorticography (ECoG) and behavioral video data. We identify and characterize spontaneous human upper-limb movements by combining computer vision, discrete latent-variable modeling, and string pattern-matching on the video.

Results
Our pipeline discovers and annotates over 40,000 instances of naturalistic arm movements in long-term (7–9 day) behavioral videos across 12 subjects. Analysis of the simultaneously recorded brain data reveals neural signatures of movement that corroborate previous findings. Our pipeline produces large training datasets for brain–computer interfacing applications, and we show decoding results from a movement initiation detection task.

Comparison with existing methods
Spontaneous movements capture real-world neural and behavioral variability that is missing from traditional cued tasks. Going beyond window-based movement detection metrics, our unsupervised discretization scheme produces a queryable pose representation, allowing localization of movements with finer temporal resolution.

Conclusions
Our work addresses the unique analytic challenges of studying naturalistic human behaviors and contributes methods that may generalize to other neural recording modalities beyond ECoG. We publish our curated dataset and believe that it will be a valuable resource for future studies of naturalistic movements.
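The "queryable pose representation" idea can be illustrated with a toy sketch. The paper discretizes pose with a latent-variable model; here that is replaced, purely for illustration, with simple threshold binning of a synthetic wrist-speed trace. Each frame becomes one symbol, and a regular expression then localizes movement initiation as a run of low-speed symbols followed by sustained high-speed symbols. All signals, thresholds, and the pattern below are hypothetical stand-ins, not the authors' actual pipeline.

```python
import re
import numpy as np

# Synthetic wrist-speed trace: rest, a burst of movement, rest again.
# (Stand-in for pose-derived features; values and durations are made up.)
rng = np.random.default_rng(0)
speed = np.concatenate([
    rng.normal(0.0, 0.05, 100),   # frames 0-99: rest
    rng.normal(1.0, 0.10, 40),    # frames 100-139: movement
    rng.normal(0.0, 0.05, 100),   # frames 140-239: rest
])

# Discretize each frame into one of four symbols 'a'..'d' (low -> high speed)
# using fixed thresholds -- a crude substitute for the paper's
# discrete latent-variable model.
edges = np.array([0.2, 0.5, 0.8])
symbols = "".join("abcd"[np.searchsorted(edges, s)] for s in speed)

# Movement initiation as a string query: at least 10 low-speed frames
# immediately followed (lookahead) by at least 5 high-speed frames.
pattern = re.compile(r"[ab]{10,}(?=[cd]{5,})")
onsets = [m.end() for m in pattern.finditer(symbols)]
print(onsets)  # frame indices where sustained movement begins
```

Because the pose string indexes individual frames, matches localize movement onsets at single-frame resolution, which is the advantage the abstract claims over window-based detection metrics.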
