Abstract
Modern brain-machine interfaces can return function to people with paralysis, but current upper extremity brain-machine interfaces are unable to reproduce control of individuated finger movements. Here, for the first time, we present a real-time, high-speed, linear brain-machine interface in nonhuman primates that uses intracortical neural signals to bridge this gap. We created a non-prehensile task that systematically individuates two finger groups, the index finger and the middle-ring-small fingers combined. During online brain control, the ReFIT Kalman filter predicted individuated finger group movements with high performance. Next, training ridge regression decoders on individual movements was sufficient to predict untrained combined movements and vice versa. Finally, we compared the postural and movement tuning of finger-related cortical activity, finding that individual cortical units simultaneously encode multiple behavioral dimensions. Our results suggest that linear decoders may be sufficient for brain-machine interfaces to execute high-dimensional tasks with the performance levels required for naturalistic neural prostheses.
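To make the decoding approach concrete, the sketch below shows the kind of linear (ridge regression) mapping from binned neural activity to finger-group kinematics that the abstract describes. It is not the authors' implementation; the data shapes, channel count, and variable names are illustrative assumptions, and the random arrays stand in for recorded spike counts and measured finger positions.

```python
import numpy as np
from sklearn.linear_model import Ridge

# Hypothetical data: binned spike counts (n_samples x n_channels) and
# kinematics of the two finger groups (n_samples x 2: index, middle-ring-small).
rng = np.random.default_rng(0)
n_samples, n_channels = 5000, 96
neural_train = rng.poisson(3.0, size=(n_samples, n_channels)).astype(float)
finger_train = rng.standard_normal((n_samples, 2))

# Ridge regression decoder: a linear map from neural features to finger
# kinematics with L2 regularization (alpha is a tunable hyperparameter).
decoder = Ridge(alpha=1.0)
decoder.fit(neural_train, finger_train)

# Decode held-out activity, e.g. trials of a movement type not seen in training.
neural_test = rng.poisson(3.0, size=(100, n_channels)).astype(float)
finger_pred = decoder.predict(neural_test)  # (100, 2) predicted finger-group kinematics
```

Because the decoder is linear, a model fit only on individual finger-group movements can still be evaluated on combined movements (and vice versa), which is the generalization test the abstract reports.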