Abstract

In order for brain-computer interface (BCI) systems to maximize functionality, users will need to be able to accurately modulate grasp force to avoid dropping heavy objects while also being able to handle fragile items. We present a case study consisting of two experiments designed to identify whether intracortical recordings from the motor cortex of a person with tetraplegia could predict intended grasp force. In the first task, we were able to classify neural responses to attempted grasps of four objects, each of which required similar grasp kinematics but different implicit grasp force targets, with 69% accuracy. In the second task, the subject attempted to move a virtual robotic arm in space to grasp a simple virtual object. For each trial, the subject was asked to grasp the virtual object with the force appropriate for one of the four objects from the first experiment, with the goal of measuring an implicit representation of grasp force. While the subject knew the grasp force during all phases of the trial, accurate classification was only achieved during active grasping, not while the hand moved to, transported, or released the object. In both tasks, misclassifications were most often to the object with an adjacent force requirement. In addition to the implications for understanding the representation of grasp force in motor cortex, these results are a first step toward creating intelligent algorithms to help BCI users grasp and manipulate a variety of objects that will be encountered in daily life.

Clinical Trial Identifier: NCT01894802, https://clinicaltrials.gov/ct2/show/NCT01894802.

Highlights

  • Brain-computer interfaces (BCIs) have shown promise as assistive devices to restore a level of independence to people with tetraplegia (Collinger et al., 2012, 2013; Hochberg et al., 2012; Wodlinger et al., 2014; Blabe et al., 2015; Ajiboye et al., 2017)

  • We show that implicit grasp force is represented in M1 during attempted grasping, but not during observation, of objects that vary in compliance, weight, and texture

  • While previous studies identified grasp force information in EEG signals (Rearick et al., 2001; Murguialday et al., 2007; Paek et al., 2015; Wang et al., 2017) and intracortical field potentials of motor-intact subjects (Flint et al., 2014; Murphy et al., 2016), here we show that this information is present in multi-unit recordings of a subject with tetraplegia

Introduction

Brain-computer interfaces (BCIs) have shown promise as assistive devices to restore a level of independence to people with tetraplegia (Collinger et al., 2012, 2013; Hochberg et al., 2012; Wodlinger et al., 2014; Blabe et al., 2015; Ajiboye et al., 2017). To be maximally useful, these devices will need to allow users to generate a variety of grasp postures and to handle objects that vary in weight, compliance, or fragility. We explore the representation of implicit grasp force from extracellular recordings in M1 of a single BCI user with tetraplegia, who is physically unable to generate overt grasping movements. We evaluate whether the visual presentation of objects of varying compliance and weight can elicit discriminable patterns of activity, and we investigate whether grasp force-related information is present during a multi-phase object transport task in which no visual feedback about object identity is provided. M1 recordings clearly discriminated between objects with different force requirements, though this information was not present during reaching and object transport.
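The study reports classifying implicit grasp-force targets from M1 multi-unit recordings, but this page does not describe the decoder itself. As a minimal illustrative sketch (the synthetic data, the unit counts, and the nearest-class-mean decoder are all hypothetical assumptions, not the authors' method), decoding one of four force targets from binned firing rates might look like:

```python
# Hypothetical sketch: decoding which of four grasp-force targets a trial
# belongs to from binned multi-unit firing rates. All data are synthetic;
# a simple nearest-class-mean decoder stands in for whatever classifier
# the authors actually used.
import numpy as np

rng = np.random.default_rng(0)

N_UNITS = 32            # synthetic multi-unit channels
TRIALS_PER_OBJECT = 40
N_OBJECTS = 4           # four objects with adjacent force requirements

# Each object shifts mean firing rates along a shared "force" axis, so
# objects with adjacent force requirements are the most confusable
# (mirroring the misclassification pattern reported in the abstract).
force_axis = rng.normal(size=N_UNITS)
X, y = [], []
for obj in range(N_OBJECTS):
    base = 10.0 + 2.0 * obj * force_axis
    rates = base + rng.normal(scale=3.0, size=(TRIALS_PER_OBJECT, N_UNITS))
    X.append(rates)
    y.append(np.full(TRIALS_PER_OBJECT, obj))
X, y = np.vstack(X), np.concatenate(y)

# Shuffle and split trials into train/test sets.
idx = rng.permutation(len(y))
train, test = idx[:120], idx[120:]

# Nearest-class-mean decoder: assign each test trial to the class whose
# mean training firing-rate vector is closest in Euclidean distance.
means = np.stack([X[train][y[train] == k].mean(axis=0)
                  for k in range(N_OBJECTS)])
dists = ((X[test][:, None, :] - means[None, :, :]) ** 2).sum(axis=2)
pred = dists.argmin(axis=1)
accuracy = (pred == y[test]).mean()
print(f"decoding accuracy: {accuracy:.2f}")  # chance level is 0.25
```

With well-separated synthetic classes this toy decoder performs far above the 25% chance level; real neural data are far noisier, which is consistent with the 69% four-class accuracy reported in the abstract.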


