Abstract
Background
Recent studies have shown that brain-machine interfaces (BMIs) offer great potential for restoring upper limb function. However, grasping objects is a complicated task and the signals extracted from the brain may not always be capable of driving these movements reliably. Vision-guided robotic assistance is one possible way to improve BMI performance. We describe a method of shared control in which the user controls a prosthetic arm using a BMI and receives assistance with positioning the hand when it approaches an object.
Methods
Two human subjects with tetraplegia used a robotic arm to complete object transport tasks with and without shared control. The shared control system was designed to provide a balance between BMI-derived intention and computer assistance. An autonomous robotic grasping system identified and tracked objects and defined stable grasp positions for these objects. The system identified when the user intended to interact with an object based on the BMI-controlled movements of the robotic arm. Using shared control, BMI-controlled movements and autonomous grasping commands were blended to ensure secure grasps.
Results
Both subjects were more successful on object transfer tasks when using shared control compared to BMI control alone. Movements made using shared control were more accurate, more efficient, and less difficult. One participant attempted a task with multiple objects and successfully lifted one of two closely spaced objects in 92% of trials, demonstrating the potential for users to accurately execute their intention while using shared control.
Conclusions
Integration of BMI control with vision-guided robotic assistance led to improved performance on object transfer tasks. Providing assistance while maintaining generalizability will make BMI systems more attractive to potential users.
Trial registration
NCT01364480 and NCT01894802.
Electronic supplementary material
The online version of this article (doi:10.1186/s12984-016-0134-9) contains supplementary material, which is available to authorized users.
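The Methods describe blending BMI-decoded movement commands with autonomous grasping commands as the hand approaches an identified object. The sketch below illustrates one plausible proximity-weighted blending scheme; the linear ramp, the assist_radius and max_assist parameters, and the function name are illustrative assumptions, not the control law reported in the paper.

```python
import numpy as np

def blend_velocities(v_bmi, v_auto, hand_pos, object_pos,
                     assist_radius=0.15, max_assist=0.7):
    """Blend a BMI-decoded velocity with an autonomous grasp velocity.

    The assistance weight ramps up as the hand approaches the object and
    is capped so the user always retains some control authority.
    Hypothetical illustration only; not the blending law from the study.
    """
    v_bmi = np.asarray(v_bmi, dtype=float)
    v_auto = np.asarray(v_auto, dtype=float)
    dist = np.linalg.norm(np.asarray(object_pos, dtype=float)
                          - np.asarray(hand_pos, dtype=float))

    # No assistance outside the assist radius; linear ramp inside it.
    alpha = max_assist * max(0.0, 1.0 - dist / assist_radius)

    return (1.0 - alpha) * v_bmi + alpha * v_auto


# Example: hand 5 cm from the object, so assistance is partially engaged.
v_cmd = blend_velocities(v_bmi=[0.10, 0.00, 0.02],
                         v_auto=[0.05, 0.03, -0.04],
                         hand_pos=[0.40, 0.10, 0.20],
                         object_pos=[0.45, 0.10, 0.20])
print(v_cmd)
```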
Highlights
Recent studies have shown that brain-machine interfaces (BMIs) offer great potential for restoring upper limb function.
Recent brain-machine interface (BMI) work has shown that people with tetraplegia can control robotic arms using signals recorded by intracortical electrodes [1,2,3].
We have recently shown that motor cortex signaling is context-dependent, as the extracted signal changes between object grasping and free movement [3]. If this change is not taken into account, the BMI user has limited ability to control the robotic arm near an object.
Summary
Recent studies have shown that brain-machine interfaces (BMIs) offer great potential for restoring upper limb function. We have recently shown that motor cortex signaling is context-dependent, as the extracted signal changes between object grasping and free movement [3]. If this change is not taken into account, the BMI user has limited ability to control the robotic arm near an object. BMIs for arm control do not provide somatosensory feedback to the user [1,2,3], which may impair the normal grasping process [8]. Another potential barrier to optimal performance is that the visual feedback a BMI user receives is of a robotic arm rather than their own hand, which may introduce sensory conflicts [9].