Abstract

In this paper, a human-machine shared control strategy is developed for wheelchair navigation. The shared controller switches between a brain-machine control mode and an autonomous control mode. In the brain-machine control mode, a novel brain-machine interface (BMI) is developed that uses only two command signals produced by steady-state visual evoked potentials (SSVEP), instead of the traditional four-direction command signals. These two brain signals are used to generate a polar polynomial trajectory (PPT), which is continuous in curvature and respects the dynamic constraints of the wheelchair. In the autonomous control mode, a synthesis of an angle-based potential field (APF) and a vision-based simultaneous localization and mapping (SLAM) technique is proposed to guide the wheelchair through environments containing obstacles. Experimental studies carried out with a number of volunteers verify the effectiveness of the proposed shared control scheme.

Keywords: Shared control; Brain-machine interfaces; Dynamic constraints
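The abstract does not give the PPT formulation, but a minimal sketch of one common construction is a radius expressed as a polynomial in the polar angle, with zero radial slope enforced at both endpoints so the path blends smoothly (all coefficients and boundary conditions below are illustrative assumptions, not the paper's actual parameterization):

```python
import numpy as np

def polar_polynomial_trajectory(r0, th0, rf, thf, n=50):
    """Illustrative polar polynomial trajectory (PPT) sketch: the radius r
    is a cubic polynomial in the polar angle theta. The four coefficients
    solve r(th0)=r0, r(thf)=rf, r'(th0)=0, r'(thf)=0, which keeps the
    path smooth (and hence its curvature continuous) between endpoints."""
    A = np.array([
        [th0**3, th0**2, th0, 1.0],   # r(th0) = r0
        [thf**3, thf**2, thf, 1.0],   # r(thf) = rf
        [3*th0**2, 2*th0, 1.0, 0.0],  # r'(th0) = 0
        [3*thf**2, 2*thf, 1.0, 0.0],  # r'(thf) = 0
    ])
    coeffs = np.linalg.solve(A, np.array([r0, rf, 0.0, 0.0]))
    th = np.linspace(th0, thf, n)
    r = np.polyval(coeffs, th)
    # convert sampled polar points to Cartesian waypoints
    return r * np.cos(th), r * np.sin(th)
```

In a two-command SSVEP setting, the two brain signals could plausibly select the goal pose (rf, thf) that seeds such a trajectory, though the paper's exact mapping is not stated in the abstract.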
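For the autonomous mode, a generic potential-field steering law gives the flavor of the APF component: attraction toward the goal bearing plus repulsion from nearby obstacles. This is a standard textbook formulation, not the paper's specific angle-based variant; all gains and the influence distance below are assumed values:

```python
import math

def apf_heading(robot, goal, obstacles, k_att=1.0, k_rep=0.5, d0=1.0):
    """Sketch of a potential-field heading command (assumed formulation).
    robot, goal, and each obstacle are (x, y) tuples; obstacles within
    the influence distance d0 contribute a repulsive force. Returns the
    desired heading angle of the resultant force, in radians."""
    fx = k_att * (goal[0] - robot[0])
    fy = k_att * (goal[1] - robot[1])
    for ox, oy in obstacles:
        dx, dy = robot[0] - ox, robot[1] - oy
        d = math.hypot(dx, dy)
        if 0.0 < d < d0:
            # repulsive gain grows sharply as the obstacle gets closer
            g = k_rep * (1.0 / d - 1.0 / d0) / d**2
            fx += g * dx
            fy += g * dy
    return math.atan2(fy, fx)
```

With no obstacles in range the commanded heading points straight at the goal; an obstacle just above the direct path deflects the heading below it. In the paper's pipeline, the obstacle positions would presumably come from the vision-based SLAM map rather than being given directly.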
