Abstract

Most existing brain-computer interfaces (BCIs) are designed to control a single assistive device, such as a wheelchair, a robotic arm, or a prosthetic limb. However, many daily tasks require combined functions that can only be realized by integrating multiple robotic devices. Such integration raises the required control accuracy and makes reliable control more challenging than in the single-device case. In this study, we propose a novel high-accuracy hybrid BCI (hBCI), based on electroencephalogram (EEG) and electrooculogram (EOG) signals, to control an integrated wheelchair-robotic arm system. The user turns the wheelchair left or right by performing left/right hand motor imagery (MI), and generates the remaining commands for the wheelchair and the robotic arm by performing eye blinks and eyebrow-raising movements. Twenty-two subjects participated in an MI training session, and five of them completed a mobile self-drinking experiment that was purposely designed with high accuracy requirements. The results demonstrate that the proposed hBCI can provide satisfactory control accuracy for a system consisting of multiple robotic devices, and show the potential of BCI-controlled systems for complex daily tasks.

Highlights

  • An electroencephalogram (EEG)-based brain-computer interface (BCI) records the electrical activity of the brain from the scalp and translates it into various communication or control commands (Wolpaw et al., 2000)

  • Considering the high control precision required in this study, we set 80% as the minimum passing accuracy for inviting potential subjects to participate in further motor imagery (MI) training sessions

  • The false positive rate (FPR) was evaluated without the verification process, in order to assess the effectiveness of the proposed method based on peak amplitude and timing


Introduction

An electroencephalogram (EEG)-based brain-computer interface (BCI) records the electrical activity of the brain from the scalp and translates it into various communication or control commands (Wolpaw et al., 2000). Common modalities used in EEG-based BCIs include steady-state visual evoked potentials (SSVEPs) (Cheng et al., 2015), event-related potentials (ERPs) (Blankertz et al., 2011; Jin et al., 2017), and the mu (8–12 Hz)/beta (18–26 Hz) rhythms related to motor imagery (MI) (Lafleur et al., 2013). Whereas SSVEP- and ERP-based BCIs provide only discrete commands, MI-based BCIs can generate nearly continuous outputs in real time, which makes them a good fit for manipulating assistive devices that require highly accurate and continuous control. Several purely MI-based BCIs have been developed to realize basic control of external devices (Wolpaw and McFarland, 2004; Lafleur et al., 2013; Meng et al., 2016). Nevertheless, MI-based BCIs still suffer from a limited number of distinguishable MI tasks (Yu et al., 2015).
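To illustrate the mu-rhythm principle described above, the following is a minimal sketch (not the paper's actual pipeline) of a left/right MI decision rule in Python. It exploits the well-known effect that imagining a hand movement suppresses mu-band (8–12 Hz) power over the contralateral motor cortex (electrode C3 for right-hand imagery, C4 for left-hand). The channel names, sampling rate, and synthetic signals are illustrative assumptions.

```python
import numpy as np
from scipy.signal import welch

def band_power(signal, fs, band=(8.0, 12.0)):
    """Average power spectral density within a frequency band (here, mu)."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()

def classify_mi(c3, c4, fs):
    """Toy rule: right-hand imagery suppresses mu power over the
    contralateral (left-hemisphere, C3) motor cortex, and vice versa."""
    return "right" if band_power(c3, fs) < band_power(c4, fs) else "left"

# Synthetic demo: a 10 Hz mu rhythm attenuated over C3, as during
# right-hand motor imagery (amplitudes and noise level are made up).
fs = 250
t = np.arange(0, 4, 1 / fs)
rng = np.random.default_rng(0)
c3 = 0.3 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 0.5, t.size)
c4 = 1.0 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 0.5, t.size)
print(classify_mi(c3, c4, fs))  # → right
```

A real MI decoder would of course use trained spatial filters (e.g. common spatial patterns) and a classifier rather than a fixed two-channel threshold, but the contralateral mu-suppression contrast shown here is the physiological signal such systems exploit.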


