Abstract

This study aimed to develop an intuitive gait-related motor imagery (MI)-based hybrid brain-computer interface (BCI) controller for a lower-limb exoskeleton and to investigate the feasibility of the controller in a practical scenario comprising stand-up, gait-forward, and sit-down tasks. A filter bank common spatial pattern (FBCSP) and mutual information-based best individual feature (MIBIF) selection were used to decode MI electroencephalogram (EEG) signals and extract a feature matrix as input to a support vector machine (SVM) classifier. A successive eye-blink switch was sequentially combined with the EEG decoder to operate the lower-limb exoskeleton. Ten subjects achieved more than 80% accuracy in both the offline (training) and online sessions. All subjects successfully completed the gait task while wearing the lower-limb exoskeleton through the developed real-time BCI controller. The BCI controller achieved a time ratio of 1.45 compared with a manual smartwatch controller. The developed system can potentially benefit people with neurological disorders who may have difficulty operating manual controls.
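The decoding pipeline described above (FBCSP for band-specific spatial filtering, MIBIF-style feature selection, and an SVM classifier) can be sketched as follows. This is a minimal illustration assuming MNE and scikit-learn are available; the placeholder epochs, frequency bands, number of CSP components, and the use of SelectKBest with mutual information as a stand-in for MIBIF are assumptions, not the authors' exact implementation.

```python
import numpy as np
from mne.decoding import CSP
from mne.filter import filter_data
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.svm import SVC

# Placeholder data standing in for real MI epochs: (n_trials, n_channels, n_samples)
# sampled at fs Hz, with binary labels (1 = gait MI, 0 = do-nothing).
fs = 250
train_epochs = np.random.randn(40, 8, 4 * fs)
train_labels = np.array([0, 1] * 20)
bands = [(4, 8), (8, 12), (12, 16), (16, 20), (20, 24), (24, 28), (28, 32)]

def fbcsp_features(epochs, labels, n_csp=4):
    """Band-pass filter each sub-band, then extract log-variance CSP features."""
    feats = []
    for lo, hi in bands:
        banded = filter_data(epochs.astype(np.float64), fs, lo, hi, verbose=False)
        csp = CSP(n_components=n_csp, log=True)
        feats.append(csp.fit_transform(banded, labels))
    return np.hstack(feats)  # shape: (n_trials, len(bands) * n_csp)

# MIBIF-like step: keep the CSP features carrying the most mutual information
# with the class labels, then train the SVM on the reduced feature matrix.
X = fbcsp_features(train_epochs, train_labels)
selector = SelectKBest(mutual_info_classif, k=8).fit(X, train_labels)
clf = SVC(kernel="rbf").fit(selector.transform(X), train_labels)
```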

Highlights

  • Brain–computer interface (BCI) technology benefits people with neurological disorders by enabling a wide range of computer-controlled applications driven by brain signals [1,2]. The recent development of lower-limb exoskeletons is significant because it effectively bridges brain signals and the motor output of the extremities, improving the quality of life of people with gait disabilities [3,4,5]

  • The spectrograms reveal that event-related desynchronization (ERD) appeared while subjects engaged in both Gait motor imagery (MI) and Sit MI, whereas little or no ERD was observed during the Do-nothing task (see the sketch after this list)

  • We developed an MI-based hybrid BCI controller for lower-limb exoskeleton operation
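
As referenced in the spectrogram highlight above, ERD can be visualized as a baseline-normalized time-frequency map in which a power decrease appears during MI. The sketch below is a minimal illustration using scipy's spectrogram; the sampling rate, baseline window, and single-channel input are assumptions and do not reproduce the paper's exact analysis.

```python
import numpy as np
from scipy.signal import spectrogram

def erd_map(trial, fs=250, baseline=(0.0, 1.0)):
    """Time-frequency power of one EEG channel, expressed as percent change from a
    pre-cue baseline window; negative values indicate ERD (power decrease)."""
    f, t, Sxx = spectrogram(trial, fs=fs, nperseg=fs, noverlap=fs // 2)
    base = Sxx[:, (t >= baseline[0]) & (t < baseline[1])].mean(axis=1, keepdims=True)
    return f, t, 100.0 * (Sxx - base) / base

# Example with a synthetic 6-second single-channel trial (e.g., an electrode over Cz).
freqs, times, erd = erd_map(np.random.randn(6 * 250))
```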

Introduction

Brain–computer interface (BCI) technology benefits people with neurological disorders by enabling a wide range of computer-controlled applications driven by brain signals [1,2]. The recent development of lower-limb exoskeletons is significant because it effectively bridges brain signals and the motor output of the extremities, improving the quality of life of people with gait disabilities [3,4,5]. Among the various electroencephalogram (EEG) neural features, three distinguishable ones have been notably adopted for decoding lower-limb movement intentions: movement-related cortical potential (MRCP), steady-state visual evoked potential (SSVEP), and event-related desynchronization (ERD). In the case of the SSVEP [7], subjects have to focus continuously on a flickering light until the evoked potential exceeds a threshold, which makes it difficult for exoskeleton users to respond to unexpected situations in the surrounding environment. The ERD is another representative EEG neural feature for exoskeleton BCI controllers and is usually induced by motor imagery (MI). The BCI controller can match various commands to distinctive MI strategies with separable scalp topographic patterns [8].
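
To make the hybrid control idea concrete, the sketch below shows one plausible way the successive eye-blink switch and the MI decoder output could be combined into exoskeleton commands. The state names, blink threshold, and command mapping are hypothetical illustrations for this sketch, not the authors' actual protocol.

```python
from enum import Enum

class ExoCommand(Enum):
    STAND_UP = "stand-up"
    GAIT_FORWARD = "gait-forward"
    SIT_DOWN = "sit-down"

def hybrid_command(mi_label, blink_count, posture):
    """Hypothetical decision rule for a hybrid controller: a successive eye-blink
    switch arms the system, and the decoded MI class then selects the command.
    mi_label: SVM decoder output ("gait", "sit", or "rest").
    blink_count: successive eye blinks detected in the current window.
    posture: current exoskeleton state ("sitting" or "standing")."""
    if blink_count < 2:                # switch not triggered: ignore the MI decoder
        return None
    if posture == "sitting" and mi_label != "rest":
        return ExoCommand.STAND_UP     # any MI while seated maps to standing up
    if posture == "standing" and mi_label == "gait":
        return ExoCommand.GAIT_FORWARD
    if posture == "standing" and mi_label == "sit":
        return ExoCommand.SIT_DOWN
    return None

# Example: a seated user blinks twice and imagines gait -> the exoskeleton stands up.
print(hybrid_command("gait", blink_count=2, posture="sitting"))
```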

