Abstract

This article presents a rehabilitation technique based on a lower-limb exoskeleton integrated with a human–machine interface (HMI). The HMI records and processes multimodal signals: foot motor imagery (MI) captured by a brain–machine interface (BMI) and multichannel electromyographic (EMG) signals recorded from leg muscles. HMI-equipped rehabilitation assistive technologies tested under laboratory conditions have demonstrated considerable success, but they face difficulties caused by the limited accuracy of detecting MI in electroencephalography (EEG) and by the limited reliability of online control while a patient wearing an exoskeleton executes a movement. For the lower limbs in particular, reliably detecting leg-movement intentions and differentiating them between legs remains an open problem in BMI systems. Aiming at a rehabilitation technique that replicates the natural mode of motor control in exoskeleton-walking patients, we show how the combined use of multimodal signals can improve the accuracy, performance, and reliability of the HMI. The system was tested on healthy subjects operating the exoskeleton under different conditions. The study also produced algorithms for collecting, processing, and classifying multimodal HMI data. The developed system can analyze up to 15 signals simultaneously in real time during a movement. Foot MI is extracted from EEG signals (seven channels) using the event-related (de)synchronization effect. Supplemented by EMG signals reflecting motor intention, the control system can initiate and differentiate the movements of the right and left legs with a high degree of reliability. The classification and control system permits online operation while the exoskeleton is executing a movement.
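The abstract describes EMG signals supplementing the EEG decision so that left- and right-leg movements can be initiated and differentiated. A minimal sketch of such decision-level fusion is shown below; the `Decision` type, the per-leg boolean outputs, and the fusion modes are illustrative assumptions, not the authors' actual control logic.

```python
# Sketch of decision-level EEG/EMG fusion for per-leg movement commands.
# The classifiers themselves are abstracted away: each modality is assumed
# to emit a boolean "move" decision per leg.
from dataclasses import dataclass


@dataclass
class Decision:
    left: bool
    right: bool


def fuse(eeg: Decision, emg: Decision, mode: str = "AND") -> Decision:
    """Combine per-leg EEG and EMG decisions with a logical operator.

    "AND" is conservative (both modalities must agree before a leg moves);
    "OR" is permissive (either modality can trigger the movement).
    """
    if mode == "AND":
        return Decision(eeg.left and emg.left, eeg.right and emg.right)
    if mode == "OR":
        return Decision(eeg.left or emg.left, eeg.right or emg.right)
    raise ValueError("mode must be 'AND' or 'OR'")


# EEG alone flags both legs (a common MI false positive); EMG confirms
# only the right leg, so AND fusion triggers only the right actuator.
print(fuse(Decision(True, True), Decision(False, True), "AND"))
# prints: Decision(left=False, right=True)
```

The conservative "AND" mode trades responsiveness for fewer false activations, which matters when a false positive moves a motorized leg brace.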

Highlights

  • Today, exoskeletons are regarded as a powerful instrument for the clinical rehabilitation of patients with impaired lower-limb function

  • We developed two protocols for combining EEG and EMG: (i) human–machine interface (HMI) based on extracting common spatial pattern filter (CSP) features with subsequent linear discriminant analysis (LDA) classification (Fig. 15a) and (ii) HMI based on separate feature extraction and classification, the results of which were combined by logical operators "AND" and "OR" (Fig. 15b)

  • In our study, we simulated realistic conditions for subjects controlling an exoskeleton integrated with multimodal HMI
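Protocol (i) in the highlights pairs CSP spatial filtering with LDA classification. The sketch below shows the standard two-class CSP/LDA pipeline on synthetic data; the channel count, trial sizes, and the synthetic signals are assumptions for illustration, not the authors' recordings or trained models.

```python
# Minimal two-class CSP + LDA pipeline (synthetic data).
# CSP finds spatial filters maximizing variance for one class while
# minimizing it for the other; LDA then separates log-variance features.
import numpy as np


def csp_filters(trials_a, trials_b, n_pairs=1):
    """CSP filters from two classes of trials shaped (n_trials, ch, samples)."""
    mean_cov = lambda trials: np.mean([np.cov(t) for t in trials], axis=0)
    ca, cb = mean_cov(trials_a), mean_cov(trials_b)
    # Generalized eigenvalue problem: ca @ w = lam * (ca + cb) @ w
    vals, vecs = np.linalg.eig(np.linalg.solve(ca + cb, ca))
    order = np.argsort(vals.real)
    idx = np.r_[order[:n_pairs], order[-n_pairs:]]  # most discriminative ends
    return vecs.real[:, idx].T


def features(trials, W):
    """Normalized log-variance of CSP-filtered trials."""
    var = np.array([np.var(W @ t, axis=1) for t in trials])
    return np.log(var / var.sum(axis=1, keepdims=True))


class LDA:
    """Minimal two-class LDA with a pooled covariance estimate."""

    def fit(self, Xa, Xb):
        ma, mb = Xa.mean(0), Xb.mean(0)
        S = np.cov(np.vstack([Xa - ma, Xb - mb]).T)
        self.w = np.linalg.solve(S, ma - mb)
        self.b = -0.5 * self.w @ (ma + mb)
        return self

    def predict(self, X):  # True -> class "a"
        return X @ self.w + self.b > 0


# Synthetic demo: class A is strong on channel 0, class B on channel 1.
rng = np.random.default_rng(1)

def make(s0, s1, n=30, samples=200):
    return rng.standard_normal((n, 2, samples)) * np.array([s0, s1])[None, :, None]

A, B = make(3.0, 1.0), make(1.0, 3.0)
W = csp_filters(A, B)
clf = LDA().fit(features(A, W), features(B, W))
print(clf.predict(features(make(3.0, 1.0, n=5), W)))  # expected mostly True
```

In practice the trials would be band-pass-filtered EEG epochs rather than white noise, but the filter/feature/classifier stages compose the same way.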


Introduction

Exoskeletons are regarded as a powerful instrument for the clinical rehabilitation of patients with impaired lower-limb function (for a recent review, see Refs. [1], [2]). A more natural and intuitive way of interacting with exoskeletons and other neuroprosthetic devices is to use endogenous brain signals. This can be implemented using brain–machine interface (BMI) systems based on electroencephalographic (EEG) signals generated independently of external stimulation. This approach allows the movement to be fully controlled by the subject. A typical example of such a BMI is a system based on sensorimotor rhythms such as motor imagery.

The associate editor coordinating the review of this manuscript and approving it for publication was Qichun Zhang.
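Sensorimotor-rhythm BMIs of the kind described here typically detect motor imagery through event-related desynchronization (ERD): a drop in mu-band (roughly 8–13 Hz) power relative to a resting baseline. The sketch below quantifies ERD on a synthetic signal; the sampling rate, band edges, and window lengths are illustrative assumptions, not the parameters used in this study.

```python
# ERD quantified as the relative drop in mu-band power during motor
# imagery versus a resting baseline, using a plain FFT power estimate.
import numpy as np

FS = 250  # sampling rate in Hz (assumed)


def band_power(signal, fs, low=8.0, high=13.0):
    """Mean spectral power of `signal` inside [low, high] Hz."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    mask = (freqs >= low) & (freqs <= high)
    return spectrum[mask].mean()


def erd_percent(baseline, imagery, fs=FS):
    """ERD in percent: positive means power dropped during imagery."""
    p_base = band_power(baseline, fs)
    p_mi = band_power(imagery, fs)
    return 100.0 * (p_base - p_mi) / p_base


# Toy demo: a 10 Hz mu rhythm whose amplitude halves during imagery.
t = np.arange(0, 2.0, 1.0 / FS)
rng = np.random.default_rng(0)
rest = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(t.size)
mi = 0.5 * np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(t.size)
print(erd_percent(rest, mi))  # large positive value: mu power dropped
```

A real pipeline would compute this per channel over sliding windows and feed the values to a classifier, but the baseline-versus-imagery power contrast is the core of the ERD feature.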

