Abstract

Electroencephalography (EEG) and electromyography (EMG) signals play a significant role in controlling bio-robotic applications such as prostheses. Brain-computer interfaces (BCIs) allow amputees to exploit their remaining capabilities by translating brain signals into commands for external devices. However, BCIs that rely on EEG signals alone are not yet fully accepted in bio-robotic applications. Myoelectric control systems use EMG signals recorded from the residual muscles of an amputated limb to control a prosthesis, but these residual muscles often cannot provide enough signal to drive a prosthesis with multiple degrees of freedom. This paper introduces a new hybrid BCI model that integrates EEG and EMG signal processing with machine learning models to improve classification accuracy and enhance the control of different upper-limb movements for above-elbow amputees. Experiments were carried out on a large dataset of 64-channel EEG signals combined with 32-channel surface EMG (sEMG) signals, acquired simultaneously from above-elbow amputees, to decode five hand and wrist motions. Classification performance was evaluated using classification accuracy with 5-fold cross-validation. Experimental results demonstrate that the proposed model achieves a classification accuracy exceeding 98.8% using 6th-order autoregressive (AR) model coefficients combined with three proposed feature sets drawn from the time domain, the frequency domain, and entropy measures. The proposed model outperforms previously reported models in the literature by up to 8.9%, which implies that a hybrid BCI can be reliably used to improve the control of upper-limb movements. Furthermore, the model can be readily applied in real-time hybrid BCI applications.
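
The following sketch illustrates, in broad strokes, the kind of feature pipeline the abstract describes: 6th-order AR coefficients combined with time-domain, frequency-domain, and entropy features, evaluated with 5-fold cross-validation. It is not the authors' code; the channel counts, sampling rate, window length, synthetic data, and the SVM classifier are assumptions made purely for demonstration.

```python
# Illustrative sketch only (assumed parameters, synthetic data), not the authors' pipeline.
import numpy as np
from scipy.linalg import solve_toeplitz
from scipy.signal import welch
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

FS = 1000        # assumed sampling rate (Hz)
AR_ORDER = 6     # AR model order stated in the abstract


def ar_coefficients(x, order=AR_ORDER):
    """Yule-Walker estimate of the AR coefficients for one signal window."""
    x = x - x.mean()
    # biased autocorrelation estimates at lags 0..order
    r = np.correlate(x, x, mode="full")[len(x) - 1:len(x) + order] / len(x)
    return solve_toeplitz(r[:-1], r[1:])  # a_1 ... a_order


def window_features(x, fs=FS):
    """AR + time-domain + frequency-domain + entropy features for one channel window."""
    # time domain: mean absolute value, RMS, waveform length, zero crossings
    mav = np.mean(np.abs(x))
    rms = np.sqrt(np.mean(x ** 2))
    wl = np.sum(np.abs(np.diff(x)))
    zc = np.sum(np.signbit(x[:-1]) != np.signbit(x[1:]))
    # frequency domain: mean and median frequency from the Welch PSD
    f, pxx = welch(x, fs=fs, nperseg=min(256, len(x)))
    pxx_n = pxx / pxx.sum()
    mean_f = np.sum(f * pxx_n)
    med_f = f[np.searchsorted(np.cumsum(pxx_n), 0.5)]
    # entropy: Shannon (spectral) entropy of the normalised PSD
    spec_ent = -np.sum(pxx_n * np.log2(pxx_n + 1e-12))
    return np.concatenate([ar_coefficients(x),
                           [mav, rms, wl, zc, mean_f, med_f, spec_ent]])


def epoch_features(epoch):
    """Stack per-channel feature vectors (epoch shape: channels x samples)."""
    return np.concatenate([window_features(ch) for ch in epoch])


if __name__ == "__main__":
    # Synthetic placeholder: 200 epochs, 96 channels (64 EEG + 32 sEMG),
    # 1-second windows, 5 motion classes. Replace with real recordings.
    rng = np.random.default_rng(0)
    epochs = rng.standard_normal((200, 96, FS))
    labels = rng.integers(0, 5, size=200)

    X = np.array([epoch_features(e) for e in epochs])
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    scores = cross_val_score(clf, X, labels, cv=5)
    print("5-fold CV accuracy: %.3f +/- %.3f" % (scores.mean(), scores.std()))
```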
