Abstract

Advances in the field of Brain-Computer Interfaces (BCIs) aim, among other applications, to improve the movement capacities of people suffering from the loss of motor skills. The main challenge in this area is to achieve real-time, accurate bio-signal processing for pattern recognition, especially in Motor Imagery (MI). Meaningful interaction between brain signals and controllable machines requires instantaneous decoding of brain data. In this study, an embedded BCI system based on fist MI signals is developed. It uses an Emotiv EPOC+ Brainwear®, an Altera SoCKit® development board, and a hexapod robot for testing locomotion imagery commands. The system is tested on detecting the imagined movements of closing and opening the left and right hand to control the robot's locomotion. Electroencephalogram (EEG) signals associated with the motion tasks are sensed over the human sensorimotor cortex. The SoCKit then processes the data to identify the commands that drive the controlled robot locomotion. The classification of MI-EEG signals from the F3, F4, FC5, and FC6 sensors is performed using a hybrid architecture of Convolutional Neural Networks (CNNs) and Long Short-Term Memory (LSTM) networks. This method takes advantage of the deep learning recognition model to develop a real-time embedded BCI system, in which signal processing must be seamless and precise. The proposed method is evaluated using k-fold cross-validation on both our own dataset and a public Scientific-Data dataset. Our dataset comprises 2400 three-second trials of imagined fist closing and opening, obtained from four test subjects. The recognition tasks reach 84.69% and 79.2% accuracy using our data and a state-of-the-art dataset, respectively. Numerical results show that motor imagery EEG signals can be successfully applied in BCI systems to control mobile robots and related applications such as intelligent vehicles.
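To make the classification pipeline concrete, the following is a minimal sketch of a hybrid CNN-LSTM classifier for 4-channel MI-EEG trials (F3, F4, FC5, FC6). It is not the authors' exact architecture: the layer sizes, kernel widths, two-class output, and the assumed 128 Hz sampling rate (a common Emotiv EPOC+ setting, giving 384 samples per 3-second trial) are illustrative assumptions.

# Hedged sketch of a CNN-LSTM classifier for 4-channel MI-EEG trials.
# All dimensions below are illustrative, not the paper's exact settings.
import torch
import torch.nn as nn

class CNNLSTM(nn.Module):
    def __init__(self, n_channels=4, n_classes=2, lstm_hidden=64):
        super().__init__()
        # Temporal 1-D convolutions extract local features from each trial.
        self.cnn = nn.Sequential(
            nn.Conv1d(n_channels, 16, kernel_size=7, padding=3),
            nn.BatchNorm1d(16),
            nn.ReLU(),
            nn.MaxPool1d(2),                      # halves the time dimension
            nn.Conv1d(16, 32, kernel_size=5, padding=2),
            nn.BatchNorm1d(32),
            nn.ReLU(),
            nn.MaxPool1d(2),
        )
        # The LSTM models the temporal evolution of the CNN feature sequence.
        self.lstm = nn.LSTM(input_size=32, hidden_size=lstm_hidden,
                            batch_first=True)
        self.fc = nn.Linear(lstm_hidden, n_classes)

    def forward(self, x):
        # x: (batch, channels=4, time), e.g. time = 3 s * 128 Hz = 384 samples
        feats = self.cnn(x)                       # (batch, 32, time / 4)
        feats = feats.permute(0, 2, 1)            # (batch, time / 4, 32)
        _, (h_n, _) = self.lstm(feats)
        return self.fc(h_n[-1])                   # class logits

# Example: one batch of 8 hypothetical 3-second trials at an assumed 128 Hz.
model = CNNLSTM()
logits = model(torch.randn(8, 4, 384))
print(logits.shape)  # torch.Size([8, 2])

The CNN front end reduces the time axis before the recurrent stage, which keeps the LSTM short enough for real-time inference on embedded hardware such as the SoCKit.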

Highlights

  • In the last decade, various practical applications have been developed using the electrical signals generated in the brain through Brain-Computer Interfaces (BCIs) [1,2]

  • Motor Imagery (MI)-EEG recognition was achieved by implementing a Convolutional Neural Network (CNN)-Long Short-Term Memory (LSTM) architecture and creating, training, and validating an EEG dataset (a cross-validation sketch follows this list)

  • This study presents the development of an embedded BCI system based on EEG
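As referenced above, the following is a hedged sketch of the k-fold evaluation described in the abstract. It reuses the hypothetical CNNLSTM class from the earlier sketch with synthetic stand-ins for the 2400 recorded trials; the fold count, binary labels, epoch count, and optimizer settings are illustrative assumptions rather than the paper's configuration.

# Hedged k-fold cross-validation sketch over 2400 synthetic 3-second trials
# (4 channels, assumed 128 Hz). Uses the CNNLSTM class defined above.
import numpy as np
import torch
from sklearn.model_selection import KFold

n_trials, n_channels, n_samples = 2400, 4, 384    # 3 s at an assumed 128 Hz
X = np.random.randn(n_trials, n_channels, n_samples).astype(np.float32)
y = np.random.randint(0, 2, size=n_trials)        # placeholder labels

accuracies = []
kfold = KFold(n_splits=5, shuffle=True, random_state=0)
for fold, (train_idx, test_idx) in enumerate(kfold.split(X)):
    model = CNNLSTM(n_channels=n_channels, n_classes=2)
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    criterion = torch.nn.CrossEntropyLoss()

    X_train = torch.from_numpy(X[train_idx])
    y_train = torch.from_numpy(y[train_idx]).long()
    model.train()
    for epoch in range(10):                       # short illustrative training
        optimizer.zero_grad()
        loss = criterion(model(X_train), y_train)
        loss.backward()
        optimizer.step()

    # Accuracy on the held-out fold.
    model.eval()
    with torch.no_grad():
        preds = model(torch.from_numpy(X[test_idx])).argmax(dim=1)
        acc = (preds == torch.from_numpy(y[test_idx]).long()).float().mean().item()
    accuracies.append(acc)
    print(f"fold {fold}: accuracy = {acc:.3f}")

print(f"mean accuracy = {np.mean(accuracies):.3f}")

As is conventional in k-fold cross-validation, the per-fold accuracies are averaged to report a single figure per dataset.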


Summary

Introduction

Various practical applications have been developed using the electrical signals generated in the brain through Brain-Computer Interfaces (BCIs) [1,2]. Electroencephalogram (EEG) signals are an example of such signals, currently used to control devices such as service robots, motorized wheelchairs, drones, and several other human-support machines. EEG signals have also found many applications in the health sciences for detecting mental diseases. A multi-modal machine learning approach integrating engineered EEG features was used to detect dementia [3]. Some neurodegenerative disorders of the brain, such as Alzheimer's disease, have been studied using EEG signals to improve disease detection [4]. Detection of schizophrenia risk in children between nine and 13 years of age has been developed using EEG patterns classified by traditional machine learning algorithms [5].

