Abstract

Automating the identification of human brain responses during head movements could be a significant step forward for human computer interaction (HCI), with important applications for severely impaired people and for robotics. In this paper, a neural network-based identification technique is presented to recognize, from EEG signals, a participant's head yaw rotations when the participant is subjected to a visual stimulus. The goal is to identify an input-output function between the brain's electrical activity and the head movement triggered by switching a light on/off on the participant's left- or right-hand side. The identification process is based on the Levenberg–Marquardt backpropagation algorithm. The results obtained on ten participants, spanning more than two hours of experiments, show the ability of the proposed approach to identify the brain electrical activity associated with head turning. A first analysis is applied to the EEG signals of each experiment for each participant. The accuracy of prediction is demonstrated by a significant correlation between training and test trials of the same file, which in the best case reaches r = 0.98 with MSE = 0.02. In a second analysis, the input-output function trained on the EEG signals of one participant is tested on the EEG signals of the other participants. In this case, the low correlation coefficients show that classifier performance decreases when it is trained and tested on different subjects.
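The accuracy figures reported above (best case r = 0.98, MSE = 0.02) compare the network's predicted output against the measured head-yaw signal. As a minimal sketch, assuming the predicted and measured signals are available as NumPy arrays (the `target` and `prediction` signals below are hypothetical stand-ins, not the paper's data), the two metrics can be computed as:

```python
import numpy as np

def evaluate_prediction(y_true, y_pred):
    """Compare a predicted and a measured signal using Pearson's r
    and the mean squared error, the two metrics used to assess
    the identified input-output function."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    r = np.corrcoef(y_true, y_pred)[0, 1]   # Pearson correlation
    mse = np.mean((y_true - y_pred) ** 2)   # mean squared error
    return r, mse

# Illustrative check: a prediction that closely tracks its target
t = np.linspace(0.0, 1.0, 200)
target = np.sin(2 * np.pi * t)               # stand-in for a yaw trajectory
prediction = target + 0.05 * np.cos(7 * t)   # small deterministic error
r, mse = evaluate_prediction(target, prediction)
```

A prediction tracking its target this closely yields r near 1 and MSE near 0, which is the regime the best within-participant trials reach.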

Highlights

  • In human computer interaction (HCI), design and application of brain–computer interfaces (BCIs) are among the main challenging research activities

  • Brain signal acquisition may be realized by different devices such as Electroencephalography (EEG), Magnetoencephalography (MEG), Electrocorticography (ECoG), or functional near infrared spectroscopy [5]

  • Classification is the central element of the BCI; it refers to the identification of the correct translation algorithm, which converts the extracted signal features into control commands for the devices according to the user's intention



Introduction

In human computer interaction (HCI), design and application of brain–computer interfaces (BCIs) are among the main challenging research activities. BCI technologies aim at converting human mental activities into electrical brain signals, producing a control command feedback to external devices such as robot systems [1]. The essential stages of a BCI application consist of signal acquisition of the brain activities, preprocessing and feature extraction, classification, and feedback. Beyond the ANN classifier itself, the scope of the application represents the main novelty: we explore the recognition, from EEG brain activity, of yaw head rotations directed toward a light target, to support driving tasks in different applications, such as controlling an autonomous vehicle, a wheelchair, or a robot in general.
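The identification algorithm named above, Levenberg–Marquardt backpropagation, trains a feedforward network by treating the fit as a nonlinear least-squares problem. As a hedged sketch of the general technique (not the authors' implementation), a tiny network can be fit with SciPy's Levenberg–Marquardt solver; the synthetic data, network size, and target function below are illustrative assumptions only:

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)

# Toy data: 2 input "features", 1 output (stand-ins, not EEG data)
X = rng.uniform(-1.0, 1.0, size=(100, 2))
y = np.tanh(X[:, 0] - 0.5 * X[:, 1])  # smooth target function

n_in, n_hid = 2, 3  # one hidden tanh layer with 3 units

def unpack(theta):
    """Split the flat parameter vector into layer weights and biases."""
    W1 = theta[:n_in * n_hid].reshape(n_in, n_hid)
    b1 = theta[n_in * n_hid:n_in * n_hid + n_hid]
    W2 = theta[n_in * n_hid + n_hid:n_in * n_hid + 2 * n_hid]
    b2 = theta[-1]
    return W1, b1, W2, b2

def forward(theta, X):
    W1, b1, W2, b2 = unpack(theta)
    return np.tanh(X @ W1 + b1) @ W2 + b2

def residuals(theta):
    # Levenberg-Marquardt minimizes the sum of squared residuals
    return forward(theta, X) - y

theta0 = rng.normal(scale=0.5, size=n_in * n_hid + 2 * n_hid + 1)
fit = least_squares(residuals, theta0, method='lm')  # LM solver
mse = np.mean(residuals(fit.x) ** 2)
```

Levenberg–Marquardt interpolates between gradient descent and Gauss–Newton, which is why it converges quickly on small regression networks like the input-output function identified here.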

