Abstract

Emotion recognition is a burgeoning field allowing for more natural human-machine interactions and interfaces. Electroencephalography (EEG) has been shown to be a useful modality with which user emotional states can be measured and monitored, particularly primitives such as valence and arousal. In this paper, we propose the use of ordinal pattern analysis, also called motifs, for improved EEG-based emotion recognition. Motifs capture recurring structures in time series and are inherently robust to noise, and are thus well suited for the task at hand. Several connectivity, asymmetry, and graph-theoretic features are proposed and extracted from the motifs to be used for affective state recognition. Experiments with a widely used public database are conducted, and results show the proposed features outperforming benchmark spectrum-based features, as well as other more recent nonmotif-based graph-theoretic features and amplitude modulation-based connectivity/asymmetry measures. Feature- and score-level fusion suggest complementarity between the proposed and benchmark spectrum-based measures. When combined, the fused models can provide up to a 9% improvement relative to benchmark features alone and up to 16% relative to nonmotif-based graph-theoretic features.
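The core idea behind ordinal pattern (motif) analysis mentioned above can be illustrated with a minimal sketch. This is a hypothetical helper, not the authors' implementation: each short window of the signal is reduced to the permutation that sorts its samples, so only the relative ordering of amplitudes is retained, which is what makes motifs robust to amplitude noise.

```python
from itertools import permutations
import numpy as np

def ordinal_patterns(x, order=3, delay=1):
    """Map a 1-D signal to a sequence of ordinal-pattern (motif) indices.

    Each window of `order` samples (spaced `delay` apart) is replaced by
    the index of the permutation that sorts it, discarding amplitude
    information and keeping only the relative ordering.
    """
    x = np.asarray(x, dtype=float)
    # Lookup table from permutation tuple to motif index (0 .. order!-1).
    lookup = {p: i for i, p in enumerate(permutations(range(order)))}
    n = len(x) - (order - 1) * delay
    patterns = np.empty(n, dtype=int)
    for t in range(n):
        window = x[t : t + order * delay : delay]
        patterns[t] = lookup[tuple(np.argsort(window))]
    return patterns

# Toy example: a monotonically increasing signal produces only the
# identity pattern (0, 1, 2), i.e. motif index 0 in every window.
motifs = ordinal_patterns(np.arange(10), order=3)
```

From such motif sequences, per-channel motif distributions and cross-channel co-occurrence statistics can then be derived, which is the general family of connectivity/asymmetry features the paper builds on.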

Highlights

  • Human-machine interaction can become more natural once machines become aware of their surroundings and their users [1, 2]. These so-called context-aware or affective interfaces can open up new dimensions of device functionality, more accurately addressing human needs while keeping the interfaces as natural as possible [3]

  • Feature selection was implemented on the benchmark features alone, the proposed motif features alone, and the combined benchmark-motif set. The optimal balanced accuracy (BACC) values obtained are shown in Tables 2–4, respectively, along with the final number of features used in the models

  • For ANOVA-based feature selection, fewer than 10 features were used in the models for both the valence and arousal dimensions with the benchmark features, representing roughly one-sixth of the total number of available features
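ANOVA-based feature selection, as used in the highlights above, ranks each feature by its one-way ANOVA F-statistic against the class labels and keeps the top-scoring ones. The following is a minimal sketch of that idea (hypothetical helper functions, not the authors' pipeline):

```python
import numpy as np

def anova_f_scores(X, y):
    """One-way ANOVA F-statistic per feature (column of X).

    Measures how strongly each feature's between-class variance
    separates the classes (e.g. high vs. low valence) relative to
    its within-class variance.
    """
    X = np.asarray(X, dtype=float)
    y = np.asarray(y)
    classes = np.unique(y)
    grand_mean = X.mean(axis=0)
    ss_between = np.zeros(X.shape[1])
    ss_within = np.zeros(X.shape[1])
    for c in classes:
        Xc = X[y == c]
        class_mean = Xc.mean(axis=0)
        ss_between += len(Xc) * (class_mean - grand_mean) ** 2
        ss_within += ((Xc - class_mean) ** 2).sum(axis=0)
    df_between = len(classes) - 1
    df_within = len(X) - len(classes)
    return (ss_between / df_between) / (ss_within / df_within)

def select_top_k(X, y, k=10):
    """Indices of the k features with the largest F-scores."""
    scores = anova_f_scores(X, y)
    return np.argsort(scores)[::-1][:k]
```

In practice, a library routine such as scikit-learn's `SelectKBest` with `f_classif` computes the same ranking; the sketch above just makes the statistic explicit.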


Introduction

Human-machine interaction can become more natural once machines become aware of their surroundings and their users [1, 2]. These so-called context-aware or affective interfaces can open up new dimensions of device functionality, more accurately addressing human needs while keeping the interfaces as natural as possible [3]. Affective computing can enable applications in which the machine can learn user preferences based on their reactions to different settings, or even become a more effective tutor by assessing the student's emotional/stress states [3]. Human emotions are usually conceived as physiological and physical responses and are part of natural human-human communication. Because emotions are known to affect neurophysiological signals, biosignal monitoring has been extensively explored. Representative physiological signal modalities have included galvanic skin response (GSR), skin temperature, breathing, and cardiac activity (via electrocardiography (ECG) and photoplethysmography (PPG)) [15, 16, 17, 18]

