Abstract

With the continuous development of portable noninvasive human sensor technologies such as brain–computer interfaces (BCI), multimodal emotion recognition has attracted increasing attention in the area of affective computing. This paper primarily discusses the progress of research into multimodal emotion recognition based on BCI and reviews three types of multimodal affective BCI (aBCI): aBCI based on a combination of behavior and brain signals, aBCI based on various hybrid neurophysiology modalities and aBCI based on heterogeneous sensory stimuli. For each type of aBCI, we further review several representative multimodal aBCI systems, including their design principles, paradigms, algorithms, experimental results and corresponding advantages. Finally, we identify several important issues and research directions for multimodal emotion recognition based on BCI.

Highlights

  • Emotion is a general term for a series of subjective cognitive experiences

  • We find that emotion-recognition models based on multimodal brain–computer interfaces (BCI) achieve better performance than models based on single-modality BCI. We also identify current challenges in academic research on, and engineering applications of, emotion recognition and provide some potential solutions

  • Among the many brain measurement techniques (including EEG and functional magnetic resonance imaging), we believe that multimodal affective BCI (aBCI) systems based on EEG signals can greatly improve the results of emotion recognition [14,15]


Summary

Introduction

Emotion is a general term for a series of subjective cognitive experiences. Emotions consist of a set of psychological states generated by various feelings, thoughts and behaviors. Affective BCI (aBCI) [2] originated from a research project in the general communication field that attempted to create neurophysiological devices to detect emotional-state signals and to use the detected information to promote human–computer interaction. There are few comprehensive summaries and discussions of multimodal EEG-based emotion recognition systems. With this problem in mind, this paper presents the concepts and applications of multimodal emotion recognition to the aBCI community. Three main types of multimodal aBCI are distinguished: aBCI based on a combination of behavior and brain signals, aBCI based on various hybrid neurophysiology modalities, and aBCI based on heterogeneous sensory stimuli. For each type of aBCI, we review several representative multimodal aBCI systems and analyze the main components of each system, including design principles, stimulus paradigms, fusion methods, experimental results and relative advantages.
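The fusion methods mentioned above typically fall into two families: feature-level fusion, which concatenates per-modality feature vectors into one joint representation before classification, and decision-level fusion, which combines the class-probability outputs of per-modality classifiers. The sketch below illustrates both with NumPy; the feature dimensions, trial counts, and modality weights are illustrative assumptions, not values from the reviewed systems, and random arrays stand in for real EEG and eye-movement features.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-trial features (sizes are assumptions for illustration):
# 32 trials, 62 EEG differential-entropy features, 31 eye-movement features.
eeg_feats = rng.standard_normal((32, 62))
eye_feats = rng.standard_normal((32, 31))

def feature_level_fusion(*modalities):
    """Concatenate per-modality feature vectors into one joint representation."""
    return np.concatenate(modalities, axis=1)

def decision_level_fusion(*probs, weights=None):
    """Combine per-modality class-probability estimates by (weighted) averaging."""
    stacked = np.stack(probs)                # (n_modalities, n_trials, n_classes)
    w = np.ones(len(probs)) if weights is None else np.asarray(weights, float)
    w = w / w.sum()                          # normalize so outputs stay probabilities
    return np.tensordot(w, stacked, axes=1)  # (n_trials, n_classes)

fused = feature_level_fusion(eeg_feats, eye_feats)
print(fused.shape)  # (32, 93)

# Placeholder per-modality classifier outputs over 3 emotion classes.
p_eeg = rng.dirichlet(np.ones(3), size=32)
p_eye = rng.dirichlet(np.ones(3), size=32)
p = decision_level_fusion(p_eeg, p_eye, weights=[0.7, 0.3])
print(p.shape)      # (32, 3)
```

In practice the fused feature matrix would be passed to a single classifier, while decision-level fusion lets each modality keep its own model; which works better is one of the empirical questions the reviewed systems address.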

Multimodal Affective BCI
Multimodal Fusion Method
Multimodal Open Databases and Representative Research
Combination of Behavior and Brain Signals
EEG and Eye Movement
EEG and Facial Expressions
Various Hybrid Neurophysiology Modalities
EEG and Peripheral Physiology
EEG and Other Neuroimaging Modalities
Heterogeneous Sensory Stimuli
Audio-Visual Emotion Recognition
Visual-Olfactory Emotion Recognition
Open Challenges and Opportunities
Paradigm Design
Stimulus-Independent and Passive Paradigms
Stimulus-Independent and Active Paradigms
Stimulus-Dependent and Passive Paradigms
Stimulus-Dependent and Active Paradigms
Modality Measurement
EEG Noise Reduction and Artifact Removal
EEG Band and Channel Selection
Feature Optimization
Generalization of Model
Medical Applications
Non-Medical Applications
Findings
Conclusions
