Abstract

In this paper, we present a method to improve emotion recognition based on the fusion of local cortical activations and dynamic functional network patterns. We estimate the cortical activations using power spectral density (PSD) with the Burg autoregressive model, and we estimate the functional connectivity networks using the phase locking value (PLV). The cortical activations and connectivity networks show distinct patterns across three emotions at all frequency bands. Moreover, fusion significantly improves the classification rate in terms of accuracy, sensitivity, specificity, and the area under the receiver operating characteristic curve (AROC), p < 0.05. The average improvement with fusion across all evaluation metrics is 6.84% and 4.1% when compared to PSD and PLV alone, respectively. These results clearly demonstrate the advantage of fusing cortical activations with dynamic functional networks for developing human-computer interaction systems in real-world applications.
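The two feature types named above can be sketched in a few lines of Python. This is a minimal illustration under our own assumptions (signal lengths, model order, sampling rate, and all function names below are hypothetical), not the paper's implementation: Burg's method fits an autoregressive model whose spectrum gives the PSD, and the PLV is the magnitude of the mean phase-difference phasor between two channels.

```python
import numpy as np
from scipy.signal import freqz, hilbert

def burg_ar(x, order):
    """Estimate AR coefficients (a[0] = 1) and noise variance via Burg's method."""
    x = np.asarray(x, dtype=float)
    f, b = x[1:].copy(), x[:-1].copy()          # forward / backward prediction errors
    a, E = np.array([1.0]), np.dot(x, x) / len(x)
    for _ in range(order):
        k = -2.0 * np.dot(f, b) / (np.dot(f, f) + np.dot(b, b))   # reflection coefficient
        a = np.concatenate([a, [0.0]]) + k * np.concatenate([[0.0], a[::-1]])
        E *= 1.0 - k * k                          # prediction-error power shrinks each step
        f, b = f[1:] + k * b[1:], b[:-1] + k * f[:-1]
    return a, E

def burg_psd(x, order, fs, nfft=512):
    """AR power spectral density: PSD(f) proportional to E / |A(f)|^2."""
    a, E = burg_ar(x, order)
    freqs, h = freqz([1.0], a, worN=nfft, fs=fs)  # h = 1 / A(f)
    return freqs, E * np.abs(h) ** 2

def plv(x, y):
    """Phase locking value: |mean(exp(i * phase difference))|, phases via Hilbert transform."""
    dphi = np.angle(hilbert(x)) - np.angle(hilbert(y))
    return np.abs(np.mean(np.exp(1j * dphi)))

# Toy check: two noisy, phase-locked 10 Hz tones (alpha-band-like), fs = 128 Hz.
fs = 128.0
t = np.arange(1024) / fs
rng = np.random.default_rng(0)
x = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(t.size)
y = np.sin(2 * np.pi * 10 * t + 0.7) + 0.1 * rng.standard_normal(t.size)

freqs, psd = burg_psd(x, order=8, fs=fs)
print(freqs[np.argmax(psd)])   # spectral peak near 10 Hz
print(plv(x, y))               # near 1 for phase-locked channels
```

In a fusion scheme of the kind described, band-limited PSD values per channel and pairwise PLVs would then be concatenated into a single feature vector for the classifier.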

Highlights

  • Emotion recognition facilitates the interaction between humans and intelligent machines

  • We present the results of emotion classification based on cortical activations and connectivity network patterns

  • We present results at all frequency bands; only the frequency bands most highly associated with emotion are used in the classification evaluations

Introduction

Emotion recognition facilitates the interaction between humans and intelligent machines. It is a critical factor in several domains, such as human-robot interaction, characterizing the level of interest in learning, identifying the level of vigilance for road safety, and detecting a patient's mental and physical state and progress in recovery [1]–[3]. Different approaches have been considered to measure emotions, including methods based on speech, facial expressions, physiological measurements, and self-assessment [3]–[6]. Social expectations may bias self-assessment of emotions, speech, and facial expressions, since subjects may conceal their feelings and thereby influence these measures.
