Abstract
How to efficiently tag the emotional experience of multimedia content is an important and challenging problem in the field of affective computing. This paper presents an EEG-based real-time emotion tagging approach that extracts inter-brain features from a group of participants as they watch the same emotional video clips. First, continuous subjective reports on both the arousal and valence dimensions of emotion were obtained using a three-round behavioral rating paradigm. Second, inter-brain features were systematically explored in both the spectral and temporal domains. Finally, regression analyses were performed to evaluate the effectiveness of the inter-brain amplitude and phase features. The inter-brain amplitude feature showed significantly better prediction performance than the inter-brain phase feature, as well as two other conventional features (spectral power and inter-subject correlation). By combining the four types of features, regression values (R²) of 0.61 ± 0.01 for arousal and 0.70 ± 0.01 for valence were obtained, corresponding to prediction errors of 1.01 ± 0.02 and 0.78 ± 0.02 (on 9-point scales), respectively. The contributions of different electrodes and frequency bands were also analyzed. Our results demonstrate the promising potential of inter-brain EEG features for real-time emotion tagging applications.
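The abstract does not spell out the feature definitions, but a minimal sketch of the inter-brain amplitude idea, correlating band-limited amplitude envelopes across participants and regressing the result on continuous ratings, might look like the following. All shapes, the alpha-band choice, and the Ridge regressor are illustrative assumptions, not the paper's actual pipeline.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert
from sklearn.linear_model import Ridge

def band_envelope(eeg, fs, low, high):
    """Band-pass one participant's EEG (channels x samples) and
    return the Hilbert amplitude envelope."""
    b, a = butter(4, [low / (fs / 2), high / (fs / 2)], btype="band")
    return np.abs(hilbert(filtfilt(b, a, eeg, axis=-1), axis=-1))

def inter_brain_amplitude(env):
    """Assumed inter-brain amplitude feature: mean pairwise Pearson
    correlation of amplitude envelopes across participants, per channel.
    env has shape (subjects, channels, samples)."""
    n_sub, n_ch, _ = env.shape
    feat = np.empty(n_ch)
    for ch in range(n_ch):
        r = [np.corrcoef(env[i, ch], env[j, ch])[0, 1]
             for i in range(n_sub) for j in range(i + 1, n_sub)]
        feat[ch] = np.mean(r)
    return feat

# Synthetic stand-in data: 8 subjects, 32 channels, 4 s windows at 250 Hz.
rng = np.random.default_rng(0)
fs, n_sub, n_ch, n_win = 250, 8, 32, 40
X = np.array([
    inter_brain_amplitude(
        np.stack([band_envelope(rng.standard_normal((n_ch, fs * 4)), fs, 8, 13)
                  for _ in range(n_sub)]))   # alpha band (8-13 Hz), per window
    for _ in range(n_win)
])                                           # (windows, channels) feature matrix
y = rng.uniform(1, 9, n_win)                 # placeholder 9-point arousal ratings
model = Ridge().fit(X, y)                    # windowed regression on ratings
print("R^2 on training windows:", model.score(X, y))
```

In the paper's setup, such per-window features would be stacked over the whole video and regressed against the continuous arousal and valence reports; the inter-brain phase variant would replace the envelope correlation with a phase-synchrony measure.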