Abstract
The purpose of this study is to improve human emotion classification accuracy using a convolutional neural network (CNN) model and to suggest an overall method for classifying emotion based on multimodal data. We improved classification performance by combining electroencephalogram (EEG) and galvanic skin response (GSR) signals. GSR signals are preprocessed using the zero-crossing rate. Sufficient EEG feature extraction can be obtained through a CNN; therefore, we propose a CNN model suited to feature extraction by tuning the hyperparameters of its convolution filters. The EEG signal is preprocessed prior to convolution with a wavelet transform, which considers time and frequency simultaneously. We use the Database for Emotion Analysis using Physiological Signals (DEAP) open dataset to verify the proposed process, achieving 73.4% accuracy and showing a significant performance improvement over current best-practice models.
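As an illustration of the preprocessing described above, the sketch below computes a per-frame zero-crossing rate for a GSR signal and a continuous wavelet transform (time-frequency image) for one EEG channel. The frame length, sampling rate, wavelet family (Morlet), and scale range are placeholder assumptions for illustration, not the parameters used in the study.

```python
import numpy as np
import pywt  # PyWavelets

FS = 128  # assumed sampling rate (Hz); placeholder for illustration

def gsr_zero_crossing_rate(gsr, frame_len=FS):
    """Zero-crossing rate of a mean-removed GSR signal, computed per frame."""
    x = np.asarray(gsr) - np.mean(gsr)          # remove DC offset so crossings are meaningful
    n_frames = len(x) // frame_len
    rates = []
    for i in range(n_frames):
        frame = x[i * frame_len:(i + 1) * frame_len]
        crossings = np.sum(np.abs(np.diff(np.sign(frame))) > 0)
        rates.append(crossings / frame_len)
    return np.asarray(rates)

def eeg_time_frequency(eeg_channel, scales=np.arange(1, 65), wavelet="morl"):
    """Continuous wavelet transform of one EEG channel.
    Returns |coefficients| as a (scales x time) image, i.e. a joint
    time-frequency representation that a CNN can take as input."""
    coeffs, _freqs = pywt.cwt(eeg_channel, scales, wavelet,
                              sampling_period=1.0 / FS)
    return np.abs(coeffs)
```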
Highlights
Multimodal human-computer interaction (HCI) has been actively researched over the last few years
Previous studies have shown that changes in skin signals (i.e., galvanic skin response (GSR)) are closely related to peripheral nervous activity accompanying emotional changes [2], and that electroencephalogram (EEG) signals from the frontal lobe are strongly related to emotional changes [3,4]
We used the two class labels commonly adopted in previous studies to compare performance, as measured by arousal and valence classification accuracy on the Database for Emotion Analysis using Physiological Signals (DEAP) dataset (see the label-binarization sketch after this list)
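A minimal sketch of the two-class labelling convention mentioned above: DEAP self-assessment ratings lie on a 1-9 scale, and prior studies commonly threshold them into high/low arousal and high/low valence classes. The threshold value of 5 used here is an assumption for illustration, not necessarily the one used in this study.

```python
import numpy as np

def binarize_ratings(ratings, threshold=5.0):
    """Map DEAP self-assessment ratings (1-9 scale) to two classes:
    high (>= threshold) vs. low (< threshold) for arousal or valence."""
    return (np.asarray(ratings) >= threshold).astype(int)

# Example: valence ratings for a few hypothetical trials
labels = binarize_ratings([2.3, 7.8, 5.0, 4.9])   # -> [0, 1, 1, 0]
```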
Summary
Multimodal human-computer interaction (HCI) has been actively researched over the last few years. Most previous research has classified emotions using only facial expressions. Facial expressions represent only part of the overall human emotional response, and emotion discriminators relying on them can sometimes make significant mistakes. Biological signals from the central nervous system (CNS) and the peripheral nervous system (PNS) are hard for humans to control consciously and can represent emotions accurately. Previous studies have shown that changes in skin signals (i.e., galvanic skin response (GSR)) are closely related to peripheral nervous activity accompanying emotional changes [2], and that electroencephalogram (EEG) signals from the frontal lobe are strongly related to emotional changes [3,4]. The current study classified emotions using biological signals, including EEG and GSR.
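For orientation only, the sketch below shows a generic 2D-CNN that could classify the wavelet time-frequency images into high/low arousal or valence. The layer counts, filter sizes, and input dimensions are placeholders and do not reflect the tuned hyperparameters proposed in the paper.

```python
import tensorflow as tf

def build_cnn(input_shape=(64, 1024, 1)):
    """Generic 2D-CNN over a wavelet time-frequency image (scales x time x 1).
    Filter counts and kernel sizes are placeholder hyperparameters."""
    return tf.keras.Sequential([
        tf.keras.layers.Input(shape=input_shape),
        tf.keras.layers.Conv2D(16, (3, 3), activation="relu"),
        tf.keras.layers.MaxPooling2D((2, 4)),
        tf.keras.layers.Conv2D(32, (3, 3), activation="relu"),
        tf.keras.layers.MaxPooling2D((2, 4)),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(1, activation="sigmoid"),  # high vs. low class
    ])

model = build_cnn()
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
```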