Abstract

This paper describes the design of ensemble deep models based on time-scale transformations of electrocardiogram (ECG) signals for emotion recognition. As the number of senior citizens living alone increases, emotion robots that can interact with them are becoming increasingly important. Existing emotion robots typically recognize emotions from images of the user's facial expressions or from voice signals. However, in many environments these modalities cannot capture the user's emotions reliably. Research is therefore actively being conducted on recognizing the user's emotions from biomedical signals, in particular the ECG. The proposed method converts ECG signals into several types of two-dimensional time-scale representations. We then design a four-stream deep learning model by combining these representations in an ensemble and applying transfer learning. Finally, an experiment was conducted on the ASCERTAIN emotion database, which contains recordings from 58 participants labeled with nine emotions. Of these, we used six representative emotions (surprise, happiness, anger, disgust, fear, and sadness). The experimental results show that the proposed ensemble deep models outperform both each single-stream deep model and a baseline model trained on the untransformed signals.
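The core preprocessing step described above, converting a one-dimensional ECG signal into a two-dimensional time-scale representation (a scalogram), can be sketched with a continuous wavelet transform. The following is a minimal NumPy illustration, not the paper's actual pipeline: the Morlet wavelet, the 250 Hz sampling rate, the scale grid, and the synthetic pulse train standing in for an ECG trace are all illustrative assumptions.

```python
import numpy as np

def morlet(t, scale, w0=6.0):
    # Complex Morlet wavelet sampled at times t, dilated by `scale` (seconds).
    x = t / scale
    return np.exp(1j * w0 * x) * np.exp(-0.5 * x ** 2) / np.sqrt(scale)

def scalogram(sig, scales, fs):
    """Return |CWT| of a 1-D signal as a (len(scales), len(sig)) image."""
    n = len(sig)
    out = np.empty((len(scales), n))
    for i, s in enumerate(scales):
        # Truncate the wavelet support so it never exceeds the signal length.
        half = int(min(10 * s * fs, (n - 1) // 2))
        t = np.arange(-half, half + 1) / fs
        out[i] = np.abs(np.convolve(sig, morlet(t, s), mode="same"))
    return out

# Synthetic "ECG-like" pulse train: one beat per second over 4 s at 250 Hz.
fs = 250.0
t = np.arange(0, 4, 1 / fs)
ecg = np.exp(-((t % 1.0 - 0.5) ** 2) / 0.001)

scales = np.geomspace(0.01, 0.5, 64)  # 64 scales, 10 ms to 500 ms
img = scalogram(ecg, scales, fs)
print(img.shape)  # -> (64, 1000): a 2-D image ready for a CNN stream
```

Each such image (one per transformation type) would feed one stream of the four-stream ensemble, so the CNN streams can reuse weights pretrained on natural images via transfer learning.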
