Abstract

In recent years, human emotion analysis has gradually shifted from relying on visible information alone to also exploiting thermal infrared (IR) information. This shift requires large amounts of facial emotion data in both the visible and thermal IR modalities. However, most existing databases contain only visible information or only posed thermal IR data. For these reasons, we propose and establish a multimodal facial emotion database containing both visible and thermal IR videos of naturally spontaneous expressions. Besides adding thermal IR information, the dataset built in this study also provides emotion-intensity annotations: each emotion is labeled at one of three levels (low, medium, and high). Seven spontaneous emotions from thirty subjects are recorded in the database, with audio and visual stimuli used to elicit emotions during the experiment. After the standard data-collection procedure was completed, the database went through a careful annotation and verification process. Furthermore, the database is analyzed using modern machine learning models such as CNNs, ResNet50, and YOLO, as well as combinations of these models. The results obtained show that the dataset is useful in practice, and the thermal data analysis suggests a promising direction for future research on estimating human emotion.
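To make the kind of analysis described in the abstract concrete, the sketch below shows one plausible way to fine-tune a pretrained ResNet50 for seven-class emotion recognition on frames from such a database. This is not the authors' pipeline: the directory layout, preprocessing, and hyperparameters are assumptions for illustration only, and thermal IR frames would in practice need their own normalization and possibly channel handling.

```python
# Minimal illustrative sketch (assumed setup, not the paper's actual method):
# fine-tune a torchvision ResNet50 on emotion-labeled face images.
import torch
import torch.nn as nn
from torchvision import models, transforms, datasets
from torch.utils.data import DataLoader

NUM_EMOTIONS = 7  # seven spontaneous emotion categories, as in the abstract

# Standard ImageNet-style preprocessing; real thermal IR data would likely
# need different normalization statistics.
preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# Hypothetical directory layout: one sub-folder per emotion label.
train_set = datasets.ImageFolder("data/visible/train", transform=preprocess)
train_loader = DataLoader(train_set, batch_size=32, shuffle=True)

# Pretrained backbone with the final layer replaced for 7 emotion classes.
model = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, NUM_EMOTIONS)

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model = model.to(device)
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

# One pass over the training data (a real experiment would run many epochs
# and evaluate on held-out subjects).
model.train()
for images, labels in train_loader:
    images, labels = images.to(device), labels.to(device)
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
```

The same recipe could be repeated per modality (visible and thermal IR) or per intensity level (low, medium, high), with the resulting predictions combined, which is one way a "combination of different models" could be realized.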
