Abstract

Understanding students’ emotional states during the learning process is an important aspect of improving learning quality. Emotion in an academic setting can be measured manually or automatically using a computer. However, developing an emotion recognition method based on an imaging modality that is contactless, harmless, and illumination-independent is challenging. Thermography, a non-invasive emotion recognition method, can capture emotion variance during learning by observing the temperature distribution in the facial region. Deep learning models, such as convolutional neural networks (CNNs), can be used to interpret thermograms: CNNs can automatically classify emotion thermograms into several emotional states, such as happiness, anger, sadness, and fear. Despite this promising ability, CNNs have not been widely used in emotion recognition. In this study, we aimed to summarize previous work and progress on emotion recognition in academic settings based on thermography and CNNs. We first discussed previous work on emotion recognition to provide an overview of the available modalities and their advantages and disadvantages. We also discussed the potential of emotion thermography in the academic context, examining whether the available emotion thermal datasets contain information on the subjects’ educational backgrounds. Emotion classification using the proposed CNN model was described step by step, including an illustration of feature learning. Lastly, we proposed future research directions: developing a representative dataset for academic settings, feeding the network segmented images, assigning a suitable kernel, and building a CNN model to improve recognition performance.
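
To illustrate how a CNN can map a facial thermogram to one of the emotional states mentioned above, the following is a minimal sketch only, not the model proposed in this study: it assumes single-channel 64x64 segmented facial thermograms, four target classes (happiness, anger, sadness, fear), and PyTorch; all layer sizes and names are hypothetical.

```python
# Illustrative sketch: a small CNN that maps single-channel facial
# thermograms (assumed 64x64 pixels) to four emotion classes
# (happiness, anger, sadness, fear). Layer sizes are hypothetical
# and do not reproduce the model proposed in the paper.
import torch
import torch.nn as nn

class ThermalEmotionCNN(nn.Module):
    def __init__(self, num_classes: int = 4):
        super().__init__()
        # Feature learning: stacked convolution + pooling blocks
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),  # thermogram has 1 channel
            nn.ReLU(),
            nn.MaxPool2d(2),                              # 64x64 -> 32x32
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),                              # 32x32 -> 16x16
        )
        # Classification: flatten the learned features and score each emotion
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, 64),
            nn.ReLU(),
            nn.Linear(64, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

# Example: classify a batch of two placeholder segmented face thermograms
model = ThermalEmotionCNN()
thermograms = torch.rand(2, 1, 64, 64)        # stand-in thermal images
predicted = model(thermograms).argmax(dim=1)  # indices into the emotion classes
```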
