Abstract

This paper presents a non-contact system based on twin channels of thermal and visual image sequences to register the affective states of an individual during Human-Computer Interaction (HCI). Negative affective states such as stress, anxiety, and depression in students have raised significant concerns, which necessitates a smart HCI system as an assisting tool for psychologists. In this paper, we propose a two-stage smart system that classifies the affective state by clustering sequences of emotional states. The first stage obtains the dominant emotional state from an ensemble of cues in visual and thermal facial images using a newly proposed cascaded Convolutional Neural Network (CNN) model. We name this 16-layer network EmoScale, as it classifies the dominant emotional state of an individual. The second stage clusters a sequence of the obtained emotional states, using a trained Hidden Markov Model (HMM), into one of four dominant affective states: stress, depression, anxiety, or healthy. We perform five-fold cross-validation of EmoScale on our self-prepared dataset as well as on the hetero-face database. The performance of the second stage is compared against the standard Depression Anxiety Stress Scale (DASS) on 51 subjects, and the results are found to be promising.
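The second stage described above — scoring a sequence of per-frame emotion labels against one trained HMM per affect class and picking the most likely class — can be sketched as follows. This is a minimal illustration only, not the paper's implementation: the label set, the HMM parameters, and the function names are hypothetical, and stage 1 (the EmoScale CNN) is abstracted away into an already-produced emotion sequence.

```python
import numpy as np

# Hypothetical per-frame emotion labels produced by stage 1 (EmoScale).
EMOTIONS = ["happy", "sad", "angry", "fear", "neutral"]

def hmm_log_likelihood(obs, pi, A, B):
    """Forward algorithm in log space: log P(obs | HMM).

    obs : sequence of emotion indices (the stage-1 output)
    pi  : initial state probabilities, shape (S,)
    A   : state transition matrix, shape (S, S)
    B   : emission matrix over emotion labels, shape (S, len(EMOTIONS))
    """
    log_alpha = np.log(pi) + np.log(B[:, obs[0]])
    for o in obs[1:]:
        # alpha_t(j) = sum_i alpha_{t-1}(i) * A[i, j] * B[j, o_t]
        log_alpha = (np.logaddexp.reduce(log_alpha[:, None] + np.log(A), axis=0)
                     + np.log(B[:, o]))
    return np.logaddexp.reduce(log_alpha)

def classify_affect(emotion_seq, hmms):
    """Stage 2: return the affect whose HMM best explains the emotion sequence."""
    scores = {affect: hmm_log_likelihood(emotion_seq, *params)
              for affect, params in hmms.items()}
    return max(scores, key=scores.get)

# Toy 2-state HMMs with made-up parameters, one per affect class
# (a real system would train one HMM per class on labeled sequences).
pi = np.full(2, 0.5)
A = np.full((2, 2), 0.5)
B_stress = np.array([[0.05, 0.10, 0.60, 0.20, 0.05],
                     [0.05, 0.10, 0.50, 0.30, 0.05]])   # favors angry/fear
B_healthy = np.array([[0.50, 0.05, 0.05, 0.05, 0.35],
                      [0.40, 0.05, 0.05, 0.05, 0.45]])  # favors happy/neutral
hmms = {"stress": (pi, A, B_stress), "healthy": (pi, A, B_healthy)}

seq = [EMOTIONS.index(e) for e in ["angry", "angry", "fear", "angry"]]
print(classify_affect(seq, hmms))  # a sequence dominated by angry/fear
```

With emissions skewed as above, a sequence dominated by angry/fear frames scores higher under the "stress" HMM, which is the maximum-likelihood decision rule the sketch encodes.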

