Abstract
Affective states are an important aspect of being human. Therefore, for man-machine interaction to be natural and for machines to understand people, it is necessary to understand a person's emotional state. Non-verbal behavioral cues such as facial expressions and hand gestures provide a firm basis for recognizing a person's affective state. In this paper, we propose a novel, real-time framework that extracts dynamic information from video across multiple modalities to recognize a person's affective state. In the first step, we detect the person's face and hands in the video and create motion history images (MHIs) of both the face and the gesturing hands to encode the temporal dynamics of these two modalities. In the second step, features are extracted from the face and hand MHIs using the deep residual network ResNet-101 and concatenated into a single feature vector for recognition. These integrated features are used to create subspaces that lie on a Grassmann manifold. We then compute the Geodesic Flow Kernel (GFK) of this Grassmann manifold for domain adaptation and apply it within GGDA to robustly recognize a person's affective state from multiple modalities. An accuracy of 93.4% on the FABO (Gunes and Piccardi 19) dataset and 92.7% on our own dataset shows that the integrated face and hand modalities outperform state-of-the-art methods for affective state recognition.
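The first step of the pipeline encodes temporal dynamics as motion history images. As a rough illustration of the MHI idea (not the paper's implementation, which operates on detected face and hand regions), the sketch below updates an MHI from frame differencing: pixels where motion is detected are set to a duration `tau`, and all other pixels decay toward zero, so recent motion appears brighter than older motion. The threshold and `tau` values here are illustrative assumptions.

```python
import numpy as np

def update_mhi(mhi, frame, prev_frame, tau=15, thresh=30):
    """Update a motion history image (MHI).

    Pixels whose absolute frame difference exceeds `thresh` are
    considered "in motion" and set to `tau`; all other pixels
    decay by 1 per frame toward zero, so intensity encodes recency.
    """
    motion = np.abs(frame.astype(np.int16) - prev_frame.astype(np.int16)) > thresh
    return np.where(motion, tau, np.maximum(mhi - 1, 0))

# Toy example: a bright 3x3 block slides one pixel to the right per frame.
frames = [np.zeros((8, 8), dtype=np.uint8) for _ in range(3)]
for t, f in enumerate(frames):
    f[2:5, t:t + 3] = 255

mhi = np.zeros((8, 8), dtype=np.int16)
for prev, cur in zip(frames, frames[1:]):
    mhi = update_mhi(mhi, cur, prev)

# The most recent motion edges hold the value tau; earlier motion
# has decayed by one step, leaving a fading trail of the movement.
```

In the full framework, one such MHI is built per modality (face and hands), and the two MHIs are then fed to ResNet-101 for feature extraction.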