Abstract

Human gait is an emerging biometric feature that carries important information for long-distance emotion recognition. However, sadness and neutral emotions are easily confused during recognition because the body postures associated with the two emotions are quite similar. Existing methods struggle to distinguish these two emotions satisfactorily: they either treat all action features equally, without differentiating their contributions to recognition, or they ignore the motion information in gait that can express emotion. In this paper, we propose a novel hierarchical attention neural network that automatically learns the affective features contained in human motion and action and effectively distinguishes between sad and neutral emotions. The network consists of three modules: a motion sentiment module (MSM), an action sentiment module (ASM), and an emotion classifier. Specifically, MSM is composed of a position encoder and a velocity encoder; it extracts affective features from motion information and helps distinguish sad from neutral emotions based on gait velocity. ASM consists of an action encoder that compresses discriminative human actions into a latent space. The emotion classifier recognizes the emotion from the outputs of MSM and ASM. Moreover, we present a feature preprocessing method to address the problem of imbalanced data categories. Experiments demonstrate that our approach enhances the discriminability of sad and neutral emotions and outperforms many state-of-the-art methods. In addition, ablation experiments further verify that both velocity and action features are important for gait-based emotion recognition.
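To make the three-module design concrete, the following is a minimal PyTorch sketch of the architecture described above. All layer types, hidden sizes, the joint count, the GRU encoders, and the four-class output are illustrative assumptions for the sketch, not the authors' implementation; names such as MotionSentimentModule are hypothetical.

```python
# Hypothetical sketch of the MSM/ASM/classifier pipeline from the abstract.
# Encoder designs and dimensions are assumptions, not the paper's actual model.
import torch
import torch.nn as nn

class MotionSentimentModule(nn.Module):
    """MSM: encodes joint positions and per-frame velocities separately."""
    def __init__(self, n_joints=16, hidden=64):
        super().__init__()
        # Each frame is a flat vector of 3D joint coordinates.
        self.position_encoder = nn.GRU(n_joints * 3, hidden, batch_first=True)
        self.velocity_encoder = nn.GRU(n_joints * 3, hidden, batch_first=True)

    def forward(self, poses):                    # poses: (B, T, n_joints*3)
        velocity = poses[:, 1:] - poses[:, :-1]  # frame-to-frame displacement
        _, h_pos = self.position_encoder(poses)
        _, h_vel = self.velocity_encoder(velocity)
        return torch.cat([h_pos[-1], h_vel[-1]], dim=-1)  # (B, 2*hidden)

class ActionSentimentModule(nn.Module):
    """ASM: compresses the gait sequence into a latent action code."""
    def __init__(self, n_joints=16, hidden=64, latent=32):
        super().__init__()
        self.action_encoder = nn.GRU(n_joints * 3, hidden, batch_first=True)
        self.to_latent = nn.Linear(hidden, latent)

    def forward(self, poses):
        _, h = self.action_encoder(poses)
        return self.to_latent(h[-1])             # (B, latent)

class GaitEmotionNet(nn.Module):
    """Emotion classifier over the concatenated MSM and ASM features."""
    def __init__(self, n_joints=16, n_classes=4):
        super().__init__()
        self.msm = MotionSentimentModule(n_joints)
        self.asm = ActionSentimentModule(n_joints)
        self.classifier = nn.Sequential(
            nn.Linear(64 * 2 + 32, 64), nn.ReLU(), nn.Linear(64, n_classes))

    def forward(self, poses):
        feats = torch.cat([self.msm(poses), self.asm(poses)], dim=-1)
        return self.classifier(feats)             # class logits

# Example: a batch of 8 gait clips, 48 frames each, 16 joints in 3D.
logits = GaitEmotionNet()(torch.randn(8, 48, 16 * 3))
print(logits.shape)  # torch.Size([8, 4])
```

The key structural point the sketch illustrates is the separation of concerns: velocity is derived explicitly from consecutive pose frames and encoded independently of position, so the classifier can exploit the speed cues that help separate sad from neutral gaits.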
