Abstract

In recent years, the emotion recognition task has attracted considerable attention in the field of human–computer interaction. Most existing research relies on aural-visual analysis, which has proven effective at capturing emotional features. However, aural-visual signals are difficult to observe in remote settings compared with other human representations such as gait. Recently, human gait can be recognized effectively even against complex backgrounds thanks to advances in Graph Convolutional Networks (GCNs). According to the anatomy of the human body, central torso joints, rather than the body's marginal limb joints, play the key role in GCN-based gait recognition systems. As a result, receptive field imbalance is a major issue. In this study, we propose a method for perceiving emotions from the human gait skeleton. We present a multi-head pseudo-node strategy to alleviate the receptive field imbalance and capture non-local dependencies among joints: a set of extra nodes is linked to all physical body joints and gathers global information from different feature spaces. Experiments on a public emotion-gait dataset demonstrate that our proposed method outperforms existing skeleton-based methods. To further verify its effectiveness, we evaluate the method on publicly available human action recognition datasets, where it again significantly improves performance over baseline methods.
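As a rough illustration only (the abstract does not give the exact formulation, so the function name and the choice of leaving pseudo nodes unconnected to each other are assumptions), the pseudo-node idea can be sketched as an augmentation of the skeleton's adjacency matrix, where each extra node links to every physical joint:

```python
import numpy as np

def add_pseudo_nodes(adj, num_pseudo):
    """Augment a joint adjacency matrix with globally connected pseudo nodes.

    adj: (N, N) adjacency matrix of the physical skeleton joints.
    Returns an (N + num_pseudo, N + num_pseudo) matrix in which each
    pseudo node is linked to every physical joint. Whether pseudo nodes
    also link to each other is an assumption; here they do not.
    """
    n = adj.shape[0]
    m = n + num_pseudo
    aug = np.zeros((m, m), dtype=adj.dtype)
    aug[:n, :n] = adj   # keep the original skeleton edges
    aug[:n, n:] = 1     # joint -> pseudo-node links
    aug[n:, :n] = 1     # pseudo-node -> joint links
    return aug

# Toy example: a 3-joint chain skeleton augmented with 2 pseudo nodes.
chain = np.array([[0, 1, 0],
                  [1, 0, 1],
                  [0, 1, 0]])
aug = add_pseudo_nodes(chain, 2)
```

In a multi-head variant, each head would carry its own set of pseudo nodes (or its own learned features on them), letting the extra nodes aggregate global information from different feature spaces, as the abstract describes.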
