Abstract

Perceiving human emotions is crucial in the realm of affective computing. As a nonverbal biological feature, gait plays a significant role in this field, owing to its resistance to manipulation or replication. In this paper, we propose a gait-based emotion perception framework called Dependency-Difference Gait (DDG), which extracts emotional features from gait patterns comprehensively and efficiently. We also introduce a spatial–temporal difference representation, which constructs static spatial difference information within frames and dynamic temporal difference information between frames. We abstract these details as difference information and fuse them with the dependency information extracted from the original sequence. Our approach not only overcomes the limitations of hand-crafted features, but also enables the extraction of a broader spectrum of emotional features. Additionally, we present the Emotional Information Attention (EIA) mechanism, which allows DDG to focus on key joints and frames according to the amount of emotional information they carry. Experimental and visualization results substantiate the effectiveness of DDG and EIA. In the qualitative analysis, we find that selecting a small number of joints carrying substantial emotional information is beneficial for emotion classification, whereas selecting only a few frames disrupts the temporal structure of the sequence and results in suboptimal performance.
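The paper does not spell out how the difference representation is computed, but the idea described above can be sketched as follows: given a gait sequence of joint coordinates, spatial differences are taken between connected joints within each frame (bone vectors, a static cue), and temporal differences are taken between consecutive frames (motion vectors, a dynamic cue). The array shapes, the skeleton edge list, and the zero-padding of the first temporal frame are all assumptions for illustration, not the authors' exact formulation.

```python
import numpy as np

def spatial_temporal_differences(seq, edges):
    """Sketch of a spatial-temporal difference representation.

    seq   : (T, J, C) array of T frames, J joints, C coordinates per joint.
    edges : list of (parent, child) joint index pairs (assumed skeleton).
    Returns (spatial, temporal) difference arrays.
    """
    parents = np.array([p for p, _ in edges])
    children = np.array([c for _, c in edges])

    # Static spatial differences within each frame: one bone vector per edge.
    spatial = seq[:, children, :] - seq[:, parents, :]          # (T, E, C)

    # Dynamic temporal differences between consecutive frames,
    # zero-padded at the first frame to keep T time steps (an assumption).
    temporal = np.concatenate(
        [np.zeros_like(seq[:1]), seq[1:] - seq[:-1]], axis=0)   # (T, J, C)

    return spatial, temporal

# Toy example: 4 frames, 3 joints in a chain, 2-D coordinates.
seq = np.arange(4 * 3 * 2, dtype=float).reshape(4, 3, 2)
edges = [(0, 1), (1, 2)]
spatial, temporal = spatial_temporal_differences(seq, edges)
```

In a full model, these two difference streams would be fused with features from the original sequence, as the abstract describes.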
