Abstract
Gait recognition is an attractive human identification technology. However, existing gait recognition methods mainly focus on regular gait cycles and ignore irregular situations. In real-world surveillance, human gait is almost always irregular, with arbitrary dynamic characteristics (e.g., duration, speed, and phase) and varied viewpoints. In this paper, we propose attentive spatial–temporal summary networks to learn salient spatial–temporal and view-independent features for irregular gait recognition. First, we design a gate mechanism with attentive spatial–temporal summary to extract discriminative sequence-level features that represent the periodic motion cues of irregular gait sequences. The designed general attention and residual attention components concentrate on discriminative identity-related semantic regions in the spatial feature maps. The proposed attentive temporal summary component automatically assigns adaptive attention to enhance discriminative gait timesteps and suppress redundant ones. Furthermore, to improve the accuracy of cross-view gait recognition, we combine a Siamese structure with the Null Foley–Sammon transform to obtain view-invariant gait features from irregular gait sequences. Finally, we quantitatively evaluate the impact of irregular gait and of the viewpoint interval between matching pairs on recognition accuracy. Experimental results show that our method achieves state-of-the-art performance in irregular gait recognition on the OULP and CASIA-B datasets.
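The attentive temporal summary described above can be illustrated with a minimal numerical sketch: per-frame features are scored, the scores are normalized with a softmax, and the sequence is collapsed into one weighted summary vector. The scoring vector `w` and function names below are illustrative assumptions, not the paper's actual parameterization, which involves learned gating networks.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(z - z.max())
    return e / e.sum()

def attentive_temporal_summary(frame_feats, w):
    """Collapse T per-frame features (T x D) into a single sequence-level
    feature via softmax-weighted averaging. `w` (shape (D,)) stands in for
    a learned relevance scorer; higher-scoring timesteps contribute more,
    mimicking the enhance/suppress behavior described in the abstract."""
    scores = frame_feats @ w        # (T,) relevance score per timestep
    alpha = softmax(scores)         # (T,) adaptive attention weights, sum to 1
    return alpha @ frame_feats      # (D,) attention-weighted summary

# Toy example: 4 timesteps with 3-dimensional features.
X = np.array([[1., 0., 0.],
              [0., 1., 0.],
              [0., 0., 1.],
              [1., 1., 1.]])
w = np.array([1., 0., 0.])          # hypothetical scorer favoring dimension 0
summary = attentive_temporal_summary(X, w)
```

Timesteps whose features align with the scorer receive larger weights, so the summary emphasizes them while still aggregating the whole sequence, which is the intuition behind suppressing redundant gait frames.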