Abstract
Gait recognition is a biometric identification method that can be performed at long range and without contact, giving it broad applications in criminal investigation and security inspection. Most existing gait recognition methods adopt the gait energy image (GEI) for feature extraction. However, the GEI discards the dynamic information of gait, so recognition performance degrades sharply under changes in viewing angle and in the subject's belongings and clothing. To address these problems, this paper proposes a cross-view gait recognition method that uses a dual-stream network based on the fusion of dynamic and static features (FDSN). First, static features are extracted from the GEI, and dynamic features are extracted from the image sequence of the subject's lower limbs. The two features are then fused, and a nearest-neighbor classifier is used for classification. Comparative experiments on the CASIA-B dataset, created by the Institute of Automation, Chinese Academy of Sciences, show that the FDSN achieves a higher recognition rate than a convolutional neural network (CNN) and GaitSet under changes in viewing angle or clothing. In addition, a gait image dataset was collected in a campus setting for this study; experimental results on this dataset confirm the effectiveness of the FDSN in eliminating the effects of such disruptive changes.
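The pipeline the abstract describes has two standard, well-defined endpoints: the GEI (the pixel-wise mean of an aligned binary silhouette sequence) and the final nearest-neighbor classification. The sketch below, in NumPy, illustrates those two steps; the concatenation used for fusion here is a hypothetical stand-in for the paper's learned dual-stream fusion, not the FDSN itself.

```python
import numpy as np

def gait_energy_image(silhouettes):
    """Compute a Gait Energy Image: the pixel-wise mean of a sequence
    of aligned binary silhouette frames (values in {0, 1})."""
    frames = np.asarray(silhouettes, dtype=np.float64)
    return frames.mean(axis=0)

def fuse_features(static_feat, dynamic_feat):
    # Fusion by simple concatenation -- a placeholder for the paper's
    # dual-stream fusion of static (GEI) and dynamic (lower-limb) features.
    return np.concatenate([np.ravel(static_feat), np.ravel(dynamic_feat)])

def nearest_neighbor(query, gallery_feats, gallery_labels):
    """Classify a query feature vector by Euclidean nearest neighbor
    against a gallery of labeled feature vectors."""
    dists = np.linalg.norm(gallery_feats - query, axis=1)
    return gallery_labels[int(np.argmin(dists))]

# Example: a 2-frame, 2x2-pixel silhouette sequence.
seq = [np.array([[1, 0], [0, 1]]), np.array([[1, 1], [0, 0]])]
gei = gait_energy_image(seq)          # -> [[1.0, 0.5], [0.0, 0.5]]

gallery = np.array([[0.0, 0.0], [10.0, 10.0]])
label = nearest_neighbor(np.array([1.0, 1.0]), gallery, ["subject_A", "subject_B"])
```

In a real system the gallery vectors would be the fused FDSN features of enrolled subjects, and the query would be the fused feature of a probe sequence.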
Journal of Advanced Computational Intelligence and Intelligent Informatics