Abstract
Gait recognition aims to identify persons by their walking styles. Compared with face and fingerprint recognition, it offers unique advantages because it is non-contact and works at long distances. Cross-view gait recognition is a challenging task because view variation can strongly distort gait silhouettes. The development of deep learning has raised cross-view gait recognition performance to a higher level; however, the performance of existing deep learning-based cross-view methods is limited by the lack of gait samples under different views. In this paper, we propose a Multi-view Gait Generative Adversarial Network (MvGGAN) that generates fake gait samples to extend existing gait datasets, providing adequate training data for deep learning-based cross-view gait recognition methods. The proposed MvGGAN trains a single generator for all view pairs involved in one or multiple datasets. Moreover, we perform domain alignment based on projected maximum mean discrepancy to reduce the distribution divergence introduced by sample generation. Experimental results on the CASIA-B and OUMVLP datasets demonstrate that the fake gait samples generated by MvGGAN clearly improve the performance of existing state-of-the-art cross-view gait recognition methods under both single-dataset and cross-dataset evaluation settings.
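The abstract does not spell out the projected maximum mean discrepancy used for domain alignment, so as a rough, hedged illustration, the sketch below computes the plain (unprojected) squared MMD statistic with an RBF kernel in NumPy; the feature arrays, dimensions, and bandwidth choice are illustrative assumptions, not the paper's actual setup.

```python
import numpy as np

def rbf_kernel(a, b, gamma):
    # Pairwise RBF kernel: k(x, y) = exp(-gamma * ||x - y||^2)
    sq = np.sum(a**2, 1)[:, None] + np.sum(b**2, 1)[None, :] - 2.0 * a @ b.T
    return np.exp(-gamma * sq)

def mmd2(x, y, gamma=1.0 / 16):
    # Biased estimator of squared MMD between sample sets x and y:
    # mean k(x, x') + mean k(y, y') - 2 * mean k(x, y)
    return (rbf_kernel(x, x, gamma).mean()
            + rbf_kernel(y, y, gamma).mean()
            - 2.0 * rbf_kernel(x, y, gamma).mean())

# Hypothetical stand-ins: "real" vs. "generated" gait feature vectors.
rng = np.random.default_rng(0)
real = rng.normal(0.0, 1.0, (200, 16))
fake = rng.normal(0.5, 1.0, (200, 16))  # shifted distribution

print(f"same-distribution MMD^2:  {mmd2(real[:100], real[100:]):.4f}")
print(f"cross-distribution MMD^2: {mmd2(real[:100], fake[:100]):.4f}")
```

Minimizing a statistic of this kind during training pulls the generated-sample feature distribution toward the real one; the "projected" variant in the paper additionally applies a learned projection before comparing distributions.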
Published in: IEEE Transactions on Image Processing (a publication of the IEEE Signal Processing Society)