Abstract

In the gait recognition task, cloth-changing data (e.g., the CL-class data in the CASIA-B dataset) account for a very small proportion of gait datasets, which causes a serious drop in recognition accuracy for gait recognition algorithms in cloth-changing scenarios. One way to address this problem is to use data generation algorithms to expand the cloth-changing data. When generating such minority-class data, existing GAN-based image data augmentation algorithms require a large amount of data to converge, and mode collapse can occur due to the conflict between the real/fake discrimination loss and the classification loss. To solve these problems, we propose an autoencoder-guided GAN for cloth-changing gait data generation. An autoencoder with an embedding layer extracts features from the entire dataset rather than only from the required CL data. The labeled latent vector of the autoencoder (representing the extracted features) is then taken as the input of the GAN, which removes the need for large amounts of training data to achieve GAN convergence. The GAN has the same topology as the autoencoder and uses the autoencoder's parameters as its initial weights. In addition, the real-class classification in the GAN is removed and only the fake classification is retained, which resolves the conflict between the real/fake discrimination loss and the classification loss. In cloth-changing data generation experiments, our method produces higher-quality cloth-changing data than other GAN-based augmentation algorithms while preserving the diversity of the generated data. When trained on the CASIA-B dataset expanded by our algorithm, several gait recognition algorithms show significant improvement, especially in the cloth-changing scenario.
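To make the abstract's pipeline concrete, the sketch below illustrates one plausible reading of the architecture in PyTorch: an autoencoder whose embedding layer attaches the class label to the latent vector, and a generator that shares the decoder's topology and is initialized from its weights. The image size, latent dimension, class count, and layer widths are illustrative assumptions, not values taken from the paper.

```python
import torch
import torch.nn as nn

# Hypothetical shapes: 64x64 single-channel gait silhouettes and 3 walking
# conditions (NM / BG / CL) as class labels; all sizes here are assumptions.
IMG_DIM, LATENT_DIM, NUM_CLASSES, EMBED_DIM = 64 * 64, 128, 3, 16

class LabeledAutoencoder(nn.Module):
    """Autoencoder with an embedding layer: the label embedding is
    concatenated to the latent vector before decoding, so the latent
    space is learned on the whole dataset, not just the CL subset."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(IMG_DIM, 512), nn.ReLU(),
            nn.Linear(512, LATENT_DIM),
        )
        self.label_embed = nn.Embedding(NUM_CLASSES, EMBED_DIM)
        self.decoder = nn.Sequential(
            nn.Linear(LATENT_DIM + EMBED_DIM, 512), nn.ReLU(),
            nn.Linear(512, IMG_DIM), nn.Sigmoid(),
        )

    def forward(self, x, y):
        z = self.encoder(x)
        z_labeled = torch.cat([z, self.label_embed(y)], dim=1)
        return self.decoder(z_labeled), z_labeled

class Generator(nn.Module):
    """Generator with the same topology as the decoder; its weights are
    copied from the trained decoder so GAN training starts from a model
    that already reflects the dataset's structure."""
    def __init__(self, autoencoder):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(LATENT_DIM + EMBED_DIM, 512), nn.ReLU(),
            nn.Linear(512, IMG_DIM), nn.Sigmoid(),
        )
        self.net.load_state_dict(autoencoder.decoder.state_dict())

    def forward(self, z_labeled):
        return self.net(z_labeled)

# Usage sketch: encode real samples, attach the CL label embedding, and
# generate cloth-changing samples from the labeled latent vectors.
ae = LabeledAutoencoder()
gen = Generator(ae)
x = torch.rand(8, IMG_DIM)                    # dummy batch of silhouettes
y_cl = torch.full((8,), 2, dtype=torch.long)  # assume index 2 = CL class
with torch.no_grad():
    _, z_labeled = ae(x, y_cl)
    fake_cl = gen(z_labeled)
print(fake_cl.shape)  # torch.Size([8, 4096])
```

The discriminator and its fake-only classification head, as well as the actual adversarial training loop, are described in the paper but omitted here.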
