Abstract
With the pervasive growth of IoT, Wi-Fi CSI-based human gait recognition faces the challenge of maintaining model robustness in new environments. Current cross-domain methods often rely on symmetric data, restricting their practical utility. To overcome this limitation, we propose MaP-SGAN, a model specifically designed to improve the robustness of gait recognition in cross-domain scenarios. MaP-SGAN integrates three key components: (1) a Consistency Maintenance Module, which uses a Siamese network to keep latent vectors consistent within the asymmetric data domain; (2) a Diversity Enhancement Module, which incorporates static environmental information to diversify the generator's outputs and better align generated samples with the target domain's characteristics; and (3) a Multi-anchor Point Metric in the discriminator, which constrains the similarity between generated data and target-domain features using multiple anchor points. This metric addresses the distributional differences between the source and target domains, enhancing the model's discriminative capability and generalization performance. In extensive experiments across scenarios involving different clothing and environments, MaP-SGAN achieves an average accuracy of 87.58% in cross-clothing recognition, a 4.03% improvement over existing state-of-the-art models, and an average accuracy of 53.8% in cross-environment recognition, an improvement of about 6.34% over existing state-of-the-art models.
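To make the Siamese consistency constraint and the multi-anchor point metric concrete, the sketch below shows one plausible PyTorch formulation. This is not the authors' implementation: the network sizes, the cosine-based consistency loss, the Euclidean anchor distances, and all names are assumptions made purely for illustration, since the abstract does not specify these details.

```python
# Hypothetical sketch of two MaP-SGAN ingredients as the abstract describes them:
# a shared-weight (Siamese) encoder with a latent-consistency loss, and a
# multi-anchor metric that pulls generated features toward several target-domain
# anchor points. All dimensions and loss choices are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SiameseEncoder(nn.Module):
    """Shared-weight encoder mapping CSI feature vectors to latent vectors."""

    def __init__(self, in_dim: int = 256, latent_dim: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, 128),
            nn.ReLU(),
            nn.Linear(128, latent_dim),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)


def consistency_loss(encoder: SiameseEncoder,
                     x_a: torch.Tensor,
                     x_b: torch.Tensor) -> torch.Tensor:
    """One reading of the Consistency Maintenance Module: penalize
    divergence between the latent vectors of paired inputs."""
    z_a, z_b = encoder(x_a), encoder(x_b)
    return 1.0 - F.cosine_similarity(z_a, z_b, dim=-1).mean()


def multi_anchor_loss(gen_feat: torch.Tensor,
                      anchors: torch.Tensor) -> torch.Tensor:
    """A hypothetical multi-anchor point metric: constrain generated
    features to stay close to several target-domain anchor features."""
    # gen_feat: (batch, latent_dim); anchors: (num_anchors, latent_dim)
    dists = torch.cdist(gen_feat, anchors)  # pairwise Euclidean distances
    return dists.mean()


if __name__ == "__main__":
    enc = SiameseEncoder()
    x_src = torch.randn(8, 256)   # toy source-domain CSI features
    x_tgt = torch.randn(8, 256)   # toy target-domain CSI features
    anchors = torch.randn(4, 64)  # anchor points drawn from the target domain
    loss = consistency_loss(enc, x_src, x_tgt) \
        + multi_anchor_loss(enc(x_tgt), anchors)
    print(loss.item())
```

Averaging distances to several anchors, rather than to a single target prototype, is one way such a metric could capture the spread of the target-domain distribution rather than only its center, which matches the abstract's claim that the metric "comprehensively addresses the distributional differences" between domains.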