Abstract

Gait recognition invariant to carried objects (COs) is challenging in real-life scenes because COs vary in shape and size and can be carried in unpredictable locations (e.g., front, back, side, or multiple locations simultaneously). In this paper, we therefore propose a gait recognition method that is robust against various COs by reconstructing a gait template without them. A straightforward approach is to directly generate a CO-free gait template from a CO-affected input using a conventional generative adversarial network. However, this risks unnecessarily altering parts that were originally unaffected by COs (e.g., the leg region of a person carrying a backpack). To leave such unaffected parts untouched, we instead use two independent generators: one estimates a CO-free gait template, and the other estimates an alpha matte that indicates the blending parameters. We then create an alpha-blended template from the original template and the generated CO-free template based on the estimated alpha matte. Finally, we feed the alpha-blended gait template into a state-of-the-art discrimination network for gait recognition. Experimental results on three publicly available gait databases with real-life COs demonstrate the state-of-the-art performance of the proposed method.
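The alpha-matte blending step described above can be sketched as standard per-pixel alpha compositing: the matte selects generated pixels where COs were removed and preserves original pixels elsewhere. The following is a minimal illustrative sketch, not the paper's implementation; array names and the toy matte are hypothetical.

```python
import numpy as np

def alpha_blend(original, generated, alpha):
    """Per-pixel alpha compositing of a CO-affected gait template with a
    generated CO-free template; all arrays share one shape, values in [0, 1]."""
    return alpha * generated + (1.0 - alpha) * original

# Toy example: 4x4 "templates" (real gait templates, e.g. GEIs, are larger).
original = np.full((4, 4), 0.8)   # template affected by a carried object
generated = np.full((4, 4), 0.2)  # generated CO-free template
alpha = np.zeros((4, 4))
alpha[:2, :] = 1.0                # matte: take generated pixels in the top half only

blended = alpha_blend(original, generated, alpha)
# Top half comes from `generated`; bottom half keeps `original` untouched.
```

Where the matte is zero, the original template passes through unchanged, which is exactly how the method avoids disturbing body parts unaffected by the carried object.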
