Abstract

Gait recognition identifies individuals by the way they walk. Because it requires no cooperation from the subject, it is well suited to surveillance at a distance. Computer-aided, silhouette-based gait analysis is widely employed for its efficiency and effectiveness. However, covariate conditions (e.g., carried bags or coats) strongly degrade recognition because they conceal features that are essential to identifying individuals from their walking style. To address this, we propose a novel deep-learning framework that tackles covariate conditions in gait by proposing the regions subject to them; features extracted from those regions are discarded so that the model's performance remains effective. The technique separates static and dynamic regions of interest, where the static regions contain the covariates, and learns features from the dynamic regions unaffected by covariates to recognize individuals effectively. Features are extracted using three customized kernels, and the results are concatenated to produce a fused feature map, from which a CNN learns to recognize an individual. The approach is an end-to-end system that eliminates the need for manual region proposal and feature extraction, which should improve gait-based identification in real-world scenarios. Experiments are performed on the publicly available CASIA A and CASIA C datasets. Subjects carrying bags are recognized with 90% accuracy and subjects wearing coats with 58% accuracy. Recognition across walking speeds also yields strong results, with 94% accuracy for fast and 96% for slow walking patterns, an improvement over previous deep-learning methods.

© 2017 Elsevier Inc. All rights reserved.
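The fusion step described above (convolving a silhouette with three customized kernels and concatenating the responses into a fused feature map) can be sketched as follows. This is a minimal illustration only: the paper does not publish its kernel values, so the three kernels below (two Sobel-style edge filters and a local-average filter) are assumptions standing in for the authors' customized kernels.

```python
import numpy as np

def conv2d(img, kernel):
    # Valid-mode 2D cross-correlation over a single-channel image.
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

# Three illustrative kernels (hypothetical -- the paper's actual
# customized kernels are not specified in the abstract):
kernels = [
    np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float),  # vertical edges
    np.array([[-1, -2, -1], [0, 0, 0], [1, 2, 1]], float),  # horizontal edges
    np.full((3, 3), 1.0 / 9.0),                             # local average
]

def fused_feature_map(silhouette):
    # Convolve with each kernel and stack the responses channel-wise,
    # producing the concatenated (fused) feature map fed to the CNN.
    return np.stack([conv2d(silhouette, k) for k in kernels], axis=-1)

silhouette = np.random.rand(64, 44)  # one silhouette frame (size assumed)
features = fused_feature_map(silhouette)
print(features.shape)  # three feature channels per spatial location
```

In the full system, this fused map would cover only the dynamic regions left after the covariate-affected static regions are discarded, and the CNN would learn identity features from it.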

