Abstract

Gait recognition is the identification of a person from his/her walking pattern. Each individual's walking pattern is unique and cannot easily be replicated by others. However, gait recognition becomes considerably harder when the individual is carrying an object. This article proposes a novel computer-vision-based method of gait recognition, both with and without carried objects (COs), using a Faster region-based convolutional neural network (Faster R-CNN) architecture. To the best of my knowledge, this is the first investigation in the literature to use Faster R-CNN for gait recognition. The Faster R-CNN detects and extracts only the pedestrian in every frame of the video, irrespective of whether the pedestrian is carrying an object. Deep convolutional layers are then used to generate a feature vector from the pedestrian's walking pattern across all frames. The feature vectors generated from the various walking patterns are then classified using two variants of the recurrent neural network (RNN), namely long short-term memory (LSTM) and bidirectional long short-term memory (BLSTM), to recognize the walking patterns. To the best of my knowledge, the present investigation is the first in the literature to use the BLSTM variant of the RNN classifier to recognize walking patterns. The performance of the proposed system has been tested on four widely used public datasets: the OU-ISIR Large Population Gait database with real-life carried objects (OU-LP-Bag), the OU-ISIR Gait database, Treadmill dataset B (OUTD-B), the OU-ISIR Large Population Gait database with Age (OULP-Age), and the CASIA Gait database B (CASIA-B). The experimental results demonstrate that the proposed gait recognition system outperforms existing state-of-the-art methods.
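The bidirectional classification step can be illustrated with a minimal sketch: a forward LSTM pass over the per-frame feature sequence, a second pass over the reversed sequence, and the two final hidden states concatenated into one gait descriptor. This is a toy NumPy forward pass with illustrative dimensions, not the paper's implementation; all weight shapes and sizes here are assumptions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_forward(xs, Wx, Wh, b):
    """Run a single-layer LSTM over a sequence xs of shape (T, D);
    return the final hidden state of shape (H,)."""
    H = Wh.shape[1]
    h = np.zeros(H)
    c = np.zeros(H)
    for x in xs:
        z = Wx @ x + Wh @ h + b   # all four gate pre-activations, shape (4H,)
        i = sigmoid(z[:H])        # input gate
        f = sigmoid(z[H:2*H])     # forget gate
        o = sigmoid(z[2*H:3*H])   # output gate
        g = np.tanh(z[3*H:])      # candidate cell state
        c = f * c + i * g         # update cell state
        h = o * np.tanh(c)        # emit hidden state
    return h

def blstm_features(xs, params_fwd, params_bwd):
    """Bidirectional pass: one LSTM reads the frames in order, another
    reads them reversed; their final states are concatenated."""
    h_f = lstm_forward(xs, *params_fwd)
    h_b = lstm_forward(xs[::-1], *params_bwd)
    return np.concatenate([h_f, h_b])

# Toy setup: 12 frames per gait sequence, 8-dim per-frame features,
# hidden size 16 (values chosen only for illustration).
rng = np.random.default_rng(0)
T, D, H = 12, 8, 16
make_params = lambda: (rng.standard_normal((4 * H, D)) * 0.1,
                       rng.standard_normal((4 * H, H)) * 0.1,
                       np.zeros(4 * H))
xs = rng.standard_normal((T, D))
feat = blstm_features(xs, make_params(), make_params())
print(feat.shape)  # (32,) -- 2H features, one H per direction
```

In a full system this 2H-dimensional descriptor would feed a softmax layer over subject identities; reading the sequence in both temporal directions lets the representation condition each frame on both earlier and later phases of the gait cycle, which is the motivation for preferring BLSTM over plain LSTM.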
