Abstract
We consider the problem of person identification from gait sequences under normal, bag-carrying, and different-clothing conditions. It has been demonstrated that the Gait Energy Image (GEI) attains a good gait recognition rate under normal conditions; however, GEI is not robust enough to handle the carrying-bag and different-clothing conditions. Instead of GEI, several appearance-based gait features have been proposed in the literature to reduce the effect of covariate factors by keeping the dynamic parts and removing the static parts of the gait features, under the assumption that carrying bags and different clothing mostly affect the static parts. It has, however, been shown in the literature that the static parts also contain valuable information, and mistakenly removing certain static parts, such as the head or the thighs, certainly decreases the recognition rate. Our main objective is to increase the gait recognition rate on gait sequences with clothing and carrying-bag covariates. Therefore, instead of removing static parts, the Joint Sparsity Model (JSM) is applied to identify the carrying-bag and different-clothing conditions from the GEI features. When a set of GEI feature vectors is submitted to the JSM, a common component and an innovation component for each GEI feature are obtained. The innovation component, which is unique to each feature, is used to identify the covariate conditions. The identified covariate conditions are removed from the GEI features, and a novel gait feature called GEIJSM is generated. The dimension of GEIJSM is reduced using a Random Projection (RP) approach, and sparse representation based on ℓ1-norm minimization is used for classification. It is demonstrated that the RP and ℓ1-norm-minimization-based sparse representation approach provides statistically significantly better results than existing individual identification approaches.
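The abstract describes a three-stage pipeline: JSM-based covariate identification and removal from GEI features, dimensionality reduction by random projection, and ℓ1-norm sparse-representation classification. The sketch below is a minimal illustration of that pipeline under stated assumptions, not the authors' implementation: the JSM decomposition is approximated by subtracting the element-wise mean of the GEI set, and the covariate threshold, projected dimension, and lasso penalty are illustrative values only.

```python
# Minimal sketch of the pipeline described above, NOT the authors' implementation.
# Assumptions: the JSM decomposition is approximated by subtracting the element-wise
# mean of the GEI set; the covariate threshold (0.3), projected dimension (128), and
# the lasso penalty `alpha` are illustrative values only.
import numpy as np
from sklearn.linear_model import Lasso


def jsm_like_decomposition(gei_set):
    """Split a stack of GEI vectors (n_samples x n_pixels) into a shared common
    component and per-sample innovation components."""
    common = gei_set.mean(axis=0)        # structure shared by the whole set
    innovations = gei_set - common       # sample-specific (covariate) structure
    return common, innovations


def remove_covariates(gei_set, innovations, threshold=0.3):
    """Zero out pixels whose innovation magnitude marks them as covariate-affected,
    yielding a GEIJSM-style feature (threshold is an assumed value)."""
    covariate_mask = np.abs(innovations) > threshold
    return np.where(covariate_mask, 0.0, gei_set)


def random_projection(features, target_dim, rng):
    """Reduce dimensionality with a dense Gaussian random projection."""
    proj = rng.standard_normal((features.shape[1], target_dim)) / np.sqrt(target_dim)
    return features @ proj


def src_classify(train_feats, train_labels, probe, alpha=0.01):
    """l1-minimisation sparse-representation classification: code the probe over the
    training dictionary, then pick the class with the smallest reconstruction residual."""
    lasso = Lasso(alpha=alpha, fit_intercept=False, max_iter=5000)
    lasso.fit(train_feats.T, probe)      # dictionary columns = training samples
    coef = lasso.coef_
    best_label, best_residual = None, np.inf
    for label in np.unique(train_labels):
        mask = train_labels == label
        residual = np.linalg.norm(probe - train_feats[mask].T @ coef[mask])
        if residual < best_residual:
            best_label, best_residual = label, residual
    return best_label


# Toy usage with random stand-ins for 64x44 GEI images of 4 subjects.
rng = np.random.default_rng(0)
gei = rng.random((20, 64 * 44))
labels = np.repeat(np.arange(4), 5)
_, innov = jsm_like_decomposition(gei)
gei_jsm = remove_covariates(gei, innov)
reduced = random_projection(gei_jsm, 128, rng)
predicted = src_classify(reduced[:-1], labels[:-1], reduced[-1])
```

The class-wise residual rule in `src_classify` follows the standard sparse-representation classification scheme; the specific solver and regularisation used in the paper may differ.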