Abstract

This paper proposes a new gait representation that encodes the dynamics of a gait period through a 2D array of 17-bin histograms. Every histogram models the co-occurrence of optical flow states at every pixel of the normalized template that bounds the silhouette of a target subject. Five flow states (up, down, left, right, null) are considered. The first histogram bin counts the number of frames over the gait period in which the optical flow for the corresponding pixel is null. Each of the remaining 16 bins represents a pair of non-null flow states and counts the number of frames in which the optical flow vector has changed from one state to the other during the gait period. Experimental results show that this representation is significantly more discriminant than previous proposals that only consider the magnitude and instantaneous direction of optical flow, especially as the walking direction gets closer to the viewing direction, which is where state-of-the-art gait recognition methods yield the lowest performance. The dimensionality of this gait representation is reduced through principal component analysis. Finally, gait recognition is performed through supervised classification by means of support vector machines. Experimental results on the public CMU MoBo and AVAMVG datasets show that the proposed approach is advantageous over state-of-the-art gait representation methods.
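The per-pixel histogram construction described above can be sketched as follows. This is a minimal illustration based only on the abstract, not the paper's reference implementation: the state labels, the indexing of the 16 transition bins as the 4×4 ordered pairs of non-null states, and the frame-to-frame transition rule are all assumptions.

```python
import numpy as np

# Five optical-flow states per the abstract; the 4 non-null ("moving") states
# define the 16 pairwise-transition bins. (Assumed encoding, not the paper's.)
STATES = ["null", "up", "down", "left", "right"]
MOVING = STATES[1:]

def pixel_histogram(state_seq):
    """17-bin histogram for one pixel over a gait period.

    Bin 0 counts frames whose flow state is null; bins 1..16 count
    frame-to-frame transitions between ordered pairs of the 4 moving states.
    """
    hist = np.zeros(17, dtype=int)
    hist[0] = sum(1 for s in state_seq if s == "null")
    # Map each ordered pair of moving states to one of bins 1..16.
    pair_index = {(a, b): 1 + i * 4 + j
                  for i, a in enumerate(MOVING)
                  for j, b in enumerate(MOVING)}
    for prev, curr in zip(state_seq, state_seq[1:]):
        if (prev, curr) in pair_index:
            hist[pair_index[(prev, curr)]] += 1
    return hist

# Toy sequence of flow states for one pixel over six frames:
seq = ["null", "up", "up", "left", "null", "down"]
h = pixel_histogram(seq)
# h[0] == 2 (two null frames); one (up, up) and one (up, left) transition.
```

Applying this per pixel of the normalized silhouette template yields the 2D array of histograms that constitutes the gait descriptor, which is then flattened before PCA.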
