Abstract

The classification of human body motion is an integral component of the automatic interpretation of video sequences. In the first part we present an effective approach that uses mixed discrete/continuous states to couple perception with classification. A spline contour is used to track the outline of the person. We show that for a quasi-periodic human body motion, an autoregressive process (ARP) is a suitable model for the contour dynamics. A collection of ARPs can then be used as a dynamical model for mixed-state Condensation filtering, switching automatically between different motion classes. Subsequently, this method is applied to automatically segment sequences containing different motions into subsequences that each contain only one type of motion. Tracking the contour of moving people is, however, difficult. This is why we propose to classify the type of motion directly from the spatio-temporal features of the image sequence. Representing the image data as a spatio-temporal or XYT cube and taking the ‘epipolar slices’ [Workshop on Computer Vision, Representation and Control, Shanty Creek, MI, October (1985) 168] of the cube reveals that different motions, such as running and walking, have characteristic patterns. A new method, which effectively compresses these motion patterns into a low-dimensional feature vector, is introduced. The convincing performance of this new feature extraction method is demonstrated for both the classification and automatic segmentation of video sequences for a diverse set of motions.
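
To make the mixed-state dynamical model concrete, the sketch below illustrates one prediction step of such a filter in Python: a discrete motion-class label is resampled from a Markov transition matrix, and the continuous spline shape-space vector is then propagated through that class's second-order autoregressive process. This is a minimal sketch under stated assumptions; the dimensions, ARP coefficients, and transition probabilities are illustrative placeholders rather than the values learned in the paper, and the measurement/reweighting step of Condensation is omitted.

```python
# Minimal sketch of one mixed-state prediction step (illustrative values only).
import numpy as np

rng = np.random.default_rng(0)

N_CLASSES = 2          # e.g. "walking" and "running" (assumed, for illustration)
DIM = 4                # dimension of the spline shape-space vector (assumed)

# One second-order ARP per motion class: x_t = A1 x_{t-1} + A2 x_{t-2} + B w_t.
# The coefficients below are placeholders chosen to give stable quasi-periodic
# dynamics, not parameters fitted to real tracking data.
ARPS = [
    dict(A1=np.eye(DIM) * 1.9, A2=np.eye(DIM) * -0.95, B=np.eye(DIM) * 0.05),
    dict(A1=np.eye(DIM) * 1.7, A2=np.eye(DIM) * -0.80, B=np.eye(DIM) * 0.10),
]

# Markov transition matrix between the discrete motion classes (assumed values).
TRANS = np.array([[0.95, 0.05],
                  [0.05, 0.95]])

def propagate(particle):
    """One mixed-state prediction step: resample the discrete class label,
    then push the continuous contour state through that class's ARP."""
    y, x_prev, x_prev2 = particle
    y_new = rng.choice(N_CLASSES, p=TRANS[y])            # discrete switching
    arp = ARPS[y_new]
    noise = arp["B"] @ rng.standard_normal(DIM)          # process noise
    x_new = arp["A1"] @ x_prev + arp["A2"] @ x_prev2 + noise
    return (y_new, x_new, x_prev)

# Example: propagate a single particle that starts in class 0.
particle = (0, np.zeros(DIM), np.zeros(DIM))
for _ in range(5):
    particle = propagate(particle)
print("class:", particle[0], "state:", np.round(particle[1], 3))
```

In a full Condensation filter this prediction step would be applied to every particle, after which each particle is weighted by how well its predicted contour matches the image; the posterior over the discrete label then provides the motion classification.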
