Abstract
This paper proposes a novel 3D action recognition technique that uses time-series information extracted from depth image sequences, for use in human daily-activity monitoring systems. To this end, each action is represented as a multi-dimensional time series, where each dimension describes the position variation of one skeleton joint over time. The time series are then mapped onto a vector space using the Dynamic Time Warping (DTW) distance. Furthermore, to exploit the correlation-distinctiveness relationship of the sequences in recognition, this vector space is remapped onto a discriminative space using the regularized Fisher method, where the final decisions about the actions are made. Unlike other available methods, the time warping used in the mapping strategy makes the feature space robust to temporal variations of the motion sequences. Moreover, our method eliminates the need for a complicated design procedure for extracting the static and dynamic information of a motion sequence. Furthermore, most existing methods treat all skeletal joints identically across different actions, whereas some joints are more discriminative for distinguishing a specific action. Thanks to the nature of the proposed features, we propose to use a separate set of discriminative joints, called a joint importance map, for each action class. Evaluation results on four well-known datasets (TST, UTKinect, UCFKinect, and NTU RGB+D) show performance competitive with state-of-the-art methods in human action recognition.
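As a rough illustration of the pipeline described above, the sketch below maps variable-length skeleton-joint sequences onto a fixed-length vector of DTW distances to a set of reference sequences and then classifies them with a shrinkage-regularized Fisher discriminant. This is a minimal sketch under assumptions, not the authors' implementation: the function names (dtw_distance, to_dtw_feature_space), the choice of reference exemplars, the joint count, and the random stand-in data are all hypothetical.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis


def dtw_distance(a, b):
    """Classic dynamic-programming DTW between two multi-dimensional
    sequences a (T1 x d) and b (T2 x d); frame cost is the Euclidean distance."""
    T1, T2 = len(a), len(b)
    D = np.full((T1 + 1, T2 + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, T1 + 1):
        for j in range(1, T2 + 1):
            cost = np.linalg.norm(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[T1, T2]


def to_dtw_feature_space(sequences, references):
    """Map each variable-length joint-trajectory sequence onto a fixed-length
    vector of DTW distances to a set of reference sequences."""
    return np.array([[dtw_distance(s, r) for r in references] for s in sequences])


# Toy usage: random arrays stand in for skeleton joint trajectories
# (frames x (20 joints * 3 coordinates)); labels are hypothetical action classes.
rng = np.random.default_rng(0)
train = [rng.normal(size=(rng.integers(30, 60), 20 * 3)) for _ in range(40)]
labels = rng.integers(0, 4, size=40)
refs = train[:10]  # hypothetical choice of reference exemplars

X_train = to_dtw_feature_space(train, refs)
# Shrinkage LDA plays the role of the regularized Fisher mapping here.
clf = LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto")
clf.fit(X_train, labels)

test = [rng.normal(size=(45, 20 * 3))]
print(clf.predict(to_dtw_feature_space(test, refs)))
```

Because every sequence is described only by its warped distances to the references, the resulting feature vectors have a fixed length regardless of the temporal extent or speed of each motion, which is the robustness to temporal variation claimed in the abstract.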