Abstract

Automatic human activity recognition is being widely studied for various applications. However, the majority of existing work is limited to the recognition of isolated activities, even though human activities are inherently continuous, with spatial and temporal transitions between segments. There is therefore scope to develop a robust, continuous Human Activity Recognition (HAR) system. In this paper, we present a novel Coarse-to-Fine framework for continuous HAR using Microsoft Kinect. Activity sequences are captured as 3D skeleton trajectories consisting of the 3D positions of 20 joints estimated from the depth data. The recorded sequences are first coarsely segmented into activities performed while sitting and activities performed while standing. Next, the activities present in the segmented sequences are recognized at a fine level. Classification in both stages is performed with a Bidirectional Long Short-Term Memory Neural Network (BLSTM-NN) classifier. A total of 1110 continuous activity sequences were recorded using combinations of 24 isolated human activities. Recognition rates of 68.9% and 64.45% were obtained with the BLSTM-NN classifier when tested with and without length modeling, respectively. We also report results for isolated activity recognition, and finally compare the performance with existing approaches.
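To make the two-stage pipeline concrete, the following is a minimal sketch (not the authors' implementation) of a bidirectional LSTM classifier over Kinect skeleton sequences, assuming each frame is a 60-dimensional vector (20 joints x 3D coordinates). The hidden size, the sitting/standing split of the 24 activities, and the sequence length are illustrative assumptions.

```python
import torch
import torch.nn as nn

class BLSTMClassifier(nn.Module):
    """Whole-sequence classifier over 3D skeleton trajectories."""
    def __init__(self, input_dim=60, hidden_dim=128, num_classes=24):
        super().__init__()
        # Bidirectional LSTM reads the joint trajectory forwards and backwards.
        self.blstm = nn.LSTM(input_dim, hidden_dim, batch_first=True,
                             bidirectional=True)
        # Concatenated forward/backward features -> class scores.
        self.fc = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, x):  # x: (batch, frames, 60)
        out, _ = self.blstm(x)
        # Use the final time step's features to label the whole sequence.
        return self.fc(out[:, -1, :])

# Coarse stage: sitting vs. standing (2 classes); fine stage: the
# activities within each coarse group (the 12/12 split is assumed).
coarse = BLSTMClassifier(num_classes=2)
fine_sitting = BLSTMClassifier(num_classes=12)
fine_standing = BLSTMClassifier(num_classes=12)

seq = torch.randn(1, 90, 60)               # one 90-frame skeleton sequence
group = coarse(seq).argmax(dim=1).item()   # 0 = sitting, 1 = standing
logits = (fine_sitting if group == 0 else fine_standing)(seq)
activity = logits.argmax(dim=1).item()     # fine-level activity label
```

In this sketch the coarse classifier routes each segmented sequence to one of two fine-level BLSTM-NNs, mirroring the coarse-to-fine structure described in the abstract; the paper's length-modeling variant is not shown here.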

