Abstract

Understanding video-based activities remains a challenge despite sustained efforts from the image processing and artificial intelligence communities. However, the rapid development of 3D computer vision has created new opportunities for human pose estimation and, in turn, for activity recognition. In this research, the authors propose an approach for understanding daily indoor activities using skeleton information collected from the Microsoft Kinect device. The approach comprises two main contributions: pose-based feature extraction under a spatio-temporal relation, and topic-model-based learning. For feature extraction, the distance between two articulated points and the angle between the horizontal axis and a joint vector are measured and normalized for each detected body. A codebook is then constructed with the K-means algorithm to encode the merged set of distances and angles. To model activities from sparse features, a hierarchical model built on the Pachinko Allocation Model is proposed to describe the flexible feature-poselet-activity relationship in the temporal dimension. Finally, the activities are classified using three state-of-the-art machine learning techniques: Support Vector Machine, K-Nearest Neighbor, and Random Forest. In experiments, the proposed approach is benchmarked against existing methods in terms of overall classification accuracy.
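The feature-extraction and codebook steps in the abstract can be sketched as below. This is a minimal illustration, not the paper's exact specification: the toy 2D skeleton, the choice of normalizing distances by the maximum joint-pair distance, and the lightweight K-means implementation are all assumptions made for the example (the Kinect in fact provides 3D joint positions).

```python
import numpy as np

def pose_features(joints):
    """Compute joint-pair distances and joint-vector angles relative to
    the horizontal axis for one detected body.

    joints: (J, 2) array of 2D joint positions (illustrative; real
    Kinect skeletons are 3D with ~20 joints).
    """
    J = len(joints)
    dists, angles = [], []
    for i in range(J):
        for j in range(i + 1, J):
            v = joints[j] - joints[i]
            dists.append(np.linalg.norm(v))        # distance between two joints
            angles.append(np.arctan2(v[1], v[0]))  # angle vs. horizontal axis
    dists = np.asarray(dists)
    # Assumed normalization: divide by the body scale (max pair distance)
    # to make the features roughly person-invariant.
    dists /= dists.max()
    return np.concatenate([dists, np.asarray(angles)])

def kmeans_codebook(features, k, iters=20, seed=0):
    """Minimal K-means to build a codebook over pooled pose features;
    returns the codeword centers and each pose's codeword index."""
    rng = np.random.default_rng(seed)
    centers = features[rng.choice(len(features), k, replace=False)]
    for _ in range(iters):
        # Assign every feature vector to its nearest center.
        labels = ((features[:, None] - centers[None]) ** 2).sum(-1).argmin(1)
        # Move each center to the mean of its assigned vectors.
        for c in range(k):
            if (labels == c).any():
                centers[c] = features[labels == c].mean(axis=0)
    return centers, labels

# Toy 4-joint skeleton (head, shoulder, hip, knee) -- purely illustrative.
skel = np.array([[0.0, 1.8], [0.0, 1.5], [0.0, 1.0], [0.1, 0.5]])
feat = pose_features(skel)  # 6 normalized distances + 6 angles
```

Each frame's pose is thus summarized as a fixed-length vector, and the codebook maps every pose to a discrete codeword; the sequence of codewords per clip is what the topic model then consumes.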
