Abstract

Objective: To automatically analyze and detect human activities in order to provide better support in areas such as healthcare and security. Method: We used the UTKinect-Action3D dataset, which contains the positions of 20 body joints captured by a Kinect sensor. We selected two sets of joints, J1 and J2, formulated rules for activity classification, and then applied an SVM classifier, a KNN classifier using Euclidean distance, and a KNN classifier using Minkowski distance. Findings: With joint set J1 we obtained 97.8% accuracy with the SVM classifier, 98.8% with the KNN classifier using Euclidean distance, and 98.9% with the KNN classifier using Minkowski distance; with joint set J2 we obtained 97.7%, 98.6%, and 98.7% accuracy with the same three classifiers, respectively. Application/Improvement: We classified four activities: hand waving, standing, sitting, and picking. More activities can be included in future work, and combining this activity recognition method with IoT can help reduce overheads.
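As a rough illustration of the classification step described above, the sketch below trains the three classifiers (SVM, KNN with Euclidean distance, KNN with Minkowski distance) on feature vectors built from a subset of joint coordinates. It is a minimal sketch, not the paper's implementation: the joint indices, the Minkowski order p = 3, the number of neighbors, and the synthetic data are assumptions standing in for features extracted from the UTKinect-Action3D skeleton files.

# Sketch of the classification step: SVM vs. KNN (Euclidean) vs. KNN (Minkowski).
# Assumptions (not from the paper): joint subset indices, k = 5, Minkowski p = 3,
# and random data standing in for UTKinect-Action3D skeleton features.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Hypothetical joint subset J1 (indices into the 20 Kinect skeleton joints).
J1 = [0, 3, 7, 11, 15, 19]

# Placeholder data: one 3-D position per selected joint, flattened per sample.
n_samples = 400
X = rng.normal(size=(n_samples, len(J1) * 3))   # feature vectors
y = rng.integers(0, 4, size=n_samples)          # 4 activities: wave / stand / sit / pick

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0
)

classifiers = {
    "SVM": SVC(kernel="rbf"),
    "KNN (Euclidean)": KNeighborsClassifier(n_neighbors=5, metric="euclidean"),
    "KNN (Minkowski, p=3)": KNeighborsClassifier(n_neighbors=5, metric="minkowski", p=3),
}

for name, clf in classifiers.items():
    clf.fit(X_train, y_train)
    acc = accuracy_score(y_test, clf.predict(X_test))
    print(f"{name}: accuracy = {acc:.3f}")

On real skeleton data, the same pipeline would replace the random arrays with per-frame (or per-sequence) joint-position features for J1 or J2 and report accuracy on a held-out split, which is how the figures quoted in the abstract would be compared.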
