Abstract

Human Activity Recognition (HAR) is a pattern recognition task that identifies human physical activities from data recorded by different sensor modalities. Application areas include human behavior analysis, ambient assisted living, surveillance-based security, gesture recognition, and context-aware computing. HAR remains challenging because sensor data is inherently noisy and activity signals vary from person to person. Recognizing different types of activity with a single classifier is often error-prone. To mitigate this problem, we introduce an adaptive human activity recognition model built on a two-stage learning process for activities recorded by a waist-mounted accelerometer and gyroscope. In the first stage, a Random Forest (RF) binary classifier separates activities into static and moving. In the second stage, a Support Vector Machine (SVM) identifies the individual static activities, while a 1D Convolutional Neural Network (CNN)-based deep learning model recognizes the individual moving activities. This division makes our approach more robust and adaptive: static activities exhibit less frequency variation in their features than dynamic activity waveforms offer for a CNN to learn, whereas the SVM performs well on static activities but poorly on moving, complex, and uncertain ones. Our method is robust across different motion intensities and effectively captures variation within the same activity. In our hybrid model, the CNN captures local dependencies of the activity signals while preserving scale invariance. We achieve 97.71% overall accuracy on the six activity classes of the widely used UCI-HAR benchmark dataset.
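The two-stage dispatch described above can be sketched as follows. This is a minimal illustration on synthetic feature vectors, not the authors' implementation: the dataset, feature shapes, and hyperparameters are placeholders, and an `MLPClassifier` stands in for the paper's 1D CNN so the sketch needs only scikit-learn.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier

# Synthetic stand-in data: 6 activity classes, where labels 0-2 are
# "static" and 3-5 are "moving" (mirroring the paper's binary split).
rng = np.random.default_rng(0)
X = rng.normal(size=(600, 16))
y = rng.integers(0, 6, size=600)
X += y[:, None] * 0.5  # shift class means so the toy data is separable

is_moving = (y >= 3).astype(int)

# Stage 1: RF binary gate deciding static vs. moving.
gate = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, is_moving)

# Stage 2: SVM for static activities; MLP as a CNN stand-in for moving ones.
svm_static = SVC().fit(X[y < 3], y[y < 3])
nn_moving = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500,
                          random_state=0).fit(X[y >= 3], y[y >= 3])

def predict(Xq):
    """Route each sample through the gate, then the matching specialist."""
    route = gate.predict(Xq)
    out = np.empty(len(Xq), dtype=int)
    static_mask = route == 0
    if static_mask.any():
        out[static_mask] = svm_static.predict(Xq[static_mask])
    if (~static_mask).any():
        out[~static_mask] = nn_moving.predict(Xq[~static_mask])
    return out

preds = predict(X)
```

In the real pipeline, stage 1 would consume handcrafted or raw windowed sensor features, and the moving branch would be a 1D CNN over the raw accelerometer/gyroscope windows rather than a shallow MLP over fixed features.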
