Abstract

Human activity recognition (HAR) based on wearable sensors has emerged as an active research topic in artificial intelligence and pattern recognition. HAR has a wide range of applications, including sports activity detection, smart homes, and health assistance, to name a few. Mobile-device sensors such as accelerometers, gyroscopes, and magnetometers generate time-series data suitable for HAR. Computer Vision (CV) methods were previously utilised for HAR, but they suffer from a number of drawbacks, including limited mobility, sensitivity to ambient conditions, occlusion, higher cost, and, most importantly, privacy concerns. Using sensor data instead of conventional computer vision techniques offers several advantages and avoids virtually all of these limitations. The use of Machine Learning (ML) and Deep Neural Networks (DNN) to recognise human activity from inertial sensor data is well established in the literature. In this paper, we introduce HARResNeXT, a novel convolutional neural network inspired by ResNeXT that classifies human activities from smartphone inertial sensor data. The presented model was evaluated on a dataset from the WISDM (Wireless Sensor Data Mining) Lab, achieving 97% precision, recall, and F1-score, with an average accuracy of 96.62%. Comparison with previous studies shows that the presented model outperforms the state of the art.
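The core ResNeXt idea the abstract alludes to, splitting channels into parallel groups, transforming each group independently, and adding a residual connection, can be illustrated for 1-D sensor streams with a minimal pure-Python sketch. This is an assumption-laden illustration, not the paper's HARResNeXT implementation; the function names (`grouped_conv1d`, `resnext_block`) and the identity residual are chosen for clarity only.

```python
# Illustrative sketch (not the paper's model): a ResNeXt-style block for
# multi-channel 1-D inertial data, using grouped convolution plus a
# residual connection. All names here are hypothetical.

def conv1d(signal, kernel):
    """Valid-mode 1-D convolution (cross-correlation) of one channel."""
    k = len(kernel)
    return [sum(signal[i + j] * kernel[j] for j in range(k))
            for i in range(len(signal) - k + 1)]

def grouped_conv1d(channels, kernels, cardinality):
    """Split channels into `cardinality` groups; convolve each group
    with its own kernel (the 'aggregated transformations' of ResNeXt)."""
    group_size = len(channels) // cardinality
    out = []
    for g in range(cardinality):
        for c in range(g * group_size, (g + 1) * group_size):
            out.append(conv1d(channels[c], kernels[g]))
    return out

def resnext_block(channels, kernels, cardinality):
    """Grouped convolution followed by a residual (identity) addition."""
    conv_out = grouped_conv1d(channels, kernels, cardinality)
    k = len(kernels[0])
    # Crop the input so its length matches the 'valid' convolution output.
    crop = (k - 1) // 2
    residual = [ch[crop:len(ch) - (k - 1 - crop)] for ch in channels]
    return [[a + b for a, b in zip(co, r)]
            for co, r in zip(conv_out, residual)]

# Example: two sensor channels, cardinality 2, averaging-style kernels.
channels = [[0, 1, 2, 3, 4], [0, 1, 2, 3, 4]]
kernels = [[1, 1, 1], [1, 1, 1]]
out = resnext_block(channels, kernels, cardinality=2)
```

In a real network each group would carry learned weights and a nonlinearity, and the residual path would use a projection when shapes change; the sketch only shows the grouping-plus-residual structure.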
