Abstract

Due to the breadth of its application domains, Human Activity Recognition (HAR) is a challenging area of human-computer interaction. HAR can be used in remote monitoring for senior healthcare and in critical situations in intelligent manufacturing, among other applications. HAR based on wearable inertial sensors has been studied considerably more than vision-based HAR for its recognition efficiency across various kinds of human actions. Sensor-based HAR is also generally applicable to both indoor and outdoor settings and raises fewer privacy concerns in deployment. In this research, we explore the recognition performance of multiple deep learning (DL) models in recognizing activities of everyday living. We developed a deep residual neural network, called the ResNeXt model, that employs aggregated multi-branch transformations to boost recognition performance. To evaluate its performance, three standard DL models (CNN, LSTM, and CNN-LSTM) are investigated and compared to the proposed model on a standard HAR benchmark, the Daily Living Activity dataset. This dataset gathers motion signals from multimodal sensors (accelerometer, gyroscope, and magnetometer) at three distinct body locations (wrist, hip, and ankle). The experimental findings reveal that the proposed model surpasses the benchmark DL models, achieving the highest accuracy and F1-scores. Furthermore, the findings show that the ResNeXt model is more robust than the other models while using fewer trainable parameters.
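To make the idea of an aggregated multi-branch transformation concrete, the sketch below shows one possible ResNeXt-style residual block for 1-D inertial sensor windows in PyTorch. This is not the authors' published code: the grouped convolution standing in for the parallel branches, the cardinality of 32, and all layer sizes are illustrative assumptions.

import torch
import torch.nn as nn


class ResNeXtBlock1D(nn.Module):
    """Bottleneck residual block with aggregated transformations."""

    def __init__(self, channels: int, cardinality: int = 32, bottleneck: int = 64):
        super().__init__()
        self.transform = nn.Sequential(
            # 1x1 conv reduces channels before the aggregated transformation
            nn.Conv1d(channels, bottleneck, kernel_size=1, bias=False),
            nn.BatchNorm1d(bottleneck),
            nn.ReLU(inplace=True),
            # grouped conv: `cardinality` parallel branches applied in one op
            nn.Conv1d(bottleneck, bottleneck, kernel_size=3, padding=1,
                      groups=cardinality, bias=False),
            nn.BatchNorm1d(bottleneck),
            nn.ReLU(inplace=True),
            # 1x1 conv restores the channel count for the residual sum
            nn.Conv1d(bottleneck, channels, kernel_size=1, bias=False),
            nn.BatchNorm1d(channels),
        )
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # identity shortcut: y = ReLU(x + F(x))
        return self.relu(x + self.transform(x))


# Hypothetical usage: a batch of 8 windows with 9 sensor channels
# (3-axis accelerometer, gyroscope, and magnetometer) and 128 time
# steps, projected to 64 channels by a stem convolution first.
x = torch.randn(8, 9, 128)
stem = nn.Conv1d(9, 64, kernel_size=7, padding=3)
block = ResNeXtBlock1D(channels=64, cardinality=32, bottleneck=64)
y = block(stem(x))
print(y.shape)  # torch.Size([8, 64, 128])

The grouped convolution is the standard trick for realizing ResNeXt's many branches efficiently: splitting 64 bottleneck channels into 32 groups is mathematically equivalent to 32 parallel low-dimensional transformations whose outputs are concatenated, which is how the aggregated transformation keeps the parameter count low.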
