Abstract

Human activity recognition (HAR) in real-world settings has gained significance with the growth of Internet of Things (IoT) devices such as smartphones and smartwatches. However, challenges such as fluctuating environmental conditions and complex behavioral patterns have limited the accuracy of existing approaches. This research introduces a methodology employing a modified deep residual network, called 1D-ResNeXt, for IoT-enabled HAR in uncontrolled environments. We developed a comprehensive network that combines feature fusion with a multi-kernel block design. The residual connections and the split–transform–merge strategy mitigate accuracy degradation and reduce the number of parameters. We evaluated the proposed model on three publicly available datasets, mHealth, MotionSense, and Wild-SHARD, using accuracy, cross-entropy loss, and F1 score. The results showed substantial improvements in recognition performance, reaching 99.97% on mHealth, 98.77% on MotionSense, and 97.59% on Wild-SHARD, surpassing contemporary methods. Notably, our model achieved these results with considerably fewer parameters (24,130–26,118) than competing models, several of which exceed 700,000 parameters. The 1D-ResNeXt model remained effective under varied environmental conditions, addressing a key obstacle in practical HAR applications. These findings indicate that our modified deep residual network offers a viable approach for improving the reliability and usability of IoT-based HAR systems in dynamic, uncontrolled settings while preserving the computational efficiency essential for IoT devices. The results have implications for multiple sectors, including healthcare monitoring, smart homes, and personalized assistive technology.
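To illustrate the split–transform–merge and residual ideas the abstract refers to, the following is a minimal sketch of a 1D ResNeXt-style block in PyTorch. It is not the authors' published architecture: the channel count, cardinality, and kernel size are illustrative assumptions, and the grouped 1D convolution is used as the standard way to realize split–transform–merge paths.

```python
import torch
import torch.nn as nn


class ResNeXtBlock1D(nn.Module):
    """Illustrative 1D ResNeXt-style block (assumed parameters, not the paper's exact design)."""

    def __init__(self, channels=32, cardinality=8, kernel_size=3):
        super().__init__()
        self.transform = nn.Sequential(
            nn.Conv1d(channels, channels, kernel_size=1, bias=False),
            nn.BatchNorm1d(channels),
            nn.ReLU(inplace=True),
            # Grouped convolution: the input is split across `cardinality` paths,
            # each path is transformed independently, and the outputs are merged.
            nn.Conv1d(channels, channels, kernel_size,
                      padding=kernel_size // 2, groups=cardinality, bias=False),
            nn.BatchNorm1d(channels),
            nn.ReLU(inplace=True),
            nn.Conv1d(channels, channels, kernel_size=1, bias=False),
            nn.BatchNorm1d(channels),
        )
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        # Residual (identity) connection helps counter accuracy degradation
        # as blocks are stacked deeper.
        return self.relu(x + self.transform(x))


# Example: a batch of 16 sensor windows, 32 feature channels, 128 time steps.
x = torch.randn(16, 32, 128)
print(ResNeXtBlock1D()(x).shape)  # torch.Size([16, 32, 128])
```

Grouped convolutions keep the parameter count low relative to a single wide convolution, which is consistent with the abstract's emphasis on computational efficiency for IoT devices.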
