Abstract

This research explores Human Activity Recognition (HAR) through sensor data analysis, offering a comprehensive evaluation across three diverse datasets: UniMiB-SHAR, Motion Sense, and WISDM Actitracker. The UniMiB-SHAR dataset encompasses both linear and non-linear, complex activities that involve the movement of more than one joint or muscle (for example, hitting obstacles, jogging, and falling face down). Such motion generates highly correlated sensor readings over time. Convolutional Neural Networks (CNNs) are effective at feature extraction and classification for HAR, but they may not fully capture the combined spatial and temporal characteristics of HAR data and rely heavily on labelled data. Graph Convolutional Networks (GCNs), with their capacity to model complex interactions through a graph structure, complement CNNs' capabilities in classifying non-linear activities in HAR datasets. By leveraging a knowledge-graph structure and the feature embeddings learned by the GCN model, this study proposes a novel ensemble CNN model for activity classification. The resulting HAR pipeline, termed Graph Engineered EnsemCNN HAR (GE-EnsemCNN-HAR), is evaluated on the HAR datasets. The proposed model demonstrated a noteworthy accuracy of 93.5% on the UniMiB-SHAR dataset, surpassing the shallow CNN model with GNN by 20.14%, and achieved notable accuracies of 96.18% and 98% on the Motion Sense and WISDM Actitracker datasets, respectively.
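The core idea of the pipeline, graph-derived node embeddings fused with CNN features before classification, can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the single propagation step, the toy adjacency matrix, and all shapes and names here are illustrative assumptions only.

```python
import numpy as np

def gcn_layer(A, X, W):
    """One graph-convolution propagation step: symmetrically
    normalized adjacency x features x weights, with ReLU."""
    A_hat = A + np.eye(A.shape[0])            # add self-loops
    d_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
    A_norm = d_inv_sqrt @ A_hat @ d_inv_sqrt  # symmetric normalization
    return np.maximum(A_norm @ X @ W, 0.0)    # ReLU activation

rng = np.random.default_rng(0)

# Toy graph: 4 sensor-window nodes on a chain, 3 raw features each
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = rng.standard_normal((4, 3))
W = rng.standard_normal((3, 8))               # embed into 8 dimensions

embeddings = gcn_layer(A, X, W)               # (4, 8) node embeddings
cnn_features = rng.standard_normal((4, 16))   # stand-in for CNN output
fused = np.concatenate([embeddings, cnn_features], axis=1)  # (4, 24)
print(fused.shape)
```

In a full pipeline, the fused representation would feed the downstream ensemble classifier; here the concatenation merely shows how graph-structured embeddings can augment convolutional features.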
