Abstract
Sensor-based human activity recognition (HAR) in mobile application scenarios is often confronted with variation in sensing modalities and a shortage of annotated samples. To address these two challenges, we devised a graph-inspired deep learning approach that uses data from body-mounted wearable sensors. As a step toward a complete HAR solution, the proposed method was further used to build a deep transfer learning model. Specifically, we present a multi-layer residual graph convolutional neural network (ResGCNN) for sensor-based HAR tasks, namely the HAR-ResGCNN approach. Experimental results on the PAMAP2 and mHealth data sets demonstrate that ResGCNN is effective at capturing the characteristics of actions, achieving results comparable to other sensor-based HAR models (with average accuracies of 98.18% and 99.07%, respectively). More importantly, parameter-based transfer learning experiments using the ResGCNN model show excellent transferability and small-sample learning ability, making it a promising solution for sensor-based HAR applications.
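The abstract does not specify the internals of the residual graph-convolutional block, so the following is only a minimal sketch of the general idea behind a residual GCN layer (symmetric-normalized neighborhood aggregation plus an identity skip connection), not the authors' implementation; the node/feature dimensions and the NumPy formulation are illustrative assumptions.

```python
import numpy as np

def normalize_adjacency(adj):
    # Symmetric normalization with self-loops: D^{-1/2} (A + I) D^{-1/2}
    a_hat = adj + np.eye(adj.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(a_hat.sum(axis=1))
    return a_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

def res_gcn_layer(h, a_norm, weight):
    # Graph convolution (aggregate neighbors, project, ReLU),
    # then add the input back as a residual skip connection.
    # Assumes input and output feature dimensions match.
    out = np.maximum(a_norm @ h @ weight, 0.0)
    return out + h

# Toy example: 3 sensor nodes with 4-dimensional features
rng = np.random.default_rng(0)
adj = np.array([[0, 1, 1],
                [1, 0, 0],
                [1, 0, 0]], dtype=float)  # hypothetical sensor graph
h = rng.standard_normal((3, 4))           # node features
w = rng.standard_normal((4, 4))           # trainable weight (here random)
out = res_gcn_layer(h, normalize_adjacency(adj), w)
print(out.shape)  # (3, 4)
```

Stacking several such layers yields a multi-layer residual GCN; the residual connections ease gradient flow, which is what makes deeper stacks (and parameter transfer to new tasks) practical.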