Abstract

Human Activity Recognition (HAR) has been widely used in applications such as smart homes, healthcare, security, and human-robot interaction. In this paper, a novel deep learning model based on different Convolutional Neural Networks (CNNs) and the attention mechanism is proposed to address the HAR problem, using sensory data from accelerometers and gyroscopes while providing high recognition accuracy. The architecture of the proposed model (named LGSTNet) is designed to analyse and extract Local and Global Spatial-Temporal features from sensory data for HAR. The idea of LGSTNet is to segment an activity window into several sub-windows and combine the attention mechanism with a 2D CNN (named the Attention-based 2D-CNN module) to precisely learn local spatial-temporal features from those sub-windows. Meanwhile, a 3D CNN (named the 3D-CNN module) is designed to learn global spatial-temporal features from the whole window. Both comparison and ablation experiments are conducted on two real-world datasets, WISDM and UCI-HAR. By comparing LGSTNet to eight other well-known existing models and to its own variants on both datasets, LGSTNet achieves the best performance (over 0.8064 and 0.9569 $F_w$-Measure on WISDM and UCI-HAR, respectively) and is also more robust.
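As a rough illustration of the two-branch design described above, the following is a minimal PyTorch sketch of the local/global idea: an activity window is split into sub-windows fed to a small 2D CNN whose per-sub-window embeddings are related by an attention layer (the local branch), while a 3D CNN processes the whole window reshaped as a volume (the global branch). The class name LGSTNetSketch, all layer widths and pooling sizes, the sub-window count, the choice of multi-head self-attention, and the concatenation-based fusion are illustrative assumptions, not the paper's actual configuration.

```python
import torch
import torch.nn as nn


class LGSTNetSketch(nn.Module):
    """Hypothetical sketch of the two-branch LGSTNet idea, not the authors' code.

    Local branch: sub-windows -> 2D CNN -> attention across sub-windows.
    Global branch: whole window as a 3D volume -> 3D CNN.
    """

    def __init__(self, n_channels=6, win_len=128, n_sub=4, n_classes=6):
        super().__init__()
        assert win_len % n_sub == 0
        self.n_sub, self.sub_len = n_sub, win_len // n_sub
        # Local spatial-temporal features: a small 2D CNN per sub-window.
        self.local_cnn = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d((4, 4)),
        )
        self.local_proj = nn.Linear(16 * 4 * 4, 64)
        # Attention mechanism relating the sequence of sub-window embeddings
        # (multi-head self-attention is an assumption here).
        self.attn = nn.MultiheadAttention(embed_dim=64, num_heads=4, batch_first=True)
        # Global spatial-temporal features: a 3D CNN over the whole window.
        self.global_cnn = nn.Sequential(
            nn.Conv3d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool3d((2, 2, 2)),
        )
        self.global_proj = nn.Linear(16 * 2 * 2 * 2, 64)
        self.classifier = nn.Linear(64 + 64, n_classes)

    def forward(self, x):
        # x: (batch, win_len, n_channels) accelerometer + gyroscope window.
        b = x.shape[0]
        # Local branch: split the window into contiguous sub-windows.
        sub = x.view(b * self.n_sub, 1, self.sub_len, -1)
        local = self.local_proj(self.local_cnn(sub).flatten(1))
        local = local.view(b, self.n_sub, -1)
        local, _ = self.attn(local, local, local)  # attend across sub-windows
        local = local.mean(dim=1)
        # Global branch: treat the whole window as one 3D volume.
        vol = x.view(b, 1, self.n_sub, self.sub_len, -1)
        glob = self.global_proj(self.global_cnn(vol).flatten(1))
        # Fuse local and global features (concatenation is an assumption).
        return self.classifier(torch.cat([local, glob], dim=1))


if __name__ == "__main__":
    model = LGSTNetSketch()
    logits = model(torch.randn(8, 128, 6))  # 8 windows, 128 steps, 6 channels
    print(logits.shape)  # torch.Size([8, 6])
```

The demo at the bottom shows the expected input layout: one batch dimension, one time dimension of win_len samples, and one channel dimension covering the accelerometer and gyroscope axes.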
