Abstract
Advances in deep learning, with its ability to extract high-level features automatically, have opened promising prospects for human activity recognition (HAR). However, traditional HAR methods still suffer from incomplete feature extraction, which can lead to incorrect recognition results. To address this problem, we propose CapsGaNet, a novel framework for spatiotemporal multi-feature extraction in HAR based on capsules and gated recurrent units (GRU) with attention mechanisms. The framework comprises a spatial feature extraction layer consisting of capsule blocks, a temporal feature extraction layer consisting of GRU with attention mechanisms, and an output layer. In addition, considering the practical need to recognize aggressive activities in specific scenarios such as smart prisons, we constructed a daily and aggressive activity dataset (DAAD). Moreover, exploiting the acceleration characteristics of aggressive activity, we propose a threshold-based approach for aggressive activity detection that meets the demands for high real-time performance and low computational complexity in prison scenarios. Experiments on the wireless sensor data mining (WISDM) dataset and the DAAD dataset verify that CapsGaNet effectively improves recognition accuracy, and the proposed threshold-based detection approach offers a more practical way to perform HAR with smart sensor devices in smart prison scenarios.
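To make the described architecture concrete, the following is a minimal PyTorch sketch of a CapsGaNet-style model: a primary-capsule block for spatial features, a GRU with additive attention for temporal features, and a linear output layer. The layer sizes, number of capsules, kernel widths, and the omission of dynamic routing are illustrative assumptions; the abstract does not specify the paper's exact configuration.

```python
# Hedged sketch of a CapsGaNet-style model (not the authors' exact code).
# All hyperparameters below are assumptions for illustration.
import torch
import torch.nn as nn
import torch.nn.functional as F


def squash(x, dim=-1, eps=1e-8):
    """Capsule squashing non-linearity: preserves vector orientation,
    maps vector length into [0, 1)."""
    sq_norm = (x ** 2).sum(dim=dim, keepdim=True)
    scale = sq_norm / (1.0 + sq_norm)
    return scale * x / torch.sqrt(sq_norm + eps)


class CapsGaNetSketch(nn.Module):
    def __init__(self, in_channels=3, num_classes=6, caps_dim=8, gru_hidden=64):
        super().__init__()
        # Spatial feature extraction: conv front-end plus a primary-capsule
        # block applied over each window of raw sensor channels.
        self.conv = nn.Conv1d(in_channels, 32, kernel_size=5, padding=2)
        self.caps_conv = nn.Conv1d(32, 4 * caps_dim, kernel_size=5, padding=2)
        self.caps_dim = caps_dim
        # Temporal feature extraction: GRU over the capsule sequence.
        self.gru = nn.GRU(4 * caps_dim, gru_hidden, batch_first=True)
        # Additive attention over GRU hidden states.
        self.attn = nn.Linear(gru_hidden, 1)
        self.out = nn.Linear(gru_hidden, num_classes)

    def forward(self, x):
        # x: (batch, channels, time), e.g. tri-axial accelerometer windows
        h = F.relu(self.conv(x))
        caps = self.caps_conv(h)                    # (batch, 4*caps_dim, time)
        b, c, t = caps.shape
        caps = caps.view(b, c // self.caps_dim, self.caps_dim, t)
        caps = squash(caps, dim=2)                  # squash each capsule vector
        seq = caps.view(b, c, t).permute(0, 2, 1)   # (batch, time, features)
        states, _ = self.gru(seq)                   # (batch, time, hidden)
        weights = torch.softmax(self.attn(states), dim=1)
        context = (weights * states).sum(dim=1)     # attention-pooled summary
        return self.out(context)


if __name__ == "__main__":
    model = CapsGaNetSketch()
    window = torch.randn(8, 3, 128)  # 8 windows, 3 axes, 128 samples each
    print(model(window).shape)       # torch.Size([8, 6])
```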
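The threshold-based detector can likewise be sketched simply. Assuming detection triggers when the peak acceleration magnitude within a sliding window exceeds a fixed threshold, a NumPy implementation follows; the window length, step, and the 2.5 g threshold are illustrative assumptions, not values taken from the paper. Its per-window cost is a single norm and max over the samples, which matches the stated goals of real-time operation and low computational complexity.

```python
# Hedged sketch of a threshold-based aggressive-activity detector.
# Threshold and window parameters are assumptions for illustration.
import numpy as np


def detect_aggressive(acc_xyz, threshold_g=2.5, window=64, step=32):
    """Flag windows whose peak acceleration magnitude exceeds the threshold.

    acc_xyz: array of shape (n_samples, 3), tri-axial acceleration in g.
    Returns a list of (start_index, peak_magnitude) for flagged windows.
    """
    magnitude = np.linalg.norm(acc_xyz, axis=1)  # per-sample magnitude
    alerts = []
    for start in range(0, len(magnitude) - window + 1, step):
        peak = magnitude[start:start + window].max()
        if peak > threshold_g:
            alerts.append((start, float(peak)))
    return alerts


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    calm = rng.normal(0.0, 0.3, size=(256, 3)) + [0.0, 0.0, 1.0]  # ~1 g at rest
    burst = calm.copy()
    burst[120:130] *= 6.0  # simulated high-acceleration burst
    print(detect_aggressive(burst))
```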