Abstract
Efficiently identifying activities of daily living (ADL) provides important contextual information that can improve the effectiveness of various sports tracking and healthcare applications. Recently, attention mechanisms that selectively focus on parts of a time series signal have been widely adopted in sensor-based human activity recognition (HAR), as they can enhance the target activity of interest while suppressing irrelevant background activity. Several attention mechanisms have been investigated and achieve remarkable performance in HAR scenarios. Despite their success, these prior attention methods ignore the cross-interaction between different dimensions. In this paper, to address this shortcoming, we present a triplet cross-dimension attention for the sensor-based activity recognition task, in which three attention branches are built to capture the cross-interactions between the sensor, temporal, and channel dimensions. The effectiveness of the triplet attention method is validated through extensive experiments on four public HAR datasets, namely UCI-HAR, PAMAP2, WISDM and UNIMIB-SHAR, as well as a weakly labeled HAR dataset. The experiments show consistent improvements in classification performance with various backbone models such as a plain CNN and ResNet, demonstrating the strong generalization ability of triplet attention. Visualization analysis is provided to support our conclusions, and a practical implementation is evaluated on a Raspberry Pi platform.
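To make the three-branch idea concrete, the following is a minimal PyTorch sketch of a triplet cross-dimension attention module for HAR feature maps. It assumes the backbone produces tensors of shape (batch, channel, sensor, time); each branch rotates the tensor so a different dimension is pooled away, then gates the remaining two-dimensional plane. The kernel size, pooling scheme, and branch averaging here are illustrative assumptions following the generic triplet-attention pattern, not the authors' exact implementation.

```python
# Sketch of triplet cross-dimension attention for sensor-based HAR.
# Assumed input layout: (batch, channel, sensor, time); all layer sizes
# below are illustrative, not taken from the paper.
import torch
import torch.nn as nn


class ZPool(nn.Module):
    """Concatenate max- and average-pooled features along dim 1."""
    def forward(self, x):
        return torch.cat(
            (x.max(dim=1, keepdim=True)[0], x.mean(dim=1, keepdim=True)), dim=1
        )


class AttentionGate(nn.Module):
    """Z-pool -> conv -> sigmoid, yielding a single-channel attention map
    over the two remaining dimensions."""
    def __init__(self, kernel_size=7):
        super().__init__()
        self.pool = ZPool()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x):
        return x * torch.sigmoid(self.conv(self.pool(x)))


class TripletAttention(nn.Module):
    """Three branches capture channel-time, channel-sensor, and sensor-time
    cross-interactions by rotating the tensor so that a different dimension
    plays the pooled role in each branch."""
    def __init__(self, kernel_size=7):
        super().__init__()
        self.branch_ct = AttentionGate(kernel_size)  # channel <-> time
        self.branch_cs = AttentionGate(kernel_size)  # channel <-> sensor
        self.branch_st = AttentionGate(kernel_size)  # sensor  <-> time

    def forward(self, x):  # x: (B, C, S, T)
        # Rotate so the sensor axis is pooled: channel-time interaction.
        x_ct = self.branch_ct(x.permute(0, 2, 1, 3)).permute(0, 2, 1, 3)
        # Rotate so the time axis is pooled: channel-sensor interaction.
        x_cs = self.branch_cs(x.permute(0, 3, 2, 1)).permute(0, 3, 2, 1)
        # No rotation: channel axis is pooled, giving sensor-time attention.
        x_st = self.branch_st(x)
        # Average the three gated branches.
        return (x_ct + x_cs + x_st) / 3.0


if __name__ == "__main__":
    # e.g. a hypothetical window with 64 feature channels over 9 sensor
    # axes and 128 time steps
    feats = torch.randn(8, 64, 9, 128)
    print(TripletAttention()(feats).shape)  # torch.Size([8, 64, 9, 128])
```

Because the module preserves the input shape, it can be dropped between convolutional stages of a plain CNN or ResNet backbone without other architectural changes, which is consistent with the generality claim above.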