Abstract
Channel state information (CSI)-based human activity recognition (HAR) has important application prospects in areas such as smart homes, medical monitoring, and public security. Because the collected CSI data contain not only activity information but also activity-unrelated environmental information, the characteristics of the same activity conducted at different locations differ. Existing methods that collect samples at a fixed location to train the HAR model can hardly work well at other locations, which limits the application prospects of CSI-based HAR. To deal with this challenge, we propose an Attention-based feature Fusion ACTivity recognition system (AF-ACT). The proposed system extracts semantic activity features and temporal features from different dimensions to better characterize activities at different locations. The semantic activity features are extracted by a convolutional neural network (CNN) combined with the convolutional block attention module (CBAM), and the temporal features are extracted by a bidirectional gated recurrent unit (BGRU) combined with a self-attention mechanism. The semantic activity features and temporal features are fused through an attention-based feature fusion (A-Fusion) module to obtain complementary information, which improves recognition accuracy. The proposed system is evaluated in an open environment with 12 training locations and ten arbitrary testing locations. The experimental results show that the system reaches a highest accuracy of 91.23% across different experimental conditions when recognizing eight categories of activities.
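The abstract does not give the exact form of the A-Fusion module; a minimal sketch of attention-based feature fusion, assuming scalar softmax weights over the two feature branches (the scoring vector `w` here stands in for a learned parameter and is purely illustrative):

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score array.
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_fuse(semantic, temporal, w):
    """Fuse two feature vectors with softmax attention weights.

    `w` is a hypothetical learned scoring vector; in a trained
    system it would be optimized jointly with both branches.
    """
    scores = np.array([w @ semantic, w @ temporal])
    alpha = softmax(scores)                      # weights sum to 1
    fused = alpha[0] * semantic + alpha[1] * temporal
    return fused, alpha

rng = np.random.default_rng(0)
d = 8
semantic = rng.standard_normal(d)   # stand-in for CNN+CBAM output
temporal = rng.standard_normal(d)   # stand-in for BGRU+self-attention output
w = rng.standard_normal(d)

fused, alpha = attention_fuse(semantic, temporal, w)
print(fused.shape, alpha)
```

The softmax weighting lets the fused representation lean on whichever branch scores higher for a given sample, which is one common way complementary features are combined.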
Published in: IEEE Transactions on Instrumentation and Measurement