Abstract

Human activity recognition (HAR) based on channel state information (CSI) plays an increasingly important role in human–computer interaction research. CSI-based HAR models built on traditional machine learning and deep learning methods face two challenges. First, training such models requires a large amount of CSI activity data, which is time consuming to collect. Second, when the indoor environment or scene changes, recognition accuracy drops significantly, so data must be recollected to retrain the model. Existing few-shot learning methods address these problems to some extent, but recognition accuracy still degrades markedly when there are more new activity classes or fewer shots per class. In this article, considering the relationships among activity data, a graph-based few-shot learning method with a dual attention mechanism (CSI-GDAM) is proposed for CSI-based HAR. The model uses a feature extraction layer incorporating the convolutional block attention module (CBAM) to extract activity-related information from CSI data. The difference and inner product of the feature vectors of CSI activity samples are used to realize a graph convolutional network with a graph attention mechanism. Experiments show that, for the task of recognizing new activities in a new environment, the recognition accuracy reaches 99.74% and 98.42% in the 5-way 5-shot and 5-way 1-shot settings, respectively. The proposed method is also compared with other few-shot learning and transfer learning methods.
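The abstract states that edge weights in the graph attention mechanism are derived from the difference and inner product of sample feature vectors. A minimal sketch of one such attention-weighted aggregation step is given below; this is an illustrative reconstruction, not the authors' exact formulation, and the weighting scheme (`w_diff`, `w_dot`) and feature dimensions are assumptions made for the example.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax for normalizing attention scores.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def graph_attention_step(feats, w_diff, w_dot):
    """One attention-weighted graph aggregation step (hypothetical sketch).

    feats:  (n, d) array, one feature vector per CSI activity sample.
    w_diff: (d,) assumed weights scoring the element-wise |difference| edge feature.
    w_dot:  assumed scalar weight on the inner-product edge feature.
    """
    # Pairwise edge features: absolute difference (n, n, d) and inner product (n, n).
    diff = np.abs(feats[:, None, :] - feats[None, :, :])
    dot = feats @ feats.T
    # Combine both edge features into a single score per sample pair.
    scores = diff @ w_diff + w_dot * dot
    # Row-normalized attention weights, then aggregate neighbour features.
    attn = softmax(scores, axis=1)
    return attn @ feats

rng = np.random.default_rng(0)
feats = rng.normal(size=(5, 8))  # e.g. 5 samples in a 5-way episode, 8-dim features
out = graph_attention_step(feats, rng.normal(size=8), 0.5)
print(out.shape)  # (5, 8)
```

In a full model, several such layers would be stacked, with the aggregated features passed through learned transformations before classification.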
