Abstract With the rapid progress of WiFi technology, WiFi-based wireless sensing has opened new avenues for contactless human-computer interaction. However, WiFi gesture recognition still faces low-quality datasets, insufficient model robustness, poor transferability across application scenarios, high training costs, and weak generalization ability. To address these problems, this study proposes Wi-TCG, a method that combines transfer learning with a conditional generative adversarial network (CGAN) to optimize WiFi gesture recognition. Wi-TCG uses commercial WiFi devices to collect channel state information (CSI) of gesture actions, extracts Doppler-shift images from the CSI as input to the CGAN, and generates virtual data with similar characteristics to expand the training sample set. The recognition network is then fine-tuned with transfer learning so that multiple gesture categories can be recognized accurately across scenarios. In tests on two unseen natural scenes and six unseen gesture categories, Wi-TCG achieved a recognition accuracy of 93.1%, providing strong support for applying WiFi-based wireless sensing to contactless human-computer interaction.
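The augmentation step described above can be sketched as a conditional GAN whose generator produces Doppler-shift images conditioned on a gesture class label. The following is a minimal illustrative sketch, not the authors' implementation: the network sizes, the single-channel 32x32 image shape, the six-class label space, and all names (`Generator`, `Discriminator`, `LATENT_DIM`) are assumptions for demonstration.

```python
# Hypothetical sketch of CGAN-based data augmentation for Doppler-shift
# images, loosely following the Wi-TCG pipeline summarized in the abstract.
# All sizes and class counts are illustrative assumptions.
import torch
import torch.nn as nn

NUM_CLASSES = 6           # assumed number of gesture categories
LATENT_DIM = 100          # noise vector dimension
IMG_SHAPE = (1, 32, 32)   # assumed single-channel Doppler-shift image
IMG_PIXELS = IMG_SHAPE[0] * IMG_SHAPE[1] * IMG_SHAPE[2]

class Generator(nn.Module):
    """Maps (noise, gesture label) to a synthetic Doppler-shift image."""
    def __init__(self):
        super().__init__()
        self.label_emb = nn.Embedding(NUM_CLASSES, NUM_CLASSES)
        self.net = nn.Sequential(
            nn.Linear(LATENT_DIM + NUM_CLASSES, 256),
            nn.LeakyReLU(0.2),
            nn.Linear(256, 512),
            nn.LeakyReLU(0.2),
            nn.Linear(512, IMG_PIXELS),
            nn.Tanh(),  # outputs in [-1, 1], matching normalized images
        )

    def forward(self, z, labels):
        # Condition the generator by concatenating the label embedding.
        x = torch.cat([z, self.label_emb(labels)], dim=1)
        return self.net(x).view(-1, *IMG_SHAPE)

class Discriminator(nn.Module):
    """Scores (image, gesture label) pairs as real or generated."""
    def __init__(self):
        super().__init__()
        self.label_emb = nn.Embedding(NUM_CLASSES, NUM_CLASSES)
        self.net = nn.Sequential(
            nn.Linear(IMG_PIXELS + NUM_CLASSES, 512),
            nn.LeakyReLU(0.2),
            nn.Linear(512, 1),
            nn.Sigmoid(),
        )

    def forward(self, img, labels):
        x = torch.cat([img.view(img.size(0), -1),
                       self.label_emb(labels)], dim=1)
        return self.net(x)

# Generate a batch of virtual samples for one gesture class to expand
# the training set.
gen = Generator()
z = torch.randn(8, LATENT_DIM)
labels = torch.full((8,), 3, dtype=torch.long)  # class index 3
fake_images = gen(z, labels)
print(fake_images.shape)  # torch.Size([8, 1, 32, 32])
```

After adversarial training, generated images for each gesture class would be mixed into the real training set, and the recognition network would then be fine-tuned on the expanded data via transfer learning.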