Abstract

Accurate sensing and understanding of gestures can improve the quality of human–computer interaction, with both theoretical significance and practical potential in smart homes, assisted medical care, and virtual reality. Device-free gesture recognition based on WiFi channel state information (CSI) requires no wearable sensors and offers several advantages: it works in non-line-of-sight scenarios and in darkness, is low in cost, and preserves personal privacy. Although most existing WiFi CSI-based gesture recognition approaches achieve good performance, they adapt poorly to new domains. This paper therefore proposes ML-WiGR, a novel approach for device-free gesture recognition in cross-domain applications. ML-WiGR uses a convolutional neural network (CNN) and a long short-term memory (LSTM) network as the basic recognition model to extract spatial and temporal features, respectively. Combined with a meta-learning training mechanism, ML-WiGR dynamically adjusts the learning rate and meta-learning rate during training and optimizes the initial parameters of the basic model, so that only a few samples and a few iterations are needed to adapt to a new domain. Experiments under a variety of scenarios show that ML-WiGR achieves performance comparable to existing approaches in cross-domain settings while training on only a small number of samples.
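
As a concrete illustration of the pipeline the abstract describes, the sketch below pairs a CNN+LSTM base model with a MAML-style inner/outer update that adapts the initial parameters from a few support samples. It is a minimal PyTorch sketch under assumed details: the input shape (30 CSI subcarriers), the number of gesture classes, the layer sizes, and the maml_step helper are illustrative rather than the authors' implementation, and the dynamic adjustment of the learning rate and meta-learning rate mentioned in the abstract is reduced here to fixed scalars.

# Minimal sketch under the assumptions stated above; not the authors' implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.func import functional_call


class CnnLstmGestureNet(nn.Module):
    """CNN extracts per-frame spatial features from CSI; the LSTM models temporal dynamics."""

    def __init__(self, n_subcarriers=30, n_classes=6, hidden=128):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(n_subcarriers, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.Conv1d(64, 64, kernel_size=5, padding=2),
            nn.ReLU(),
        )
        self.lstm = nn.LSTM(64, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x):                        # x: (batch, time, subcarriers)
        x = self.conv(x.transpose(1, 2))         # -> (batch, 64, time)
        out, _ = self.lstm(x.transpose(1, 2))    # -> (batch, time, hidden)
        return self.head(out[:, -1])             # classify from the last time step


def maml_step(model, support, query, inner_lr, meta_opt, inner_steps=3):
    """One meta-update: adapt a functional copy of the initial parameters on a few
    support samples, then update the initialization from the query-set loss."""
    params = dict(model.named_parameters())
    xs, ys = support
    for _ in range(inner_steps):
        loss = F.cross_entropy(functional_call(model, params, (xs,)), ys)
        grads = torch.autograd.grad(loss, list(params.values()), create_graph=True)
        params = {k: p - inner_lr * g for (k, p), g in zip(params.items(), grads)}
    xq, yq = query
    meta_loss = F.cross_entropy(functional_call(model, params, (xq,)), yq)
    meta_opt.zero_grad()
    meta_loss.backward()   # gradients flow back to the initial (meta-learned) parameters
    meta_opt.step()
    return meta_loss.item()

In use, meta_opt would be an ordinary optimizer over model.parameters() (the meta-learned initialization), for example torch.optim.Adam(model.parameters(), lr=meta_lr), and maml_step would be called once per sampled task, i.e. per domain.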
