Abstract

Hand gestures are a well-known and straightforward means of human-computer interaction. Most prior work has focused on hand gesture recognition itself, while comparatively little has been done to build complete gesture recognition applications. As models gain feature extraction capacity and grow in parameter count, keeping a small memory footprint on ARM-based mobile devices or x86-based CPU devices becomes increasingly difficult: existing methods are heavy, demanding more memory and longer inference times. Running memory-efficient CNNs without compromising accuracy remains a challenge, especially when inference must be performed in real time on an edge computing device. We therefore propose a lightweight network for hand gesture recognition (LHGR-Net) and deploy it on a Raspberry Pi. LHGR-Net consists of three main parts: the base network structure, the multiscale structure (MSS), and the lightweight attention structure (LAS). The network is initialized with pre-trained weights learned from other data. The model deployed on the Raspberry Pi can be used to control home appliances. Extensive experiments show that our method achieves performance close to the state of the art in both hand gesture recognition accuracy and running time.
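
For illustration only, the three-part composition described above (base network, MSS, LAS) could be sketched in PyTorch roughly as follows. The layer choices, channel sizes, and class count are assumptions made for this sketch; the abstract does not specify the actual LHGR-Net layers.

    import torch
    import torch.nn as nn

    class LHGRNetSketch(nn.Module):
        """Minimal sketch of a base network + MSS + LAS layout (assumed, not the paper's exact design)."""
        def __init__(self, num_classes=10, channels=32):
            super().__init__()
            # Base network structure: a small convolutional stem (assumed).
            self.base = nn.Sequential(
                nn.Conv2d(3, channels, 3, stride=2, padding=1),
                nn.BatchNorm2d(channels),
                nn.ReLU(inplace=True),
            )
            # Multiscale structure (MSS): parallel 3x3 and 5x5 branches fused by a 1x1 conv (assumed).
            self.branch3 = nn.Conv2d(channels, channels, 3, padding=1)
            self.branch5 = nn.Conv2d(channels, channels, 5, padding=2)
            self.fuse = nn.Conv2d(2 * channels, channels, 1)
            # Lightweight attention structure (LAS): squeeze-and-excitation-style channel gating (assumed).
            self.att = nn.Sequential(
                nn.AdaptiveAvgPool2d(1),
                nn.Conv2d(channels, channels // 4, 1), nn.ReLU(inplace=True),
                nn.Conv2d(channels // 4, channels, 1), nn.Sigmoid(),
            )
            # Classifier head over globally pooled features.
            self.head = nn.Sequential(nn.AdaptiveAvgPool2d(1), nn.Flatten(),
                                      nn.Linear(channels, num_classes))

        def forward(self, x):
            x = self.base(x)
            x = self.fuse(torch.cat([self.branch3(x), self.branch5(x)], dim=1))
            x = x * self.att(x)
            return self.head(x)

A model of this shape keeps the parameter count small enough for CPU-only inference, which is the property the abstract targets for Raspberry Pi deployment.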
