Abstract

In this paper, a low-cost wearable hand-gesture detection system based on distributed multi-node inertial measurement units (IMUs) and a central-node microcontroller is presented. The system captures hand kinematic information and transmits the data wirelessly to a remote processing terminal. To obtain a comprehensive picture of hand kinematics, a convolutional neural network (CNN) model on the terminal recognizes and classifies gestures, and the modified Denavit-Hartenberg notation is used to compute the spatial locations of the fingers. The experiments not only demonstrated recognition of a variety of gestures but also captured and displayed the orientation and posture of a single finger. The prototype can be applied in settings such as hand rehabilitation evaluation and human-computer interaction.
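To illustrate the kinematic side of the approach, the following sketch chains modified Denavit-Hartenberg (Craig convention) transforms to estimate a fingertip position from joint angles. The finger model, link lengths, and function names are hypothetical illustrations, not the authors' implementation; the paper itself does not specify these parameters.

```python
import numpy as np

def modified_dh(alpha, a, theta, d):
    """Homogeneous transform between adjacent frames using
    Craig's modified Denavit-Hartenberg convention."""
    ca, sa = np.cos(alpha), np.sin(alpha)
    ct, st = np.cos(theta), np.sin(theta)
    return np.array([
        [ct,      -st,      0.0,   a],
        [st * ca,  ct * ca, -sa,  -d * sa],
        [st * sa,  ct * sa,  ca,   d * ca],
        [0.0,      0.0,      0.0,  1.0],
    ])

def fingertip_position(joint_angles, link_lengths):
    """Chain per-link transforms for a simplified planar finger
    (hypothetical model: revolute joints, twist and offset zero)."""
    T = np.eye(4)
    for theta, a in zip(joint_angles, link_lengths):
        T = T @ modified_dh(alpha=0.0, a=a, theta=theta, d=0.0)
    return T[:3, 3]  # Cartesian position of the last frame origin
```

With all joint angles zero the fingertip lies along the x-axis at the sum of the link lengths; a 90-degree bend at one joint redirects the remaining links accordingly, which is how IMU-derived joint angles map to finger posture.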
