Abstract

People use their hands in countless ways throughout daily life, and many applications depend on the position, orientation, and joint configuration of the hand, including gesture recognition, gesture prediction, and robotics. This paper proposes a gesture prediction system that uses hand joint coordinate features captured by the Leap Motion controller to predict dynamic hand gestures; the model is deployed on a NAO robot to verify the effectiveness of the proposed method. First, a Kalman filter is applied to the raw data to reduce the jitter and jumps that arise during data acquisition with the Leap Motion. New feature descriptors are then introduced: length, angle, and angular velocity features are extracted from the filtered data and fed, in different combinations, into a long short-term memory recurrent neural network (LSTM-RNN). Experimental results show that the combination of coordinate, length, and angle features achieves the highest accuracy of 99.31% and runs in real time. Finally, the trained model is applied to the NAO robot to play the finger-guessing game: based on the predicted gesture, the NAO robot can respond in advance.
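To make the described pipeline concrete, the minimal sketch below shows how length and angle features might be computed from (already Kalman-filtered) joint coordinates and classified with an LSTM. The joint indices, feature choices, network sizes, and class count are illustrative assumptions, not the authors' exact configuration.

```python
# Illustrative sketch (not the authors' code): length/angle features from
# Kalman-smoothed joint positions, classified by an LSTM over a frame sequence.
import numpy as np
import torch
import torch.nn as nn

def angle_feature(a, b, c):
    """Angle (radians) at joint b formed by the segments b->a and b->c."""
    v1, v2 = a - b, c - b
    cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2) + 1e-8)
    return np.arccos(np.clip(cos, -1.0, 1.0))

def length_feature(a, b):
    """Euclidean distance between two joint positions."""
    return np.linalg.norm(a - b)

class GestureLSTM(nn.Module):
    """LSTM-RNN over per-frame feature vectors; logits over gesture classes."""
    def __init__(self, feat_dim, hidden=128, n_classes=5):
        super().__init__()
        self.lstm = nn.LSTM(feat_dim, hidden, batch_first=True)
        self.fc = nn.Linear(hidden, n_classes)

    def forward(self, x):             # x: (batch, frames, feat_dim)
        out, _ = self.lstm(x)
        return self.fc(out[:, -1])    # classify from the last time step

# Toy usage: a 30-frame sequence with placeholder joint data (22 joints x 3D).
frames = np.random.rand(30, 22, 3)
feats = []
for joints in frames:
    feats.append([angle_feature(joints[0], joints[1], joints[2]),   # assumed joint triplet
                  length_feature(joints[0], joints[4])])            # assumed joint pair
x = torch.tensor([feats], dtype=torch.float32)    # shape (1, 30, 2)
logits = GestureLSTM(feat_dim=2)(x)
print(logits.shape)                               # torch.Size([1, 5])
```

In practice each frame would concatenate all chosen coordinate, length, and angle features, and the classifier would be trained on labeled gesture sequences before being queried frame by frame for early prediction.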
