Abstract

With the development of multimedia technology, traditional interactive tools such as the mouse and keyboard can no longer satisfy users' requirements. Touchless interaction has received considerable attention in recent years, with the benefit of removing the barriers of physical contact. Leap Motion is an interactive device that collects information about dynamic hand gestures, including the coordinates, acceleration, and direction of the fingers. The aim of this study is to develop a new method for hand gesture recognition using jointly calibrated Leap Motion via deterministic learning. Hand gesture features representing hand motion dynamics, including the spatial positions and directions of the fingers, are derived from Leap Motion. The hand motion dynamics underlying the motion patterns of different gestures, which represent the Arabic numerals (0-9) and the capital English letters (A-Z), are modeled by constant radial basis function (RBF) neural networks. A bank of estimators is then constructed from the constant RBF networks. By comparing the bank of estimators with a test gesture pattern, a set of recognition errors is generated, and the average L1 norm of each error is taken as the recognition measure according to the smallest-error principle. Finally, experiments are carried out to demonstrate the high recognition performance of the proposed method. Using the 2-fold, 10-fold, and leave-one-person-out cross-validation schemes, the correct recognition rates are 94.2%, 95.1%, and 90.2% for the Arabic numerals, and 89.2%, 92.9%, and 86.4% for the English letters, respectively.
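To make the smallest-error recognition step concrete, the sketch below illustrates how a bank of constant RBF estimators might be compared against a test gesture trajectory, with the class chosen by the smallest average L1 estimation error. This is a minimal illustration under assumed conventions, not the authors' implementation: the helper names (`rbf_features`, `classify_gesture`), the Gaussian RBF form, the one-step-ahead residual, and all parameter shapes are hypothetical.

```python
import numpy as np

def rbf_features(x, centers, width):
    """Gaussian RBF activations for one feature vector x (hypothetical helper)."""
    d2 = np.sum((centers - x) ** 2, axis=1)
    return np.exp(-d2 / (2.0 * width ** 2))

def classify_gesture(test_trajectory, estimator_weights, centers, width):
    """
    Smallest-error recognition sketch, following the abstract's description.

    test_trajectory   : (T, d) array of Leap Motion features (finger positions
                        and directions) sampled over time.
    estimator_weights : list of (num_centers, d) constant RBF weight matrices,
                        one per trained gesture class (the "bank of estimators").
    Returns the index of the class with the smallest average L1 error,
    together with the per-class error values.
    """
    errors = []
    for W in estimator_weights:
        residuals = []
        for t in range(len(test_trajectory) - 1):
            # Estimator's prediction of the hand motion dynamics at time t,
            # compared with the observed change in the test trajectory.
            phi = rbf_features(test_trajectory[t], centers, width)
            predicted_change = phi @ W
            observed_change = test_trajectory[t + 1] - test_trajectory[t]
            residuals.append(np.abs(observed_change - predicted_change).sum())
        # Average L1 norm of the recognition error over the whole trajectory.
        errors.append(float(np.mean(residuals)))
    return int(np.argmin(errors)), errors
```

In this reading, each trained gesture class contributes one estimator of the hand motion dynamics; the estimator that best reproduces the test trajectory (smallest average L1 residual) determines the recognized gesture.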
