Abstract
We evaluate the performance of a wearable gesture recognition system for arm, hand, and finger motions, using the signals of an Inertial Measurement Unit (IMU) worn at the wrist and the Electromyogram (EMG) of muscles in the forearm. A set of 12 gestures was defined, resembling manipulatory movements and gestures known from interaction with mobile devices. We recorded performances of our gesture set by five subjects in multiple sessions. The resulting data corpus is made publicly available to build a common ground for future evaluations and benchmarks. Hidden Markov Models (HMMs) are used as classifiers to discriminate between the defined gesture classes. We achieve a recognition rate of 97.8% in session-independent recognition and 74.3% in person-independent recognition. We give a detailed analysis of error characteristics and of the influence of each modality on the results, underlining the benefits of using both modalities together.
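The abstract does not give the model parameters, so the following is only a minimal sketch of the classification principle described: one HMM is trained per gesture class, and an observed sequence is assigned to the class whose model yields the highest likelihood. The toy transition/emission matrices and the two hypothetical gesture names ("swipe", "grasp") are illustrative stand-ins; in the actual system the observations would be continuous IMU/EMG feature vectors rather than the quantized symbols used here.

```python
import math

def forward_log_likelihood(pi, A, B, obs):
    """Log-likelihood of a discrete observation sequence under an HMM,
    computed with the forward algorithm in the log domain for stability."""
    n = len(pi)
    log_alpha = [math.log(pi[i]) + math.log(B[i][obs[0]]) for i in range(n)]
    for t in range(1, len(obs)):
        new_alpha = []
        for j in range(n):
            terms = [log_alpha[i] + math.log(A[i][j]) for i in range(n)]
            m = max(terms)
            lse = m + math.log(sum(math.exp(x - m) for x in terms))
            new_alpha.append(lse + math.log(B[j][obs[t]]))
        log_alpha = new_alpha
    m = max(log_alpha)
    return m + math.log(sum(math.exp(x - m) for x in log_alpha))

# Toy left-to-right HMMs, one per hypothetical gesture class.
# Observation symbols 0/1 stand in for quantized IMU/EMG features.
pi = [0.95, 0.05]
A = [[0.7, 0.3], [0.1, 0.9]]
B_swipe = [[0.9, 0.1], [0.2, 0.8]]  # emits 0s first, then 1s
B_grasp = [[0.2, 0.8], [0.9, 0.1]]  # the reverse pattern
models = {"swipe": (pi, A, B_swipe), "grasp": (pi, A, B_grasp)}

obs = [0, 0, 1, 1, 1]  # matches the "swipe" emission pattern
scores = {name: forward_log_likelihood(p, a, b, obs)
          for name, (p, a, b) in models.items()}
best = max(scores, key=scores.get)
print(best)  # → swipe (highest-likelihood class wins)
```

Classifying by maximum likelihood over per-class HMMs is the standard setup for isolated gesture recognition; session- and person-independent evaluation then differ only in which recordings are held out for testing.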