Abstract

To facilitate hand gesture recognition, we investigated the use of acoustic signals at the human wrist alongside an accelerometer and gyroscope. As a proof of concept, the prototype consisted of 10 microphone units placed around the wrist in contact with the skin, together with an inertial measurement unit (IMU). Gesture recognition performance was evaluated on 13 gestures used in daily life. The optimal area for acoustic sensor placement on the wrist was examined using the minimum redundancy maximum relevance (mRMR) feature selection algorithm. We recruited 10 subjects, each performing over 10 trials for each set of hand gestures. Accuracy was 75% for a general model with the top 25 features selected, and the intra-subject average classification accuracy exceeded 80% with the same features using one microphone unit at the mid-anterior wrist and an IMU. These results indicate that acoustic signatures from the human wrist can aid IMU sensing for hand gesture recognition, and that selecting a few common features for all subjects could help in building a general model. The proposed multimodal framework helps address the bottleneck of single-IMU sensing for hand gestures during arm movement and/or locomotion.
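
As a rough illustration of the sensing pipeline described above, the sketch below shows how simple time-domain features could be extracted from a synchronized window of acoustic and IMU data and cascaded into one feature vector. This is a minimal Python sketch under assumed channel counts, sampling rates, and feature choices (RMS, variance, zero-crossing rate); it is not the authors' implementation.

import numpy as np

# Hypothetical synchronized window (shapes are assumptions):
#   mic_window : (n_audio_samples, 10)  -- 10 contact-microphone channels around the wrist
#   imu_window : (n_imu_samples, 6)     -- 3-axis accelerometer + 3-axis gyroscope

def time_domain_features(window):
    """Per-channel time-domain features: RMS, variance, zero-crossing rate."""
    rms = np.sqrt(np.mean(window ** 2, axis=0))
    var = np.var(window, axis=0)
    zcr = np.mean(np.abs(np.diff(np.sign(window), axis=0)) > 0, axis=0)
    return np.concatenate([rms, var, zcr])

def cascade_features(mic_window, imu_window):
    """Concatenate acoustic and IMU features into one cascaded feature vector."""
    return np.concatenate([time_domain_features(mic_window),
                           time_domain_features(imu_window)])

# Example with stand-in random data (e.g., one 200 ms window: 8 kHz audio, 100 Hz IMU)
rng = np.random.default_rng(0)
feature_vector = cascade_features(rng.standard_normal((1600, 10)),
                                  rng.standard_normal((20, 6)))
print(feature_vector.shape)  # (10 + 6) channels x 3 features = (48,)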

Highlights

  • Human fingers are one of the main means of interaction with the world and are an essential body part in the study of gesture recognition technologies in the field of human–computer interaction (HCI)

  • A mutual information-based feature selection algorithm, minimum redundancy maximum relevance (mRMR), was applied over the cascaded feature set (a sketch follows these highlights)

  • This yielded 75% accuracy for the general model, notably using acoustic information from the mid-anterior wrist together with data from the inertial measurement unit (IMU) chip
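
For readers unfamiliar with mRMR, the sketch below shows one common greedy variant of mutual information-based minimum redundancy maximum relevance selection applied to a cascaded feature matrix, using scikit-learn's mutual information estimators. The relevance-minus-redundancy criterion, the stand-in data shapes, and the choice of 25 features (matching the abstract) are illustrative assumptions rather than the paper's exact formulation.

import numpy as np
from sklearn.feature_selection import mutual_info_classif, mutual_info_regression

def mrmr_select(X, y, k=25):
    """Greedy mRMR sketch: pick features maximizing relevance minus mean redundancy."""
    relevance = mutual_info_classif(X, y, random_state=0)  # MI(feature, gesture label)
    selected = [int(np.argmax(relevance))]                 # seed with the most relevant feature
    while len(selected) < k:
        best_score, best_j = -np.inf, None
        for j in range(X.shape[1]):
            if j in selected:
                continue
            # Mean MI between candidate feature j and the features already selected
            redundancy = np.mean([mutual_info_regression(X[:, [j]], X[:, s],
                                                         random_state=0)[0]
                                  for s in selected])
            score = relevance[j] - redundancy
            if score > best_score:
                best_score, best_j = score, j
        selected.append(best_j)
    return selected

# Usage with stand-in data: windows of cascaded acoustic + IMU features, 13 gesture classes
rng = np.random.default_rng(0)
X, y = rng.standard_normal((130, 48)), rng.integers(0, 13, size=130)
top_25 = mrmr_select(X, y, k=25)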


Summary

Introduction

Human fingers are one of the main means of interaction with the world and are an essential body part in the study of gesture recognition technologies in the field of human–computer interaction (HCI). Gesture recognition technology allows humans to interact with a remote system without physical contact. The conventional data glove records finger orientation by measuring flexion, vision-based devices commonly use a camera with a depth sensor, and muscle activity-based apparatuses record muscular contractions via surface electromyography (sEMG). These time-series measurements of hand gestures are then mapped to instructions on a computer.

