Abstract

Human-computer interaction (HCI) has become a popular research field in recent decades. Many HCI systems are based on the analysis and classification of bio-signals. One important signal is the surface electromyographic (sEMG) signal, which is generated by muscle activity. The sEMG signal plays an important role in many applications, such as human-computer interaction, rehabilitation devices, and clinical diagnostics; collectively, these applications are referred to as myoelectric control. As research on myoelectric control has deepened, however, new challenges have emerged. One difficulty in EMG-based gesture recognition is the influence of limb position: several studies have reported that gesture classification accuracy decreases when the limb position changes, even if the gesture itself remains the same. Prior work by our team has shown that dynamic gestures are in principle more reliable indicators of human intent. In addition, deep learning has achieved strong performance in many domains, from automated driving to natural language processing. In this paper, a CNN-LSTM network is proposed for dynamic hand gesture recognition involving five different gestures, each performed in five different arm positions. The network is then employed in an HCI system to control a 6-DoF robot arm with a 1-DoF gripper.
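To make the CNN-LSTM idea concrete, the sketch below shows a generic architecture of this kind: a 1D convolutional front end extracts features from each sEMG window, and an LSTM models the sequence of windows that makes up a dynamic gesture. This is an illustrative example only, not the authors' published model; the electrode count, window length, and layer sizes are assumptions chosen for demonstration.

```python
# Illustrative sketch of a CNN-LSTM classifier for windowed sEMG sequences.
# Not the paper's exact architecture; hyperparameters are assumed.
import torch
import torch.nn as nn

class CNNLSTM(nn.Module):
    def __init__(self, n_channels=8, n_classes=5, hidden=64):
        super().__init__()
        # Per-window feature extractor: 1D convolutions over the time axis.
        self.cnn = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),  # -> (batch * windows, 64, 1)
        )
        # Temporal model over the sequence of window features.
        self.lstm = nn.LSTM(input_size=64, hidden_size=hidden, batch_first=True)
        self.fc = nn.Linear(hidden, n_classes)

    def forward(self, x):
        # x: (batch, windows, channels, samples_per_window)
        b, w, c, t = x.shape
        feats = self.cnn(x.view(b * w, c, t)).squeeze(-1)  # (b*w, 64)
        seq = feats.view(b, w, -1)                         # (b, w, 64)
        out, _ = self.lstm(seq)
        return self.fc(out[:, -1])                         # logits for the 5 gestures

# Example: a batch of 4 recordings, 10 windows of 200 samples from 8 electrodes.
logits = CNNLSTM()(torch.randn(4, 10, 8, 200))
print(logits.shape)  # torch.Size([4, 5])
```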
