Abstract
Recognising and classifying human hand gestures is important for effective communication between humans and machines in applications such as human-robot interaction, human-to-robot skill transfer, and the control of prosthetic devices. Although many interfaces already enable the decoding of human intention and action, they are either bulky or rely on techniques that require careful positioning of the sensors, making them inconvenient to use in real-life scenarios and environments. Moreover, electromyography (EMG), the most commonly used technique, captures signals that have a nonlinear relationship with human intention and motion. In this work, we present lightmyography (LMG), a new muscle-machine interfacing method for decoding human intention and motion. Lightmyography utilises light propagation through elastic media and changes in light luminosity to detect silicone deformation. It is similar to forcemyography in that both record muscular contractions through skin displacements. To experimentally validate the efficiency of the proposed method, we designed an interface consisting of five LMG sensors and performed gesture classification experiments with it. Using this device, we were able to accurately detect a series of different hand postures and gestures. We also compared LMG data with processed EMG data.