Abstract

This research reports the recognition of facial movements during unvoiced speech and the identification of hand gestures using surface electromyogram (sEMG). The paper proposes two different methods for identifying facial movements and hand gestures, which can be useful for providing simple commands and control to a computer, an important application of human-computer interaction (HCI). Experimental results demonstrate that the features of sEMG recordings are suitable for characterising muscle activation during unvoiced speech and subtle gestures. The scatter plots from the two methods demonstrate the separation of data for each corresponding vowel and each hand gesture. The results indicate that there is small inter-experiment variation but large inter-subject variation, which may be attributable to anatomical differences and to differences in speaking speed and style among the subjects. The proposed system provides better results when it is trained and tested on an individual user. Possible applications of this research include giving simple commands to a computer for people with disabilities, developing prosthetic hands, and classifying sEMG for HCI-based systems.
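The abstract does not name the specific features extracted from the sEMG recordings, so the sketch below is only an illustrative assumption: it computes common time-domain sEMG features (root mean square, mean absolute value, zero-crossing count) over a signal window. Feature vectors of this kind, computed per vowel or per gesture, are what one would typically project and scatter-plot to inspect class separation as described above. The feature set, window length, and function names are hypothetical, not the authors' stated method.

```python
import numpy as np

def semg_features(window):
    """Compute common time-domain sEMG features for one analysis window.

    window : 1-D NumPy array holding a single-channel sEMG segment.
    Returns a feature vector: [RMS, mean absolute value, zero crossings].
    """
    rms = np.sqrt(np.mean(window ** 2))              # root mean square amplitude
    mav = np.mean(np.abs(window))                    # mean absolute value
    zc = np.sum(window[:-1] * window[1:] < 0)        # sign changes (zero crossings)
    return np.array([rms, mav, zc])

# Example usage with a placeholder segment standing in for a real sEMG window.
rng = np.random.default_rng(0)
fake_window = rng.normal(0.0, 1.0, 512)
print(semg_features(fake_window))
```

In practice, such feature vectors would be computed for many windows and many repetitions, then projected to two dimensions (for example with PCA) to produce the per-class scatter plots the abstract refers to.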
