Abstract

This work introduces a novel human–computer interface based on electromyography (EMG). The tool allows the user to control the cursor on a computer screen through EMG activity produced by specific facial movements. This type of human–computer interface may be useful for individuals who want to interact with computers but suffer from movement limitations of the arms and hands. Although a number of EMG-based human–computer interfaces have been described in the literature, most have not been assessed with regard to the learning curve resulting from interaction with such interfaces; this assessment is one of the main contributions of the present study. Another contribution is the proposal and evaluation of a complete, practical solution that implements a two-channel EMG interface capable of generating seven distinct states, which can be used as output commands. A Finite State Machine, the core of the system, converts features extracted from the EMG signals into commands (i.e., SINGLE_CLICK, UP, DOWN, LEFT, RIGHT, ROTATE, and ON_STANDBY) used to control the cursor on a computer screen. The tool uses only two channels of information, which combine the activity of three facial muscles: the left and right temporalis and the frontalis. To evaluate learning when using the tool, a customized graphical user interface was devised that allowed subjects to execute pre-defined timed actions with distinct levels of difficulty. In total, 10 healthy subjects and one subject with muscular dystrophy took part in the experiments, comprising approximately 60 hours of practical sessions. The results suggest that after just one training session subjects could control the cursor on a computer screen, and that incremental learning occurs over subsequent training sessions.
Therefore, the devised tool may be integrated with specific programs and used by individuals whose facial muscles are not severely damaged.
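To make the described architecture concrete, the following is a minimal sketch of how a Finite State Machine might map two-channel EMG feature amplitudes to the seven command states named above. The abstract does not specify the paper's actual features, thresholds, or transition rules, so the normalized amplitudes, the 0.5 threshold, and the particular channel-to-command mapping used here are all illustrative assumptions; in particular, the commands not shown (UP, DOWN, ROTATE) would presumably be derived from temporal activation patterns that the real FSM encodes.

```python
from enum import Enum, auto


class Command(Enum):
    """The seven output states listed in the abstract."""
    ON_STANDBY = auto()
    SINGLE_CLICK = auto()
    UP = auto()
    DOWN = auto()
    LEFT = auto()
    RIGHT = auto()
    ROTATE = auto()


class CursorFSM:
    """Hypothetical two-channel FSM for cursor control.

    ch1 and ch2 are assumed to be normalized EMG amplitude
    features in [0, 1]; the threshold and the mapping below
    are illustrative, not taken from the paper.
    """

    def __init__(self, threshold: float = 0.5):
        self.threshold = threshold
        self.state = Command.ON_STANDBY

    def step(self, ch1: float, ch2: float) -> Command:
        # Binarize each channel against the activation threshold.
        active1 = ch1 > self.threshold
        active2 = ch2 > self.threshold

        if active1 and active2:
            # Co-activation of both channels: assumed click gesture.
            self.state = Command.SINGLE_CLICK
        elif active1:
            # Left-channel activation moves the cursor left.
            self.state = Command.LEFT
        elif active2:
            # Right-channel activation moves the cursor right.
            self.state = Command.RIGHT
        else:
            # No activation: remain idle.
            self.state = Command.ON_STANDBY
        return self.state
```

A real implementation would feed the FSM windowed features (e.g., RMS amplitude per channel) at a fixed rate, with the remaining commands triggered by sustained or sequenced activations.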
