Abstract

In this paper, we design a human-computer interaction (HCI) system for patients with locked-in syndrome (LIS) that combines eye movement signals and electroencephalogram (EEG) signals. LIS describes patients who suffer from quadriplegia, aphonia, and facial paralysis, yet remain conscious, can hear, and can signal with blinks or eye movements. As a result, LIS patients suffer greatly because of the difficulty of communicating with the outside world, and the proposed HCI system is intended to help them communicate effectively. First, the system estimates the subject's gaze point on the screen from the acquired eye movement data and identifies the button the subject is focusing on. Then, the system confirms or cancels the button according to the classification result of the EEG signals, and the computer executes the command of the corresponding button to bridge communication between the subject and the outside world. When the subject stares at the screen without intent, the button does not respond because there is no EEG confirmation, so the subject's eyes can be relaxed. The EEG signals are classified into two categories, and the classifier can be trained to reach a high classification accuracy (more than 90%). In testing, the average accuracy is about 80% when the eye tracker is used alone, and more than 85% in the case with EEG. The HCI system supports basic hardware control, simple voice interaction, and typing with a virtual keyboard, enabling subjects to communicate with the outside world through eye movement and EEG signals with high accuracy.
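
To make the described workflow concrete, the sketch below illustrates one plausible form of the gaze-plus-EEG confirmation loop: gaze dwell selects a button, and a binary EEG classification confirms or cancels it. This is not the authors' implementation; all names (get_gaze_point, read_eeg_window, classify_eeg, Button, the dwell threshold) are hypothetical placeholders for illustration.

```python
# Minimal sketch, assuming hypothetical eye-tracker and EEG interfaces,
# of the confirm/cancel loop described in the abstract.
import time
from dataclasses import dataclass
from typing import Any, Callable, List, Optional, Tuple


@dataclass
class Button:
    label: str
    rect: Tuple[int, int, int, int]        # (x, y, width, height) on screen
    action: Callable[[], None]             # command to run when confirmed

    def contains(self, point: Tuple[float, float]) -> bool:
        x, y, w, h = self.rect
        px, py = point
        return x <= px <= x + w and y <= py <= y + h


def selection_loop(
    buttons: List[Button],
    get_gaze_point: Callable[[], Optional[Tuple[float, float]]],  # eye-tracker API (assumed)
    read_eeg_window: Callable[[], Any],                           # EEG acquisition (assumed)
    classify_eeg: Callable[[Any], int],                           # binary classifier: 1 = confirm, 0 = cancel
    dwell_s: float = 1.0,                                         # gaze dwell time before querying EEG (assumed)
) -> None:
    """Fixate a button with gaze, then confirm or cancel it with EEG."""
    focused: Optional[Button] = None
    focus_start = 0.0

    while True:
        gaze = get_gaze_point()
        hit = next((b for b in buttons if gaze and b.contains(gaze)), None)

        if hit is not focused:
            # Gaze moved to a different button (or off all buttons): restart the dwell timer.
            focused, focus_start = hit, time.monotonic()
            continue

        if focused and time.monotonic() - focus_start >= dwell_s:
            # Gaze has dwelled long enough; ask the EEG classifier to confirm or cancel.
            if classify_eeg(read_eeg_window()) == 1:
                focused.action()           # execute the command of the confirmed button
            # Reset either way, so an unintentional stare does not trigger the button.
            focused, focus_start = None, 0.0

        time.sleep(0.02)                   # polling interval (assumed)
```

In this sketch the EEG stage acts purely as a gate on the gaze selection, which mirrors the abstract's point that a button is never triggered by gaze alone without EEG confirmation.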
