Abstract

Brain-computer interfaces (BCIs) augment human capabilities by translating brain-wave signals into commands that operate external devices. However, BCI development faces several challenges, such as the low classification accuracy of brain signals and tedious user-training procedures. To address these problems, we propose using signals associated with eye saccades and blinks to control a BCI. Because these physiological eye signals already exist, users do not need to adapt their brain waves to the device. Furthermore, using saccade signals to control an external device frees the limbs to perform other tasks. In this research, we placed two electrodes above the left and right ears of thirteen participants, applied Independent Component Analysis (ICA) to extract EEG components associated with eye movements, used a sliding-window technique to collect relevant features, and classified the features as horizontal saccades or blinks using KNN and SVM, achieving a mean classification accuracy of about 97%. The two electrodes were then integrated with off-the-shelf earbuds to control a wheelchair. The earbuds generate voice cues indicating when to move the eyes to a certain position (i.e., left or right) or blink, so that the user can select directional commands to drive the wheelchair. In addition, by properly designing the contents of the voice menus, we can generate many commands even though the identified eye saccade movements have only a limited number of states.

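To make the processing pipeline concrete, the sketch below shows one way to assemble the ICA, sliding-window, and KNN/SVM stages with scikit-learn. The channel layout, sampling rate, window length, feature set, and labels are illustrative assumptions, not the authors' exact implementation.

```python
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

# Hypothetical two-channel EEG recording (electrodes above the left/right ears):
# shape = (n_samples, 2), sampled at an assumed 250 Hz.
fs = 250
eeg = np.random.randn(fs * 60, 2)  # placeholder signal for illustration only

# 1) Unmix the two channels with ICA to isolate an eye-movement-related component.
ica = FastICA(n_components=2, random_state=0)
sources = ica.fit_transform(eeg)      # (n_samples, 2) independent components
eye_component = sources[:, 0]         # assume component 0 carries ocular activity

# 2) Slide a window over the component and compute simple per-window features.
def window_features(signal, win_len, step):
    feats = []
    for start in range(0, len(signal) - win_len + 1, step):
        w = signal[start:start + win_len]
        feats.append([w.min(), w.max(), np.ptp(w), w.std()])  # amplitude/shape features
    return np.array(feats)

X = window_features(eye_component, win_len=fs, step=fs // 4)

# 3) Classify each window as left saccade, right saccade, or blink.
# Labels here are placeholders; in practice they come from the cued recording protocol.
y = np.random.randint(0, 3, size=len(X))

for name, clf in [("KNN", KNeighborsClassifier(n_neighbors=5)),
                  ("SVM", SVC(kernel="rbf"))]:
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{name}: mean CV accuracy = {acc:.2f}")
```

With real cued recordings in place of the placeholder signal and labels, the same structure yields per-window predictions that a downstream controller could map to wheelchair commands via the voice-menu selections described above.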