Abstract

Human-computer interface systems whose input is based on eye movements can serve as a means of communication for patients with locked-in syndrome. Eye-writing is one such system; users input characters by moving their eyes to trace the strokes of each character. Although this input method is easy for patients to pick up because of its familiarity from handwriting, existing eye-writing systems suffer from slow input rates because they require a pause between characters to simplify automatic recognition. In this paper, we propose a continuous eye-writing recognition system that achieves a rapid input rate by accepting characters eye-written continuously, with no pauses. For recognition, the proposed system first detects eye movements using electrooculography (EOG) and then applies a hidden Markov model (HMM) to model the EOG signals and recognize the eye-written characters. Additionally, this paper investigates EOG adaptation using a deep neural network (DNN)-based HMM. Experiments with six participants showed an average input speed of 27.9 characters/min using Japanese Katakana as the target characters. A Katakana character-recognition error rate of only 5.0% was achieved using 13.8 minutes of adaptation data.
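To make the recognition pipeline concrete, the following is a minimal Python sketch, assuming per-character Gaussian HMMs trained with the hmmlearn library on already-segmented EOG feature sequences (all names here are illustrative, not from the paper). The paper's actual system decodes continuous eye-writing by connecting character HMMs, as in speech recognition, and replaces the Gaussian emission model with a DNN; neither refinement is reproduced here.

    import numpy as np
    from hmmlearn import hmm

    def train_character_hmms(train_data, n_states=5):
        # train_data: dict mapping a character label to a list of
        # EOG feature sequences, each shaped (n_frames, n_features).
        models = {}
        for char, sequences in train_data.items():
            X = np.vstack(sequences)               # stack frames of all sequences
            lengths = [len(s) for s in sequences]  # frame count per sequence
            m = hmm.GaussianHMM(n_components=n_states,
                                covariance_type="diag", n_iter=50)
            m.fit(X, lengths)                      # Baum-Welch training
            models[char] = m
        return models

    def recognize(models, sequence):
        # Score one segmented sequence against every character HMM and
        # return the label with the highest log-likelihood.
        return max(models, key=lambda c: models[c].score(sequence))

This isolated-character scoring corresponds to the pause-separated baseline the paper improves on; continuous decoding additionally searches over character boundaries.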

Highlights

  • Eye movement-based communication is extremely important for people such as patients with amyotrophic lateral sclerosis (ALS), who have lost most of their ability to control voluntary movements, including speech and handwriting, but not eye movement [1, 2]

  • Instead of relying on a caregiver, this study investigates human-computer interface systems whose input is based on eye movements

  • A two-channel EOG signal was obtained by taking the differences of the signals between the electrodes on the left and right sides and those between the upper and lower sides, respectively (see the sketch after this list)
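As a rough illustration of that two-channel derivation, here is a short sketch; the array names are hypothetical, and in practice the raw potentials would also be amplified and band-pass filtered.

    import numpy as np

    def eog_channels(left, right, up, down):
        # left/right/up/down: 1-D arrays of raw electrode potentials
        # sampled at a common rate (illustrative names, not the paper's).
        horizontal = left - right  # left/right electrode difference
        vertical = up - down       # upper/lower electrode difference
        return np.stack([horizontal, vertical], axis=-1)  # (n_samples, 2)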

Introduction

Eye movement-based communication is extremely important for people such as patients with amyotrophic lateral sclerosis (ALS), who have lost most of their ability to control voluntary movements, including speech and handwriting, but not eye movement [1, 2]. For these patients, the most common means of communication is to have a caregiver face the patient through a transparent character board and identify which character the patient is looking at [3]. Instead of relying on a caregiver, this study investigates human-computer interface systems whose input is based on eye movements. Based on whether they require a computer screen, these systems can be split into two groups. Among those that use a screen, Kate et al. [4] and Majaranta et al. [5] designed on-screen keyboard selection systems in which a user could …
