Abstract

Recent advances in sensors and electronics have enabled electrooculogram (EOG) detection systems for capturing eye movements. However, EOG signals are susceptible to the quality of the sensor's skin contact, which limits precise detection of eye angle and gaze. Herein, a two-camera eye-tracking system and a data classification method for persistent human-machine interfaces (HMIs) are introduced. Machine learning provides continuous, real-time classification of gaze and eye directions for precise control of a robotic arm. Specifically, a deep-learning algorithm classifies eye directions, while the pupil center-corneal reflection (PCCR) method of an eye tracker handles gaze tracking. The system uses a supervisory control and data acquisition architecture that can be applied universally to any screen-based HMI task. The study shows that the deep-learning classification algorithm achieves exceptional accuracy (99.99%) with a large number of actions per command (≥64), the highest performance among comparable HMI systems. Real-time control of a robotic arm for playing chess and manipulating dice demonstrates the unique advantages of the precise eye-tracking system. Overall, this paper shows the HMI system's potential for remote control of surgical robots, warehouse systems, and construction tools.
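
As a rough illustration of the PCCR principle mentioned above, the sketch below maps pupil-glint displacement vectors to on-screen coordinates using a second-order polynomial fitted on calibration points. This is a minimal sketch of the general technique, not the authors' implementation: the function names, the 9-point calibration grid, and the synthetic data are all illustrative assumptions.

```python
import numpy as np

def design_matrix(v):
    """Second-order polynomial features of pupil-glint vectors (x, y)."""
    x, y = v[:, 0], v[:, 1]
    return np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])

def calibrate(vectors, screen_points):
    """Least-squares fit from pupil-glint vectors to screen coordinates."""
    A = design_matrix(vectors)
    coeffs, *_ = np.linalg.lstsq(A, screen_points, rcond=None)
    return coeffs

def estimate_gaze(vector, coeffs):
    """Map a single pupil-glint vector to an on-screen gaze point."""
    return design_matrix(np.atleast_2d(vector)) @ coeffs

# Hypothetical 9-point calibration: pupil-glint vectors (pixels in the
# eye image) paired with known on-screen target positions (pixels).
rng = np.random.default_rng(0)
targets = np.array([[x, y] for y in (100, 540, 980)
                    for x in (160, 960, 1760)], dtype=float)
vectors = targets / 200.0 + rng.normal(scale=0.05, size=targets.shape)

coeffs = calibrate(vectors, targets)
print(estimate_gaze(vectors[4], coeffs))  # ~ the center target [960, 540]
```

In practice, the pupil center and corneal glint would come from the eye camera's image-processing pipeline, and the polynomial order and number of calibration targets are tuning choices that trade accuracy against calibration time.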
