Abstract

Noncontact human-computer interaction is of significant value in wireless sensor networks. This work aims to achieve accurate computer interaction based on automatic eye control, using an inexpensive webcam as the video source. A real-time, accurate human-computer interaction system based on eye state recognition, rough gaze estimation, and tracking is proposed. First, binary classification of the eye state (open or closed) is carried out by an SVM classifier using HOG features of the input eye image. Second, rough appearance-based gaze estimation is implemented with a simple CNN model, and the head pose is estimated to judge whether the user is facing the screen. Based on these recognition results, noncontact mouse-control and character-input methods are designed and developed to replace the standard mouse and keyboard hardware. The accuracy and speed of the proposed interaction system are evaluated on four subjects. The experimental results show that, with only a common monocular camera, users can achieve gaze estimation and tracking and perform most functions of real-time, precise human-computer interaction through automatic eye control.
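To make the eye-state step concrete, the following is a minimal sketch of HOG + linear SVM binary classification of the kind the abstract describes. The paper does not publish its training data or parameters, so the 32×32 input size, the HOG settings, and the synthetic "eye" images below are all illustrative assumptions, not the authors' implementation.

```python
# Sketch: binary eye-state (open/closed) classification with HOG features
# and a linear SVM. Synthetic images stand in for real eye crops; the
# image size and HOG parameters are assumptions, not the paper's values.
import numpy as np
from skimage.feature import hog
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)

def make_eye(open_eye: bool) -> np.ndarray:
    """Toy 32x32 grayscale 'eye': a bright ellipse for open, a thin horizontal line for closed."""
    img = rng.normal(0.2, 0.05, (32, 32))
    if open_eye:
        yy, xx = np.mgrid[:32, :32]
        img[((yy - 16) / 8) ** 2 + ((xx - 16) / 12) ** 2 < 1] = 0.9
    else:
        img[15:17, 4:28] = 0.9
    return np.clip(img, 0.0, 1.0)

def hog_feat(img: np.ndarray) -> np.ndarray:
    """Histogram-of-oriented-gradients descriptor of one eye image."""
    return hog(img, orientations=9, pixels_per_cell=(8, 8), cells_per_block=(2, 2))

# Build a small labelled set: 0 = closed, 1 = open.
X, y = [], []
for label in (0, 1):
    for _ in range(40):
        X.append(hog_feat(make_eye(bool(label))))
        y.append(label)

clf = LinearSVC().fit(X, y)

# Classify one fresh sample of each state.
preds = clf.predict([hog_feat(make_eye(False)), hog_feat(make_eye(True))])
print(preds)
```

In a real system, the eye crops would come from a face/landmark detector running on the webcam frames, and the classifier would be trained offline on labelled open/closed eye images.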

Highlights

  • The wireless sensor network (WSN) consists of many sensors like visual sensors, thermal sensors, and various others

  • Capturing eye-tracking signals can help the disabled improve their quality of life through noncontact human-computer communication with visual sensors; for example, this can be applied to an eye-controlled wheelchair for the disabled [2]

  • Eye movement is closely related to brain activity

Introduction

The wireless sensor network (WSN) consists of many sensors, like visual sensors, thermal sensors, and various others. Sensor nodes are widely used for continuous sensing, event detection, position sensing, and many other tasks, including assistive interfaces for the disabled [1]. Capturing eye-tracking signals can help the disabled improve their quality of life through noncontact human-computer communication with visual sensors; for example, this can be applied to an eye-controlled wheelchair for the disabled [2]. Eye movement is closely related to brain activity, and the process of human learning and cognition can be studied through eye movement [3]; this is how the eye-tracking technique was born. Eye tracking can obtain the focus of sight in real time and be applied to analyze the user's eye movement in reading [3] or in a critical state [4], so as to infer the user's content of interest in reality [5, 6].
