Abstract

A home-auxiliary robot system based on characteristics of the electrooculogram (EOG) and tongue signals is developed in the current study, which can provide daily-life assistance for people with physical mobility disabilities. It relies on five simple actions of the head (blinking twice in a row, tongue extension, upward tongue rolling, and left and right eye movements) to complete the motions of a mouse on the system screen (moving up/down/left/right and double-clicking). In this paper, brain-network and BP neural-network algorithms are used to identify these five types of actions. The results show that, across all subjects, the average recognition rates of eye blinks and tongue movements (tongue extension and upward tongue rolling) were 90.17%, 88.00%, and 89.83%, respectively, and that, after training, subjects could complete the five types of movements in sequence within 12 seconds. This means that people with physical disabilities can use the system to complete daily self-help tasks quickly and accurately, which brings great convenience to their lives.
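The sketch below is not the authors' implementation; it only illustrates the pipeline the abstract describes: a backpropagation-trained (BP) neural network classifies windowed signal features into the five actions, and each recognized action is dispatched as a mouse operation. The feature choices, the specific action-to-mouse pairing, and the synthetic training data are assumptions for illustration.

```python
# Illustrative sketch of the abstract's pipeline (assumed details, not the paper's code).
import numpy as np
from sklearn.neural_network import MLPClassifier  # multilayer perceptron trained with backpropagation

ACTIONS = ["double_blink", "tongue_extension", "tongue_roll_up", "eye_left", "eye_right"]
MOUSE_COMMANDS = {                      # assumed action -> mouse-operation pairing
    "double_blink":     "double_click",
    "tongue_extension": "move_up",
    "tongue_roll_up":   "move_down",
    "eye_left":         "move_left",
    "eye_right":        "move_right",
}

def extract_features(window: np.ndarray) -> np.ndarray:
    """Simple per-channel time-domain features: mean, std, peak-to-peak."""
    return np.concatenate([window.mean(axis=0),
                           window.std(axis=0),
                           window.max(axis=0) - window.min(axis=0)])

clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)

def classify_and_dispatch(window: np.ndarray) -> str:
    """Predict the action for one signal window and return the mapped mouse command."""
    action = ACTIONS[int(clf.predict(extract_features(window)[None, :])[0])]
    return MOUSE_COMMANDS[action]

if __name__ == "__main__":
    # Synthetic demonstration only: random windows stand in for real recordings.
    rng = np.random.default_rng(0)
    windows = rng.normal(size=(100, 128, 2))   # 100 windows, 128 samples, 2 channels
    X = np.stack([extract_features(w) for w in windows])
    y = rng.integers(0, 5, size=100)
    clf.fit(X, y)
    print(classify_and_dispatch(windows[0]))
```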

Highlights

  • The eye movement signal, which contains important thinking information, is a very easy signal to observe

  • Niu used eye movement signals to distinguish pilots’ different cognitive levels while driving [41]. The research conducted by Brookings and Wilson shows that EOG signals can be used to estimate the cognitive requirements of different tasks [42, 43]

  • A home-auxiliary robot system based on characteristics of human physiological and motion signals is developed in the current study


Summary

Introduction

The eye movement signal, which contains important thinking information, is a very easy signal to observe. We found that, when using Emotiv equipment to collect signals, human eye movement and tongue movement signals can be detected in the time domain. We used five kinds of human head movements (blinking twice in a row, tongue extension, upward tongue rolling, and left and right eye movements) to control the home-auxiliary robot system and used this system to fetch articles for daily use. We randomly selected 12 subjects (6 males and 6 females; aged 28 ± 1.6 (SD) years) from volunteers to participate in the experiment. They were required to have no history of neurological disease or visual illness. When we used EEG equipment such as Emotiv and Neuroscan to collect signals, the tongue signals were detected in the time domain. Thus, the AF3 and AF4 signals were used to detect tongue and eye movements in this study, as sketched below.
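As a minimal sketch of the time-domain idea mentioned above, the snippet below flags large deflections on the AF3/AF4 channels with an amplitude threshold and then checks for two events in quick succession ("blinking twice in a row"). The sampling rate, threshold, and timing parameters are assumed values, not the study's parameters.

```python
# Illustrative event detection on the AF3/AF4 channels (assumed parameters).
import numpy as np

FS = 128            # Hz; Emotiv-class headsets commonly sample around 128 Hz (assumption)
THRESHOLD = 80.0    # uV; assumed deflection threshold for blink/tongue events
REFRACTORY = 0.3    # s; minimum spacing between detected events

def detect_events(af3: np.ndarray, af4: np.ndarray) -> list[int]:
    """Return sample indices where AF3/AF4 show a large synchronous deflection."""
    signal = (np.abs(af3 - af3.mean()) + np.abs(af4 - af4.mean())) / 2.0
    events, last = [], -int(REFRACTORY * FS)
    for i, v in enumerate(signal):
        if v > THRESHOLD and i - last >= int(REFRACTORY * FS):
            events.append(i)
            last = i
    return events

def is_double_blink(events: list[int], max_gap_s: float = 0.8) -> bool:
    """Two detected events within max_gap_s are treated as 'blinking twice in a row'."""
    return any(0 < (b - a) / FS <= max_gap_s for a, b in zip(events, events[1:]))
```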


