Abstract

Assistant devices such as meal-assist robots aid individuals with disabilities and support the elderly in performing daily activities. However, existing meal-assist robots are inconvenient to operate because of non-intuitive user interfaces, which demand additional time and effort. We therefore developed a meal-assist robot system driven by a hybrid brain–computer interface based on three signals that can be measured with scalp electrodes used for electroencephalography. A single meal cycle comprises the following three procedures. (1) Triple eye-blinks (EBs) detected from the prefrontal channel were treated as the activation signal that initiates a cycle. (2) Steady-state visual evoked potentials (SSVEPs) from occipital channels were used to select the food according to the user's intention. (3) Electromyograms (EMGs) were recorded from temporal channels as the user chewed the food, marking the end of a cycle and indicating readiness for the next one. In experiments with five subjects, the accuracy, false positive rate (FPR), and information transfer rate (ITR) were as follows: accuracy (EBs/SSVEPs/EMGs): 94.67/83.33/97.33%; FPR (EBs/EMGs): 0.11/0.08 times/min; ITR (SSVEPs): 20.41 bit/min. These results demonstrate the feasibility of the proposed assistive system, which allows users to eat on their own more naturally. Furthermore, it can increase the self-esteem of disabled and elderly people and enhance their quality of life.
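To make the three-step cycle concrete, the following is a minimal Python sketch of how such a hybrid pipeline could be organized. It is not the authors' implementation: the sampling rate, thresholds, stimulus frequencies, and the detector functions (detect_triple_blink, classify_ssvep, detect_chewing_emg) are all illustrative assumptions; in particular, the SSVEP classifier shown here is a simple power-spectrum stand-in rather than the method used in the paper.

```python
"""Minimal sketch (not the authors' code) of the three-stage hybrid BCI cycle
described in the abstract: (1) triple eye-blink activation, (2) SSVEP-based
food selection, (3) chewing-EMG completion. All values are illustrative."""

import numpy as np

FS = 250  # assumed sampling rate (Hz); not specified in the abstract


def detect_triple_blink(prefrontal: np.ndarray, threshold: float = 100.0) -> bool:
    """Hypothetical blink detector: count large positive deflections in a
    prefrontal channel and report whether at least three occur."""
    above = prefrontal > threshold
    blinks = np.count_nonzero(np.diff(above.astype(int)) == 1)  # rising edges
    return blinks >= 3


def classify_ssvep(occipital: np.ndarray, stim_freqs=(6.0, 7.5, 8.57, 10.0)) -> int:
    """Hypothetical SSVEP classifier: pick the stimulus frequency whose
    spectral power (fundamental plus one harmonic) is largest at Oz."""
    freqs = np.fft.rfftfreq(occipital.size, d=1.0 / FS)
    power = np.abs(np.fft.rfft(occipital)) ** 2
    scores = [power[np.argmin(np.abs(freqs - f))] +
              power[np.argmin(np.abs(freqs - 2 * f))] for f in stim_freqs]
    return int(np.argmax(scores))  # index of the selected food item


def detect_chewing_emg(temporal: np.ndarray, threshold: float = 50.0) -> bool:
    """Hypothetical chewing detector: sustained high-amplitude EMG activity
    recorded from temporal-site electrodes."""
    envelope = np.abs(temporal - temporal.mean())
    return float(np.mean(envelope)) > threshold


def run_meal_cycle(prefrontal, occipital, temporal):
    """One meal cycle: activate -> select food -> confirm chewing."""
    if not detect_triple_blink(prefrontal):
        return None  # stay idle until the user triple-blinks
    food_index = classify_ssvep(occipital)
    # ...the robot would fetch food item `food_index` here...
    finished = detect_chewing_emg(temporal)
    return food_index, finished
```

In a real system these detectors would run continuously on streamed epochs rather than on single pre-cut windows, and each stage would gate the next so that false activations do not trigger the robot.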

Highlights

  • We analyzed steady-state visual evoked potentials (SSVEPs) under the following conditions (an ITR calculation is sketched after these highlights): (1) one channel: Oz, epoch period: 3 s (average accuracy: 71.33%, average information transfer rate (ITR): 18.46); (2) three channels: O1/Oz/O2, epoch period: 3 s (average accuracy: 70.67%, average ITR: 17.47); (3) one channel: Oz, epoch period: 4 s (average accuracy: 77.33%, average ITR

  • This study implemented a hybrid brain–computer interface (BCI)-based meal-assist robot, wherein the accuracy of the algorithm was examined based on an experiment to verify the feasibility of a real-time practical system

  • This study focused on a hybrid BCI-based meal-assist robot for real-life applications
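
The accuracy and ITR values in the highlights are related through the information transfer rate, commonly computed with the Wolpaw formula. The sketch below shows that calculation; the number of selectable targets and the per-selection time used in the example are placeholder assumptions, since this excerpt does not state how many food items were displayed, and the paper may include additional timing terms.

```python
import math

def wolpaw_itr(accuracy: float, n_targets: int, trial_seconds: float) -> float:
    """Wolpaw information transfer rate in bits per minute.

    accuracy      -- classification accuracy P, in [0, 1]
    n_targets     -- number of selectable targets N (e.g. food items)
    trial_seconds -- time per selection T, roughly the SSVEP epoch length
    """
    p, n = accuracy, n_targets
    bits_per_trial = math.log2(n)
    if 0.0 < p < 1.0:
        bits_per_trial += p * math.log2(p) + (1 - p) * math.log2((1 - p) / (n - 1))
    return bits_per_trial * (60.0 / trial_seconds)

# Placeholder values; the actual number of food targets and selection time
# are not given in this excerpt.
print(f"{wolpaw_itr(accuracy=0.80, n_targets=4, trial_seconds=3.0):.2f} bit/min")
```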

Introduction

The population of elderly and disabled individuals is increasing worldwide [1,2]. These individuals face challenges in daily life without the support of therapists. Their self-esteem decreases when they cannot perform even basic activities such as walking and eating, which deepens their sense of helplessness. Robots have been actively developed for healthcare applications [3], such as home-assist robots [4], exoskeleton robots [5], and meal-assist robots [6]. These robots have progressed from direct manual control toward fully automatic operation. Various studies have investigated healthcare robots based on augmented/virtual reality (AR/VR) [9], voice [7], and electroencephalograms (EEG) [5,6].
