Abstract
Companion robots play an important role in accompanying humans and providing emotional support, for example by reducing social isolation and loneliness. By recognizing its human partner's mental states, a companion robot can dynamically adjust its behaviors and make human-robot interaction smoother and more natural. Human emotion has been recognized from many modalities, such as facial expression and voice. Neurophysiological signals have shown promising results in emotion recognition, since they are innate signals of the human brain that cannot easily be faked. In this paper, emotional state recognition based on a neurophysiological method is studied to guide and modulate companion-robot navigation and thereby enhance the robot's social capabilities. Electroencephalogram (EEG), a type of neurophysiological signal, is used to recognize the human emotional state, which is then fed into a navigation path-planning algorithm that controls the companion robot's routes. Simulation results show that the mobile robot exhibits navigation behaviors modulated by dynamic human emotional states.
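The abstract does not specify the planner or the emotion labels used, so the following is only a minimal sketch of the described pipeline under assumed details: a discrete emotion label (here the hypothetical labels "positive"/"negative") from an EEG classifier modulates the cost map of a grid-based A* planner, so the robot keeps a larger comfort distance from the human when the estimated state is negative. The function and parameter names are illustrative, not the authors' implementation.

```python
# Sketch: emotion-modulated path planning on a 4-connected grid (assumed design).
import heapq

def plan_path(grid, start, goal, human_pos, emotion):
    """A* search; cells near the human are penalized more when emotion is "negative"."""
    rows, cols = len(grid), len(grid[0])
    # Hypothetical modulation rule: widen the comfort radius for a negative state.
    comfort_radius = 3 if emotion == "negative" else 1

    def cell_cost(cell):
        # Soft penalty for entering cells close to the human.
        d = abs(cell[0] - human_pos[0]) + abs(cell[1] - human_pos[1])
        return 1.0 + (5.0 if d < comfort_radius else 0.0)

    def h(cell):
        # Manhattan heuristic; admissible since the minimum step cost is 1.0.
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

    open_set = [(h(start), 0.0, start, [start])]
    visited = set()
    while open_set:
        _, g, cur, path = heapq.heappop(open_set)
        if cur == goal:
            return path
        if cur in visited:
            continue
        visited.add(cur)
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cur[0] + dr, cur[1] + dc)
            if 0 <= nxt[0] < rows and 0 <= nxt[1] < cols and grid[nxt[0]][nxt[1]] == 0:
                ng = g + cell_cost(nxt)
                heapq.heappush(open_set, (ng + h(nxt), ng, nxt, path + [nxt]))
    return None  # no feasible route

# Same start/goal, different emotional states: the "negative" route detours
# further around the human's position than the "positive" one.
free = [[0] * 8 for _ in range(8)]
print(plan_path(free, (0, 0), (7, 7), human_pos=(3, 3), emotion="negative"))
print(plan_path(free, (0, 0), (7, 7), human_pos=(3, 3), emotion="positive"))
```

In this sketch the emotion label only rescales the planner's cost map; any classifier output (or a continuous valence score) could be substituted at that point without changing the search itself.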