Abstract
As smart factories gain prominence with the Fourth Industrial Revolution, the introduction of collaborative robots into industrial sites is also attracting attention. Introducing collaborative robots promotes greater interaction between humans and robots and aims to make work more flexible. However, unlike conventional industrial robots, collaborative robots work alongside human workers, so the mere presence of the robot can be perceived as a stressor by the worker, which can negatively affect work performance. To reduce the stress caused by the robot's presence, humans and robots should be able to interact through various forms of communication, a field known as human-robot interaction (HRI). In HRI, robots are expected to be socially intelligent. Being socially intelligent means that robots can understand and respond to human social and emotional cues, allowing them to build natural interactions with humans in many fields, including manufacturing. To make the most of human-robot collaboration, robots should be able to recognize human emotions and mental states and adjust their behavior accordingly through the human sensing modalities (facial expressions, gestures, voice, EEG, etc.) that are widely used in affective computing. This paper introduces studies that apply affective computing to collaborative robots (cobots) for collaboration and interaction between humans and cobots.
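As a purely illustrative sketch (not taken from the surveyed studies), the Python snippet below shows how a cobot controller might map an estimated affective state, derived from any of the sensing modalities mentioned above, to a behavior adjustment such as reduced motion speed. All names (EmotionEstimate, CobotController, adjust_behavior), thresholds, and speed values are hypothetical assumptions introduced for illustration, not part of any specific cobot API.

    # Illustrative sketch only: a hypothetical mapping from an estimated human
    # affective state to a cobot behavior adjustment. Names and thresholds are
    # assumptions for illustration, not from any specific cobot framework.
    from dataclasses import dataclass

    @dataclass
    class EmotionEstimate:
        """Hypothetical output of an affective-computing pipeline
        (e.g., fused from facial expression, voice, or EEG features)."""
        stress: float   # 0.0 (calm) .. 1.0 (highly stressed)
        valence: float  # -1.0 (negative) .. 1.0 (positive)

    class CobotController:
        """Toy controller that scales motion speed with estimated worker stress."""
        def __init__(self, nominal_speed: float = 0.25):  # m/s, assumed nominal speed
            self.nominal_speed = nominal_speed

        def adjust_behavior(self, emotion: EmotionEstimate) -> float:
            # Assumed policy: slow down or pause as estimated stress rises,
            # so the robot's presence is perceived as less of a stressor.
            if emotion.stress > 0.8:
                return 0.0                       # pause and wait for the worker
            if emotion.stress > 0.5:
                return 0.5 * self.nominal_speed  # reduce speed to ease perceived pressure
            return self.nominal_speed

    if __name__ == "__main__":
        controller = CobotController()
        for estimate in (EmotionEstimate(stress=0.2, valence=0.6),
                         EmotionEstimate(stress=0.9, valence=-0.4)):
            print(estimate, "-> speed", controller.adjust_behavior(estimate))

In practice, the studies surveyed in the paper replace the stub EmotionEstimate with real emotion-recognition models over the listed modalities; the sketch only illustrates the closed loop from sensed affect to adjusted robot behavior.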