Abstract
Imagine the day a robot could comfort you when you feel sad. In artificial intelligence and robotics, much research addresses the automatic classification of human emotion to enhance human-robot communication, especially for therapy. Emotion estimation is generally based on externally expressed information, such as facial expressions, gaze direction, and behavior, that a robot can observe through a camera or similar sensors. However, some information remains invisible: it cannot be expressed, or a person can deliberately suppress its expression. In such cases, emotion is difficult to estimate no matter how sophisticated the analysis technology is. The main idea of this research is to compare emotions classified from two different sources: controllable and uncontrollable expressions. Preliminary experiments suggest that classifying emotion from biological signals outperforms classification from facial expressions.
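The comparison described above can be illustrated with a minimal sketch: train the same classifier on two feature sources and compare test accuracy. The data, feature names, and classifier below are synthetic stand-ins chosen for illustration, not the authors' dataset or method; the "facial" channel is simulated as a noisier copy of the informative features, mimicking a controllable (maskable) expression.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

def evaluate_source(X, y, seed=0):
    """Train a simple classifier on one feature source, return test accuracy."""
    X_tr, X_te, y_tr, y_te = train_test_split(
        X, y, test_size=0.3, random_state=seed)
    clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
    return accuracy_score(y_te, clf.predict(X_te))

# Synthetic 3-class "emotion" dataset (hypothetical stand-in data).
X_bio, y = make_classification(n_samples=300, n_features=10, n_informative=8,
                               n_classes=3, n_clusters_per_class=1,
                               random_state=1)

# Simulated "facial expression" channel: same underlying signal, but heavily
# corrupted by noise to mimic suppressed or masked outward expression.
rng = np.random.default_rng(0)
X_face = X_bio + rng.normal(scale=3.0, size=X_bio.shape)

acc_bio = evaluate_source(X_bio, y)    # uncontrollable source
acc_face = evaluate_source(X_face, y)  # controllable source
print(f"biological signals: {acc_bio:.2f}, facial expression: {acc_face:.2f}")
```

Under this toy setup the noisier channel scores lower, which mirrors the paper's comparison in structure only; the actual study presumably uses real biosignal and facial-expression features.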