Abstract

Emotion integrates several aspects of a person, including mood (the current emotional state), personality, voice or speech, color around the eyes, and the movement of facial features. We consider mood because a person's current emotional state always influences upcoming emotions. All of these parameters lie behind an emotion, and a human being can easily recognize it from a face even when more than one person is present; for a robot to produce human-like emotion, all of these parameters must therefore be considered when imitating an artificial facial expression for that emotion. Most researchers working in this area still find it difficult for a robot to determine the exact emotion, because facial information is not always available, especially when interacting with a group of people, and because the mimicked emotion must be one the user can effectively recognize. In our study, the loudest speech among the people sensed by the robot and the color around the eyes are used to cope with these issues. Another issue is the rise time and fall time of emotional intensity; in other words, how long should the robot hold an emotion? An experimental approach is applied to obtain these values. The proposed method uses an emotional speech database to recognize human emotion with a convolutional neural network (CNN) and RGB patterns to mimic the emotion, yielding an improved humanoid robot that can express emotion like a human being and give real-time responses to a user or group of users, enabling more effective Human-Robot Interaction (HRI).
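The RGB-pattern and rise/fall-time ideas in the abstract can be illustrated with a minimal sketch. The color assignments, envelope shape, and timing values below are assumptions for illustration only, not the paper's measured parameters: each recognized emotion maps to an eye color, and a piecewise-linear envelope scales its intensity through a rise, hold, and fall phase.

```python
# Hypothetical emotion-to-RGB assignments (not the paper's actual values).
EMOTION_RGB = {
    "happy":   (255, 223, 0),
    "sad":     (0, 90, 200),
    "angry":   (220, 30, 30),
    "neutral": (200, 200, 200),
}

def intensity(t, rise=0.5, hold=2.0, fall=1.0):
    """Piecewise-linear envelope: ramp up over `rise` seconds,
    hold at full intensity for `hold` seconds, decay over `fall` seconds.
    The durations are illustrative; the paper derives them experimentally."""
    if t < 0:
        return 0.0
    if t < rise:
        return t / rise                      # rising edge
    if t < rise + hold:
        return 1.0                           # plateau
    if t < rise + hold + fall:
        return 1.0 - (t - rise - hold) / fall  # falling edge
    return 0.0

def eye_color(emotion, t):
    """RGB value the robot's eye LEDs would show `t` seconds
    after the emotion onset."""
    r, g, b = EMOTION_RGB.get(emotion, EMOTION_RGB["neutral"])
    k = intensity(t)
    return (round(r * k), round(g * k), round(b * k))
```

For example, `eye_color("happy", 1.0)` falls on the plateau and returns the full color, while a query well past the fall time returns black (LEDs off).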

Highlights

  • The human face is special in many respects; one of them is expressing emotion

  • When we talk about Human-Robot Interaction (HRI), it becomes challenging for a humanoid robot to determine the exact emotion expressed by the person interacting with it at a particular moment, especially when interacting with a group of people and when the emotion is not evident from the face, because, according to psychologists, humans can produce numerous types of expression

  • HRI is central to interacting with a humanoid robot; studies of human-robot interaction will be improved by automated emotion interpretation


INTRODUCTION

The human face is very special in different aspects; one of those aspects is expressing emotion. When we talk about HRI, it becomes challenging for a humanoid robot to determine the exact emotion expressed by the person interacting with it at a particular moment, especially when interacting with a group of people and when the emotion is not evident from the face, because, according to psychologists, humans can produce numerous types of expression. The robot should recognize the user's emotional state and respond to it: if the user is happy, the robot should behave as if it were happy, which would improve the interaction between a human and a machine. The problem arises when the robot communicates with a user whose expression is not clear on their face, especially when several people are involved. The rest of the paper is organized as follows: the related work section follows this section, the complete methodology is then presented, results and discussion follow the methodology, and the paper closes with conclusions and future work.
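The group-interaction problem above motivates the paper's "loudest speech" cue: when several people speak, the robot attends to the most energetic voice. A minimal sketch of that selection step, assuming one audio channel per tracked speaker (the function names and the RMS-energy criterion are illustrative assumptions, not the paper's exact pipeline):

```python
import math

def rms(samples):
    """Root-mean-square energy of one audio channel
    (samples as floats, e.g. normalized to [-1, 1])."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def loudest_channel(channels):
    """Index of the channel with the highest RMS energy;
    this speaker's audio would then be passed to emotion recognition."""
    return max(range(len(channels)), key=lambda i: rms(channels[i]))
```

For instance, with three speakers whose channels have energies 0.1, 0.5, and 0.2, `loudest_channel` picks the second speaker, whose speech would then be fed to the CNN classifier.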
