In robotics and human-robot interaction, the ability to express emotions and to respond accurately to human emotions is essential. An important aspect of this ability is controlling the robot's facial actuators in a way that resonates with human emotions. This study systematically presents the construction of a robot head whose design is grounded in human facial anthropometry. Facial landmarks are extracted for several purposes, such as determining the anthropometric metrics that guide the head design. A comprehensive design for the robotic head is proposed, covering the anthropometric measurements that form the theoretical basis of the design, the mechanical calculations and design, the control system, the electrical system, and the robot skin. The robotic head, with 24 degrees of freedom (DoF), aims to express the full range of human emotions; signals fed back from strain gauge sensors and cameras calibrate the response of the robot's prosthetic muscles. The strain gauge sensors are placed at locations with large deformations to verify the functioning of the prosthetic muscles. Two experiments were set up to evaluate the completeness of the expressed emotions, and they achieved accuracies of 93.88% and 91.85%, respectively.
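As a minimal sketch of the landmark-to-anthropometry step described above: given 2D facial landmark coordinates (which in practice would come from a detector such as dlib or MediaPipe), standard anthropometric measurements can be derived as Euclidean distances and ratios. The landmark names, coordinates, and the specific metrics below are illustrative assumptions, not the paper's actual pipeline.

```python
import math

# Hypothetical 2D facial landmarks in pixel coordinates; in practice these
# would be produced by a facial landmark detector (e.g. dlib, MediaPipe).
LANDMARKS = {
    "left_pupil": (120.0, 140.0),
    "right_pupil": (184.0, 140.0),
    "left_zygion": (90.0, 170.0),    # outermost point of left cheekbone
    "right_zygion": (214.0, 170.0),  # outermost point of right cheekbone
    "nasion": (152.0, 138.0),        # bridge of the nose
    "menton": (152.0, 260.0),        # lowest point of the chin
}

def dist(a, b):
    """Euclidean distance between two landmark points."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def anthropometric_metrics(lm):
    """Derive example anthropometric measurements for sizing a robot head."""
    ipd = dist(lm["left_pupil"], lm["right_pupil"])           # interpupillary distance
    face_width = dist(lm["left_zygion"], lm["right_zygion"])  # bizygomatic width
    face_height = dist(lm["nasion"], lm["menton"])            # nasion-to-menton height
    return {
        "interpupillary_distance": ipd,
        "bizygomatic_width": face_width,
        "facial_index": face_height / face_width,  # height-to-width ratio
    }

metrics = anthropometric_metrics(LANDMARKS)
print(metrics)
```

Ratios such as the facial index are useful because they transfer from image pixels to physical dimensions without requiring camera calibration, so they can constrain the proportions of the robot head directly.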