Abstract

In this study, we analyzed the facial expressions of participants in a study in which 63 students learned about Climate Change using Betty’s Brain, half with robotic agents and half with on-screen agents. Upon a review of existing offline open-source Facial Expression Recognition (FER) systems, we chose HyperExtended LightFace (better known as deepface) to extract emotions from the participants’ facial expressions. Based on the emotions extracted by the FER system, we compared the two groups to investigate whether there are differences in displayed emotions due to human-robot interaction. This first analysis shows that learners who interacted with robotic agents expressed fear, happiness, and neutral emotions more than students who interacted with on-screen agents: neutral emotion was detected more in the robotic condition with a large effect size, fear with a moderate effect size, and happiness with a small effect size. Fear and happiness were 60% correlated, which indicates that the FER system may not distinguish well between these two emotions. Future work entails filtering and validating face detection results to handle false detections, applying different pre-processing methods for face detection, and using different FER algorithms.
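For illustration, the sketch below shows how per-frame emotion extraction with deepface might look. The helper name and frame path are hypothetical, and the settings shown are assumptions rather than the paper's exact configuration.

# Minimal sketch of per-frame emotion extraction with deepface.
# extract_emotions and frame_path are illustrative, not from the paper.
from deepface import DeepFace

def extract_emotions(frame_path):
    # analyze() returns a list of result dicts, one per detected face;
    # enforce_detection=False keeps the pipeline running on frames
    # where no face is found (assumed here to tolerate false detections).
    results = DeepFace.analyze(
        img_path=frame_path,
        actions=["emotion"],
        enforce_detection=False,
    )
    # Each result carries per-emotion scores (angry, disgust, fear,
    # happy, sad, surprise, neutral) and the dominant label.
    return [(r["dominant_emotion"], r["emotion"]) for r in results]

Per-frame dominant-emotion labels produced this way could then be aggregated per participant and compared across the two agent conditions.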

