Abstract
Haru4Kids (H4K) is a system that emulates Haru, a physical, social, family-oriented robot designed to cohabit with children in their homes for extended periods of time. In a previous experiment [Garcia GA, Perez G, Levinson L, et al. Living with Haru4Kids: study on children's activity and engagement in a family-robot cohabitation scenario. In: 2023 IEEE RO-MAN; 2023 Aug. p. 1428–1435], seven families kept H4K in their homes for two weeks. Throughout this period of cohabitation, we collected child-robot interaction data, including images that were later hand-annotated to estimate user engagement. In the present work, we used a novel AI-based, four-stage framework available from Roboflow to automatically estimate children's level of engagement from their inferred emotions. We studied in depth the performance and behaviour of that framework on our dataset of user pictures and characterized its response in order to understand its advantages and limitations, including the technique used to translate emotions into engagement levels. We also tested a different approach for that mapping, using a machine learning technique based on a Support Vector Machine (SVM). The framework yielded promising results off the shelf: 0.47–0.68 accuracy and 0.46–0.70 F1 using the original mapping, and 0.39–0.75 and 0.37–0.78, respectively, using the SVM. We therefore propose this emotion-based approach for estimating engagement from pictures.
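As a purely illustrative sketch of the SVM-based emotion-to-engagement mapping described above (not the authors' implementation), one could train a classifier on per-image emotion probability vectors against hand-annotated engagement levels. All names, feature dimensions, and data below are hypothetical placeholders.

```python
# Hypothetical sketch: map inferred emotion probabilities per image to
# annotated engagement levels with an SVM classifier (scikit-learn).
import numpy as np
from sklearn.metrics import accuracy_score, f1_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Placeholder data: one row per image, columns are inferred emotion
# probabilities (e.g. happy, sad, angry, surprised, neutral); labels are
# hand-annotated engagement levels (e.g. 0 = low, 1 = medium, 2 = high).
rng = np.random.default_rng(0)
X = rng.random((500, 5))
y = rng.integers(0, 3, size=500)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0, stratify=y
)

# RBF-kernel SVM; standardizing features first is common practice.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X_train, y_train)

pred = clf.predict(X_test)
print("accuracy:", accuracy_score(y_test, pred))
print("macro F1:", f1_score(y_test, pred, average="macro"))
```

Reporting both accuracy and (macro-averaged) F1, as in the abstract, guards against misleadingly high scores when engagement classes are imbalanced.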