- Research Article
- 10.1007/s12369-025-01336-0
- Feb 1, 2026
- International Journal of Social Robotics
- Marcel Finkel + 6 more
- Research Article
- 10.1007/s12369-026-01361-7
- Feb 1, 2026
- International Journal of Social Robotics
- Yi-Chuan Chen + 4 more
- Research Article
- 10.1007/s12369-025-01356-w
- Feb 1, 2026
- International Journal of Social Robotics
- Shikhar Kumar + 2 more
- Research Article
- 10.1007/s12369-026-01360-8
- Feb 1, 2026
- International Journal of Social Robotics
- Wenhao Wang + 3 more
- Research Article
- 10.1007/s12369-025-01337-z
- Jan 1, 2026
- International Journal of Social Robotics
- Oliver Jacobs + 2 more
- Research Article
- 10.1007/s12369-025-01340-4
- Jan 1, 2026
- International Journal of Social Robotics
- Laetitia Tanqueray + 2 more
- Research Article
- 10.1007/s12369-025-01355-x
- Jan 1, 2026
- International Journal of Social Robotics
- Wataru Sato + 3 more
- Research Article
- 10.1007/s12369-025-01335-1
- Jan 1, 2026
- International Journal of Social Robotics
- Ke Xu + 3 more
Abstract: Robots are now pervasive, leveraging their automation capabilities to assist humans across a diverse range of tasks. Nevertheless, end-users often have a limited understanding of a robot's operation and typically assume a passive role while the robot performs a task. In this study, we address the need for effective explainability in human-robot interaction by comparing different methods of explaining robotic scenario information to end-users. The proposed methods are built on a labelled property graph-based chatbot that adheres to the IEEE Robotics Ontology Standards. We designed two virtual robotic scenarios and simulated their information flow using the Robot Operating System (ROS). In a between-subjects experiment, participants engaged with the system through one of several interaction methods to understand the two scenarios: real-time Linux command-line interface output, querying a chatbot, exploring knowledge graphs, or a combination of the chatbot and knowledge graphs. The findings suggest that both the knowledge graphs and the chatbot significantly enhance the system's explainability compared to plain Linux terminal output. Moreover, combining knowledge graphs with the chatbot received better subjective evaluations on metrics such as clarity, usability, and robustness. This research contributes to the development of standardised labelled property graphs for representing scenario information in language-based human-robot interaction. The experimental design and evaluations also provide a way to assess the explainability of task-oriented dialogue systems both subjectively and objectively.
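The abstract's central data structure is a labelled property graph (LPG), in which both nodes and edges carry a label plus a free-form property map, so a chatbot can answer questions by traversing labelled relations. The sketch below is a minimal, generic illustration of that idea; the node/edge labels (`Robot`, `Task`, `performs`) and the `PropertyGraph` class are illustrative assumptions, not the paper's actual schema or the IEEE ontology vocabulary.

```python
from dataclasses import dataclass, field

# Minimal labelled property graph: nodes and edges each have a label
# plus arbitrary key-value properties. All names here are hypothetical.

@dataclass
class Node:
    node_id: str
    label: str
    properties: dict = field(default_factory=dict)

@dataclass
class Edge:
    source: str
    target: str
    label: str
    properties: dict = field(default_factory=dict)

class PropertyGraph:
    def __init__(self):
        self.nodes: dict[str, Node] = {}
        self.edges: list[Edge] = []

    def add_node(self, node_id, label, **props):
        self.nodes[node_id] = Node(node_id, label, props)

    def add_edge(self, source, target, label, **props):
        self.edges.append(Edge(source, target, label, props))

    def neighbours(self, node_id, edge_label=None):
        """Nodes reachable from node_id, optionally filtered by edge label."""
        return [self.nodes[e.target] for e in self.edges
                if e.source == node_id
                and (edge_label is None or e.label == edge_label)]

# Encode a toy scenario: a robot currently performing a pick-and-place task.
g = PropertyGraph()
g.add_node("robot1", "Robot", name="example-robot")
g.add_node("task1", "Task", description="pick and place")
g.add_edge("robot1", "task1", "performs", status="active")

# A chatbot answering "what is the robot doing?" could traverse the
# "performs" edges from the robot node:
tasks = g.neighbours("robot1", "performs")
print(tasks[0].properties["description"])  # prints "pick and place"
```

In a real system of the kind the abstract describes, such a graph would be populated from ROS topic/service information and queried through a graph database rather than an in-memory structure; this sketch only shows why labelled edges with properties make scenario questions answerable by simple traversal.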
- Research Article
- 10.1007/s12369-025-01343-1
- Jan 1, 2026
- International Journal of Social Robotics
- Terran Mott + 3 more
- Research Article
- 10.1007/s12369-026-01358-2
- Jan 1, 2026
- International Journal of Social Robotics
- Lesong Jia + 5 more