Abstract

In recent years, research on humanoid robots that can change users’ opinions has been conducted extensively. In particular, two robots have been found to improve their persuasiveness by cooperating with each other in a sophisticated manner. Previous studies evaluated changes in opinion when robots demonstrated consensus building; however, users did not participate in the conversations, and the optimal strategy may depend on their prior opinions. Therefore, in this study, we developed a system that adaptively changes conversations between robots based on the user’s opinion. We investigated the effect on opinion change when the robots’ discussion converges to the same position as the user’s and when it converges to a different position. We conducted two subject experiments in a crowdsourced setting in which a user conversed with virtual robotic agents via button-based responses. The results showed that users who were confident in their opinions became more confident when the robot agents’ opinions converged to the same position as theirs, and less confident when the agents’ opinions converged to a different position. These findings contribute significantly to persuasion research using multiple robots and to the development of advanced dialogue coordination between robots.
