Abstract

Colleagues often shake hands in greeting, friends connect through high fives, and children around the world rejoice in hand-clapping games. As robots become more common in everyday human life, they will have the opportunity to join in these social-physical interactions, but few current robots are intended to touch people in friendly ways. This article describes how we enabled a Baxter Research Robot to both teach and learn bimanual hand-clapping games with a human partner. Our system monitors the user's motions via a pair of inertial measurement units (IMUs) worn on the wrists. We recorded a labeled library of 10 common hand-clapping movements from 10 participants; this dataset was used to train an SVM classifier to automatically identify hand-clapping motions from previously unseen participants with a test-set classification accuracy of 97.0%. Baxter uses these sensors and this classifier to quickly identify the motions of its human gameplay partner, so that it can join in hand-clapping games. This system was evaluated by N = 24 naïve users in an experiment that involved learning sequences of eight motions from Baxter, teaching Baxter eight-motion game patterns, and completing a free interaction period. The motion classification accuracy in this less structured setting was 85.9%, primarily due to unexpected variations in motion timing. The quantitative task performance results and qualitative participant survey responses showed that learning games from Baxter was significantly easier than teaching games to Baxter, and that the teaching role caused users to consider more teamwork aspects of the gameplay. Over the course of the experiment, people felt more understood by Baxter and became more willing to follow the example of the robot. Users felt uniformly safe interacting with Baxter, and they expressed positive opinions of Baxter and reported having fun interacting with the robot. Taken together, the results indicate that this robot achieved credible social-physical interaction with humans and that its ability to both lead and follow systematically changed the human partner's experience.
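
The classification step summarized above can be illustrated with a minimal sketch in Python using scikit-learn. The SVM and the goal of testing on previously unseen participants come from the abstract; the file names, the standardization step, the RBF kernel and C value, and the leave-one-participant-out evaluation shown here are illustrative assumptions rather than the authors' exact configuration.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

# Hypothetical data files: feature vectors from the wrist-worn IMUs,
# labels for the 10 hand-clapping motion classes, and participant IDs.
X = np.load("imu_features.npy")
y = np.load("motion_labels.npy")
groups = np.load("participant_ids.npy")

# Kernel and C are illustrative; the paper's exact SVM settings are not given here.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))

# Holding out one participant per fold mirrors testing on previously unseen users.
scores = cross_val_score(clf, X, y, groups=groups, cv=LeaveOneGroupOut())
print(f"Mean held-out accuracy: {scores.mean():.3f}")
```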

Highlights

  • As robot use expands from independent operation in factories to cooperative responsibilities in environments like hospitals and schools, social skills become an increasingly important factor for robot developers to consider

  • As an initial foray into equipping robots with social-physical human-robot interaction skills, we chose to investigate human-robot hand-to-hand contact during playful hand-clapping games like “Pat-a-cake” and “Slide.” We prepared to run this study by connecting our past work on classifying human hand-clapping motions recorded via inertial sensors (Fitter and Kuchenbecker, 2016c) with our previously developed methods for making a robot clap hands in human-like ways (Fitter and Kuchenbecker, 2016b)

  • We extracted a feature set composed of basic statistical measures from each x, y, and z-axis channel of the accelerometer and gyroscope, the root-mean-square (RMS) acceleration for each hand, and high- and low-pass filtered data from each of these channels
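
A minimal sketch of this kind of feature extraction, assuming Python with NumPy/SciPy, is shown below. The channel layout, RMS acceleration, and use of high- and low-pass filtered versions follow the description in the highlight above; the specific statistics, filter order, cutoff frequency, and sampling rate are not given in this excerpt and are assumed here for illustration.

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 100.0    # assumed IMU sampling rate (Hz)
CUTOFF = 5.0  # assumed filter cutoff frequency (Hz)

def _stats(signal):
    """Basic statistical measures for one channel (an assumed set: mean, std, min, max)."""
    return [np.mean(signal), np.std(signal), np.min(signal), np.max(signal)]

def extract_features(accel, gyro):
    """Build a feature vector from one hand's IMU recording.

    accel, gyro : arrays of shape (n_samples, 3) holding the x, y, z channels.
    """
    b_lo, a_lo = butter(2, CUTOFF / (FS / 2), btype="low")
    b_hi, a_hi = butter(2, CUTOFF / (FS / 2), btype="high")

    channels = np.hstack([accel, gyro])              # six raw IMU channels
    rms = np.sqrt(np.mean(accel ** 2, axis=1))       # RMS acceleration per sample
    all_signals = np.column_stack([channels, rms])

    features = []
    for col in all_signals.T:
        features += _stats(col)                           # raw channel statistics
        features += _stats(filtfilt(b_lo, a_lo, col))     # low-pass filtered version
        features += _stats(filtfilt(b_hi, a_hi, col))     # high-pass filtered version
    return np.asarray(features)
```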

Summary

Introduction

As robot use expands from independent operation in factories to cooperative responsibilities in environments like hospitals and schools, social skills become an increasingly important factor for robot developers to consider. As an initial foray into equipping robots with social-physical human-robot interaction (spHRI) skills, we chose to investigate human-robot hand-to-hand contact during playful hand-clapping games like “Pat-a-cake” and “Slide.” We prepared to run this study by connecting our past work on classifying human hand-clapping motions recorded via inertial sensors (Fitter and Kuchenbecker, 2016c) with our previously developed methods for making a robot clap hands in human-like ways (Fitter and Kuchenbecker, 2016b). The result of this union is sensor-mediated human-robot interaction (HRI) in which each participant (the human and the robot) physically mimics the movements of the other at different times during the game.
