Abstract

Human-robot interaction (HRI) is a relatively new field of study. To date, most of the effort in robotics has been spent developing hardware and software that expands the range of robot functionality and autonomy. In contrast, little effort has been spent so far to ensure that robotic displays and interaction controls are intuitive for humans. This study applied robotics, human-computer interaction (HCI), and computer-supported cooperative work (CSCW) expertise to gain experience with HCI/CSCW evaluation techniques in the robotics domain. As a case study for this article, we analyzed four different robot systems that competed in the 2002 American Association for Artificial Intelligence Robot Rescue Competition. These systems completed urban search and rescue tasks in a controlled environment with predetermined scoring rules that provided objective measures of success. This study analyzed pre-evaluation questionnaires; videotapes of the robots, interfaces, and operators; maps of the robots' paths through the competition arena; post-evaluation debriefings; and critical incidents (e.g., when the robots damaged the test arena). As a result, this study produced guidelines for designing HRI interfaces.
