Abstract

When studying the use of assistive robots in home environments, and especially how such robots can be personalised to meet the needs of the resident, key concerns are behaviour verification, behaviour interference and safety. Here, personalisation refers to the teaching of new robot behaviours by both technical and non-technical end users. In this article, we consider the issue of behaviour interference, which arises when newly taught robot behaviours affect, or are affected by, existing behaviours such that some behaviours might never be executed. We focus in particular on how such situations can be detected and presented to the user. We describe the human–robot behaviour teaching system that we developed as well as the formal behaviour checking methods used. The online use of behaviour checking, based on static analysis of behaviours during the operation of the robot, is demonstrated and evaluated in a user study. We conducted a proof-of-concept human–robot interaction study with an autonomous, multi-purpose robot operating within a smart home environment. Twenty participants individually taught the robot behaviours according to instructions they were given, some of which caused interference with other behaviours. A mechanism for detecting behaviour interference provided feedback to participants and suggested how to resolve the conflicts. We assessed the participants’ views on the interference reported by the behaviour teaching system. The results indicate that the interference warnings given to participants during teaching fostered an understanding of the issue, and we did not find a significant influence of participants’ technical background. These results highlight a promising path towards verification and validation of assistive home companion robots that allow end-user personalisation.
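To make the notion of interference concrete, the sketch below shows one simple, static way such conflicts could be detected for prioritised condition–action behaviours: a newly taught behaviour interferes if some higher-priority behaviour is enabled in every situation where the new one is enabled, so the new behaviour can never be selected for execution. The Behaviour class, the sensor propositions and the priority scheme are illustrative assumptions only, not the representation or checking method actually used by the system described in the article.

```python
# Illustrative sketch only (hypothetical representation, not the paper's implementation):
# behaviours are guarded actions over boolean smart-home propositions, ordered by priority.
from itertools import product
from typing import Dict, List


class Behaviour:
    def __init__(self, name: str, guard: Dict[str, bool], priority: int):
        self.name = name          # e.g. "remind_medication"
        self.guard = guard        # e.g. {"doorbell": True, "evening": True}
        self.priority = priority  # higher value wins when several behaviours are enabled

    def enabled(self, state: Dict[str, bool]) -> bool:
        return all(state.get(var) == val for var, val in self.guard.items())


def shadowed_by(new: Behaviour, existing: List[Behaviour], variables: List[str]) -> List[str]:
    """Return the higher-priority behaviours that pre-empt `new` in every state
    where `new` is enabled; an empty list means `new` can run in some state."""
    blockers = set()
    for values in product([False, True], repeat=len(variables)):
        state = dict(zip(variables, values))
        if not new.enabled(state):
            continue
        winners = [b.name for b in existing
                   if b.priority > new.priority and b.enabled(state)]
        if not winners:
            return []                 # found a state in which `new` would be selected
        blockers.update(winners)
    return sorted(blockers)           # `new` is never selected: report the conflict


# A newly taught reminder that is always pre-empted by an existing doorbell behaviour:
existing = [Behaviour("answer_doorbell", {"doorbell": True}, priority=10)]
new = Behaviour("remind_medication", {"doorbell": True, "evening": True}, priority=5)
print(shadowed_by(new, existing, ["doorbell", "evening"]))   # -> ['answer_doorbell']
```

A warning of this kind, listing the blocking behaviours, is roughly the sort of feedback the study participants received, together with suggestions on how to resolve the conflict.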

Highlights

  • A long-term goal of robotics research is the use of assistive robots in the home

  • Software verification of a home companion robot in a “smart-home” environment, where the robot is able to extend its capabilities via access to the home sensor network, has been investigated in a number of large-scale projects, e.g. refs. [6,7], motivated by the use of robotic solutions to address the cost and care concerns arising from an ageing population [8,9]

  • We explored the use of model checkers for the formal verification of the robot behaviours within the Robot House [17,18,19,20]
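One way such a check can be phrased for a model checker (our reading; the exact formalisation used in refs. [17,18,19,20] may differ): encode the combined behaviour set and the home sensors as a finite-state model and, for each behaviour b, verify the reachability property EF exec_b, i.e. “some reachable state executes b”. If the checker instead establishes AG ¬exec_b, behaviour b can never run in the combined system, which is exactly the kind of interference the teaching system should report to the user.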

Introduction

A long-term goal of robotics research is the use of assistive robots in the home. Such robots have started to appear in various guises, ranging from stationary helpers [1] to cleaning robots [2] to robotic companions [3,4]. Examples include studies focusing on therapeutic and educational outcomes for children with autism, e.g. a one-month study involving daily human–robot interactions in children’s homes [21], or the year-long deployment of a therapeutic robot used by staff in a special-needs nursery school [22]. Results from such “field studies” highlight the importance of usability, and a number of challenges have been identified that influence whether people are willing to keep using such a system.

Setting and background
Formal verification and behaviour interference checking
Behaviour interference
The TEACHME system and behaviour interference notification
Teaching system – TEACHME
Behaviour interference detection and reporting
User evaluation
Research questions
Proof-of-concept experiment: methodology
Scenario: new technician
Interfering behaviours
Order of interference detection
SUS responses
Results and discussion
Question 12 of usability questionnaire
Question 11 of usability questionnaire
Gender
Prior interaction with robots and programming experience