Abstract

Robots are poised to interact with humans in unstructured environments. Despite increasingly robust control algorithms, failure modes arise whenever the underlying dynamics are poorly modeled, especially in unstructured environments. We contribute a set of recovery policies to deal with anomalies produced by external disturbances. The recoveries work when various types of anomalies are triggered any number of times at any point in the task, including during already running recoveries. Our recovery critic stands atop a tightly integrated, graph-based online motion-generation and introspection system. Policies, skills, and introspection models are learned incrementally and contextually over time. Recoveries are studied via a collaborative kitting task in which the system experiences a wide range of anomalous conditions. We also contribute an extensive analysis of the performance of the tightly integrated anomaly identification, classification, and recovery system under extreme anomalous conditions. We show how the integration of such a system achieves performance greater than the sum of its parts.
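
A minimal, hypothetical sketch of the behavior described above: the names and control flow here are illustrative assumptions (not the authors' API), showing how a recovery critic could preempt the active skill whenever the introspection layer flags an anomaly, including while an earlier recovery is still running.

```python
# Hypothetical sketch only: a task is a sequence of skill nodes; anomalies push
# matching recovery skills onto a stack, so recoveries can themselves be
# preempted by further recoveries any number of times.
from dataclasses import dataclass
from typing import Callable, Dict, List, Optional

@dataclass
class Node:
    """One skill (or recovery skill) in the task graph."""
    name: str
    execute_step: Callable[[], bool]   # advances the skill; returns True when done

def run_task(task: List[Node],
             detect_anomaly: Callable[[], Optional[str]],
             recoveries: Dict[str, Node]) -> None:
    """Run skills in order; on an anomaly, push the matching recovery and keep
    monitoring, so a running recovery can itself be recovered from."""
    stack: List[Node] = list(reversed(task))   # pending skills; top of stack is active
    while stack:
        anomaly = detect_anomaly()
        if anomaly is not None and anomaly in recoveries:
            stack.append(recoveries[anomaly])  # preempt current skill or recovery
            continue
        if stack[-1].execute_step():           # advance the active skill by one step
            stack.pop()                        # finished: resume what was preempted
```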

Highlights

  • As robots increasingly operate in unstructured environments and share workspaces with humans, the likelihood of anomalies and failures grows

  • The current framework has broad applicability to manipulation domains that suffer from uncertainties in unstructured environments, making industrial and service robots prime candidates for this technology

  • As recovery is inextricably tied to the performance of the anomaly identification and classification systems, a comprehensive evaluation of these subsystems is imperative

Summary

Introduction

As robots increasingly operate in unstructured environments and share workspaces with humans, the possibility of anomalies and failures grows. Numerous sources of failure and execution anomaly arise from the complex dynamics found in robots, limited modeling ability, and a robot's interactions with the world. We define anomalies as executions whose sensor signatures deviate from a robot's learned expected models. Sources of anomaly include internal errors that can result from improper modeling of visual, kinematic, or dynamic models. Anomalous conditions are hard to model, as similar anomalies can occur with wide variability, making them challenging for robots to recognize. Recovery system performance depends significantly on the ability to properly identify and understand the nature of the anomaly.
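
As a concrete illustration of this definition, the sketch below flags a window of sensor signatures as anomalous when it deviates too far from a learned expected model. The per-dimension mean/standard-deviation model and the z-score threshold are assumptions made for illustration; the paper's introspection models are learned incrementally and may be more expressive.

```python
# Illustrative sketch of "anomaly = deviation from a learned expected model".
import numpy as np

def fit_expected_model(nominal_windows: np.ndarray):
    """nominal_windows: (num_windows, num_sensor_dims) features from successful runs."""
    return nominal_windows.mean(axis=0), nominal_windows.std(axis=0) + 1e-8

def is_anomalous(window: np.ndarray, model, z_threshold: float = 3.0) -> bool:
    """Flag the window if any sensor dimension exceeds the z-score threshold."""
    mean, std = model
    return bool(np.max(np.abs((window - mean) / std)) > z_threshold)

# Usage: fit on nominal sensor features (e.g., force/torque), then monitor online.
nominal = np.random.randn(200, 6) * 0.5                  # stand-in for nominal features
model = fit_expected_model(nominal)
print(is_anomalous(np.array([0.1, 0.2, 0.0, 0.1, -0.2, 0.3]), model))  # nominal-like
print(is_anomalous(np.array([5.0, 0.2, 0.0, 0.1, -0.2, 0.3]), model))  # large deviation
```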
