Abstract

This paper presents the findings of a study that used applied ethics to evaluate autonomous robotic systems in practice. Using a theoretical tool developed in 2017 by a team of researchers that included one of the authors, we conducted a study of four existing autonomous robotic systems in July 2020. We illustrate the methods used to carry out the study, and its results, through the specific example of ANYmal, an autonomous robotic system that is one component of the CERBERUS team that won first place in DARPA’s Subterranean Challenge Systems Competition in September 2021.

Highlights

  • The four systems we evaluated were as follows: (1) a four-legged system designed to compete in the US Defense Advanced Research Projects Agency (DARPA) Subterranean Challenge as part of a multi-modal autonomous search operation, named ANYmal; (2) a fixed-wing unmanned aerial vehicle (UAV) designed to autonomously target and intercept other UAVs using a kinetic “net-gun,” named Mobula; (3) a 13-metric ton excavator designed to autonomously dig trenches and construct walls, named Armano; and (4) a quadcopter designed to autonomously fly through enclosed structures and search for radioactive material, named RAD Copter

  • We found that applying our tool to specific autonomous systems in dialogue with roboticists served to discover ethical hotspots and raise awareness on the part of roboticists

  • By discussing our research to develop a practical tool for analyzing the ethical issues presented by autonomous robotic systems, we hope to contribute to a conversation that goes beyond definitions and principles

Summary

Introduction

In June 2021, the Panel of Experts on Libya published the first officially recognized instance of a robotic system using artificial intelligence (AI) to target humans autonomously [1]. The development of such technology, known as autonomous weapons systems (AWSs), presents a range of problems related to fundamental notions on which international humanitarian law (IHL) is built [2], including respect for human life [3]. This paper aims to explain how ethical assessments of autonomous robotic systems are best conducted in practice. This section does not comprehensively present or discuss all the data we gathered, but rather some flashpoints and discoveries we made in the ethical assessments. It illustrates them using a quadrupedal autonomous robot called ANYmal, which we assessed. To reach this goal, we offer a tool, tested in practice, for systematically identifying ethically and legally high-risk properties of autonomous robotic systems. The act of discussing these risks may be one of the most critical steps in creating technology for the benefit of humanity.

Background
Law Applicable to Autonomous Security Systems
Related Ethical Assessment Tools
Our Approach to Evaluating Specific Aspects of Existing Systems
Assessment in Practice
Experience and Lessons Learned
Conclusions