Abstract

Automated vehicles (AVs) are expected to operate on public roads together with non-automated vehicles and other road users such as pedestrians and cyclists. Recent ethical reports and guidelines raise concerns that AVs will introduce injustice or reinforce existing social inequalities in road traffic. One major injustice concern in today’s traffic is that different types of road users are exposed to very different risks of bodily harm. In the first part of the paper, we discuss the responsibility of AV developers to address existing injustice concerns regarding risk exposure, as well as approaches to fulfilling the responsibility for a fairer distribution of risk. In contrast to popular approaches to the ethics of risk distribution in unavoidable accident cases, we focus on situations of low and moderate risk, referred to as routine driving. For routine driving, the obligation to distribute risks fairly must be discussed in the context of risk-taking and risk-acceptance, balancing the safety objectives of occupants and other road users against driving utility. In the second part of the paper, we present a typical architecture for decentralized automated driving that contains a dedicated module for real-time risk estimation and management. We examine how risk estimation modules can be adjusted and parameterized to redress some of these inequalities.

Highlights

  • Automated vehicles (AVs) are an emerging technology that has raised intense ethical questions, both in academia and among the general public

  • In contrast to many previous approaches, which examine risk distribution and its ethical justification in accident scenarios (Di Fabio, Broy, & Brüngger, 2017; Goodall, 2017; Schäffner, 2020), we focus on how AVs ought to decide in situations of low and moderate risk, considered routine or general driving

  • A dedicated risk estimation module, together with a behavior planning module, allows explicit choices to be made about how the AV occupants and other road users are exposed to risk in each driving scene (a hypothetical illustration follows this list)
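
The paper itself does not include code; purely as an illustration, the Python sketch below shows one hypothetical way a behavior planning module could combine per-road-user outputs of a risk estimation module with class-specific fairness weights and a driving-utility term. All names, numbers, weights, and the cost formulation are assumptions made for this sketch, not the architecture or parameterization presented in the paper.

```python
from dataclasses import dataclass

# Hypothetical fairness weights: raising a class's weight makes the planner
# more averse to imposing risk on that class (e.g., pedestrians, cyclists).
FAIRNESS_WEIGHTS = {
    "occupant": 1.0,
    "car": 1.0,
    "cyclist": 2.0,
    "pedestrian": 3.0,
}


@dataclass
class RoadUserRisk:
    user_class: str       # e.g., "pedestrian"
    p_collision: float    # estimated collision probability over the planning horizon
    expected_harm: float  # expected severity if a collision occurs (0..1)


@dataclass
class Maneuver:
    name: str
    utility: float                # progress/comfort benefit of the maneuver
    risks: list[RoadUserRisk]     # hypothetical output of the risk estimation module


def weighted_risk(maneuver: Maneuver, weights=FAIRNESS_WEIGHTS) -> float:
    """Aggregate risk across road users, scaled by class-specific fairness weights."""
    return sum(
        weights.get(r.user_class, 1.0) * r.p_collision * r.expected_harm
        for r in maneuver.risks
    )


def select_maneuver(candidates: list[Maneuver], utility_weight: float = 0.01) -> Maneuver:
    """Choose the maneuver with the lowest weighted risk minus a small utility bonus."""
    return min(candidates, key=lambda m: weighted_risk(m) - utility_weight * m.utility)


if __name__ == "__main__":
    overtake = Maneuver(
        name="overtake cyclist now",
        utility=1.0,
        risks=[RoadUserRisk("cyclist", 0.020, 0.8), RoadUserRisk("occupant", 0.005, 0.3)],
    )
    wait = Maneuver(
        name="slow down and wait",
        utility=0.3,
        risks=[RoadUserRisk("cyclist", 0.002, 0.8), RoadUserRisk("occupant", 0.001, 0.1)],
    )
    # With the weights above, the lower-risk "slow down and wait" maneuver is selected.
    print(select_maneuver([overtake, wait]).name)
```

In this sketch, raising the weight assigned to a vulnerable road-user class makes the planner more reluctant to impose risk on that class, which is one way such adjustable parameters could be used to redress unequal risk exposure while still accounting for driving utility.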


Summary

Introduction

Automated vehicles (AVs) are an emerging technology that has raised intense ethical questions, both in academia and among the general public. In line with the recent Ethics of Connected and Automated Vehicles report (Horizon 2020 Commission Expert Group, 2020), addressing such concerns is the responsibility of developers and deployers. We elaborate on the developer/deployer perspective and first discuss their responsibility in addressing risk exposure injustice by considering risk ethics and institutional justice. For routine driving, the obligation to distribute risks ethically must be viewed in the context of risk-taking and risk-acceptance, balancing the safety objectives of occupants and other road users against driving utility. This will be covered in the first part of the paper. Since AVs will likely enter the roads as one entity of a fleet of vehicles of the same

We will refer to responsibility in its forward-looking understanding.