Abstract

One implication of treating "fully autonomous" weapons systems (AWS) as independent decision-makers in the targeting process is that a human-centered paradigm should never be taken for granted. Indeed, such systems could open a LOAC debate freed from that paradigm, all the more so because the underlying "principle of human dignity" has failed to offer convincing reasons for its propriety in international legal discourse. Furthermore, the history of LOAC shows that the existing human-centered approach to the proportionality test, the commander-centric approach, is nothing more than a product of its time, even though it has been strongly supported and developed by states and international criminal jurisprudence, particularly since the end of World War II. So long as fully AWS show the potential to contribute more than human soldiers to the LOAC goal of protecting the victims of armed conflict, one could therefore seek an alternative computer-centered approach to the law of targeting (a subset of LOAC) tailored to the defining characteristics of fully AWS, in a manner that maximizes their potential and makes the law more responsive to the needs of ever-changing battlespaces. With this in mind, this chapter aims to relativize the absoluteness of the existing human-centered approach to the proportionality test (which is not to deny the role of humans in the overall regulation of fully AWS) and then to propose an alternative approach dedicated to fully AWS for their better regulation in response to the demands of changing times.
