Abstract
This Article examines the rise of autonomous weapons systems (AWS), their transformative effect on warfare, and the novel and difficult challenges that autonomy will pose to current interpretations and applications of international humanitarian law (IHL), the legal framework applicable in armed conflict. Autonomous weapons will profoundly affect humans’ role in warfare. AWS will be able to make certain decisions that have traditionally been made exclusively by humans. They will also affect how humans exercise judgment over uses of force as commanders are further removed, temporally and geographically, from the point of kinetic action. At the same time, human judgment cannot be excised from warfare and must remain the focus of IHL. This Article analyzes that tension and the critical need to reconcile IHL’s focus on human decision-making with autonomous technology. It provides the first comprehensive analysis of how fundamental principles of IHL can and should be applied in light of rapid advances in warfare and the humanitarian risks they carry. In particular, it critically assesses how IHL rules regarding weapons development, targeting, and accountability can be applied to the use of AWS in armed conflict, and the interpretive challenges we confront in doing so.