Abstract

This article discusses an important limitation on the degree of autonomy that may permissibly be afforded to autonomous weapon systems (AWS) in the context of an armed conflict: the extent to which international humanitarian law (IHL) requires that human beings be able to intervene directly in the operation of weapon systems in the course of an attack. As there is currently no conventional or customary law directed specifically at AWS, limits on the use of autonomous capabilities in weapons, if any exist, must be inferred from the principles, rules and goals of general IHL. The process adopted herein is to look for two broad types of limitations: those which take the form of maximum permissible degrees of machine involvement in regulated activities, and those which take the form of minimum permissible degrees of human involvement. The article's main finding is that while existing law does not impose limits of the first type, it does impose some of the second type. Specifically, legal obligations borne by individuals (commanders in charge of AWS operations, weapon system operators and others) determine the required minimum capacity for direct human intervention. The article further suggests means by which the required degree of human intervention may be determined in specific circumstances.
