Abstract
Lethal autonomous weapons systems (LAWS) can operate, selecting and engaging targets independent of direct human control. While automated weapons that presage LAWS, such as the Aegis close-in weapon system (CIWS), have existed for decades, fully autonomous weapons are now entering into service. The employment of LAWS during hostilities must comply with the law of armed conflict (LOAC) and the law of naval warfare. Concern over the applicability of these rules has prompted a campaign by nongovernmental organizations to ban “killer robots.” While an outright ban is not going to occur, debate over the level of human control required for the lawful employment of LAWS is unfolding in discussions held under the UN Convention on Conventional Weapons through a Group of Governmental Experts (GGE). This debate has centered on whether LAWS should be subject to “meaningful human control” or “appropriate levels of human judgment.” Both terms are ambiguous, however, and the differences between them are operationally immaterial. Like all weapons, the use of LAWS is already subject to the legal doctrine of command responsibility and the military practice of the commander’s accountability for the methods and means employed during armed conflict.