Abstract

The legality of autonomous weapon systems (AWS) under international law is an issue of swiftly growing importance as technology advances and machines acquire the capacity to operate without human control. This paper argues that the existing laws are ineffective and that a different set of laws is needed. It examines several issues that are critical to the development and use of AWS in warfare, arguing that a preemptive ban on AWS is irrelevant at this point and urging the appropriate authorities to develop a modern legal framework tailored to embrace these state-of-the-art weapons as the Law of Armed Conflict (LOAC) develops. First, this paper explores the myriad laws designed to govern the potential future development and deployment of artificial intelligence and AWS in the context of International Humanitarian Law, or LOAC. Second, the paper argues that it will be challenging for AWS to fulfill the requirements laid out by the International Committee of the Red Cross and LOAC for the rules of humanity, military necessity, distinction, proportionality, and precaution, especially as they relate to noncombatants. Third, the paper discusses command responsibility and argues that states should establish accountability for wrongful acts committed by AWS. Finally, this paper contends that there is an urgent need for a new legal framework to regulate AWS and presents several possible forms such a framework could take.
