Abstract

Background: The use of autonomous weapon systems (AWS) in armed conflict has expanded rapidly, and their development concerns legal scholars. If AWS were to operate without ‘meaningful human control’, violations of international law and human rights would be unpreventable. Methods: This paper argues that the most pressing problem arising from the use of AWS is the attribution of responsibility to corporate actors for such violations. It remains unclear who is legally responsible for these international crimes, which creates an accountability gap. The central difficulty for corporate responsibility across the process of employing AWS is determining who exercises causal control over the chain of acts leading to the crime’s commission. Taking a more optimistic view of artificial intelligence, the paper addresses two challenges for corporate responsibility. First, it maps the framework governing the use of AWS with respect to corporate actors. Second, it identifies the accountability problem and presents possible scenarios in the AWS context as a means of resolving it. Results and Conclusions: The analysis exposes ambiguity in international law and the absence of essential rules on the attribution of responsibility for AWS and the punishment of perpetrators; international law in this area needs to be improved and further regulated.
