Abstract

Background: The use of autonomous weapon systems (AWS) in armed conflict is expanding rapidly, and their development concerns legal scholars. If AWS were to operate without ‘meaningful human control’, violations of international law and human rights would be difficult to prevent. Methods: This paper argues that the most significant problem arising from the use of AWS is the attribution of responsibility to corporate actors for such violations. It remains unclear who is legally responsible for the resulting international crimes, which creates an accountability gap. The central difficulty for corporate responsibility across the process of employing AWS is determining who exercises causal control over the chain of acts leading to the commission of the crime. Taking a more optimistic view of artificial intelligence, the paper addresses two challenges for corporate responsibility. First, it maps the framework governing the use of AWS as it relates to corporate actors. Second, it identifies the accountability problem and examines possible scenarios in the AWS context that could help resolve it. Results and Conclusions: The analysis exposes ambiguity in international law and the absence of essential rules on the attribution of responsibility for AWS and the punishment of perpetrators; international law in this area needs to be clarified and further regulated.
