Abstract

International Humanitarian Law (IHL) sets out rules to prevent humans from inflicting excessive harm on humanity in times of war or armed conflict. However, a new class of weapons, known as autonomous weapons, raises serious concern today because such weapons can search for, detect, identify, select, track, and engage targets without human intervention. This study aims to clarify which weapons are regarded as “autonomous” today in order to determine whether present autonomous weapons comply with IHL principles. The study adopts a normative legal research method. The data used are secondary data consisting of primary legal materials, namely the 1949 Geneva Conventions and their Additional Protocols, supported by secondary legal materials obtained from articles and books. The data were collected through library research and analyzed using a qualitative-descriptive approach. The study finds that a weapon system that limits human control and intervention is not automatically classified as an autonomous weapon; classification depends on the level of human and AI engagement in the weapon. The use of autonomous weapons in armed conflict does not fully satisfy the principles of IHL. A fully autonomous weapon, in particular, will never satisfy the principles of distinction, proportionality, and humanity, or the prohibition of attacks against those hors de combat.
