Abstract

Automated warfare, including the aerial drones extensively used in ongoing armed conflicts, is now an established part of military technology worldwide. It is only logical to assume that the next step in the progression of military technology will be to strengthen the autonomy of weapon systems to the point that the human is removed from the loop entirely. Ongoing discussions at various multilateral fora on the definition of ‘Lethal Autonomous Weapons Systems’ (LAWS) focus mainly on a pre-emptive ban on LAWS on one hand and the justification of the dual use of the technology on the other. Those who advocate for a pre-emptive ban argue that such weapon systems are unable, or at least highly unlikely, to comply with the fundamental principles of proportionality, distinction and precaution under International Humanitarian Law (IHL). The accountability and criminal responsibility of individuals and States for serious violations of IHL and International Human Rights Law (IHRL) in the use of LAWS is also a matter of fundamental importance. Considering the dual use of robot technology and the degree of autonomy already present in current warfare, it is essential to engage in meaningful deliberations to arrive at common understandings on the compliance of LAWS with the principles of IHL and IHRL as a prerequisite ahead of their deployment in future conflicts.
