Abstract

The militarisation of Artificial Intelligence Diplomacy has resulted in the development of weapons that are more powerful than traditional weaponry, fail to distinguish between civilians and combatants, and cause unnecessary suffering. Superpowers and middle powers have invested heavily in digital technologies, producing digital weapons that violate international humanitarian law (IHL) and human rights standards and complicate the achievement of global peace. Armed drones and militarised robots inflict unnecessary pain and suffering on helpless civilians. Although these weapons have been deployed to combat terrorism, they have, surprisingly, not resolved the terrorism that troubles post-Cold War international relations. As a result, the use of armed drones causes more harm than is necessary to achieve the objectives of war. There is a call for international artificial intelligence (AI) governance, and a need to understand the serious threats that armed drones pose to IHL, to peace processes in international relations and to global cooperation. Scholars, policy-makers, human rights activists and peace practitioners should participate more actively in debates about the military application of AI diplomacy, in order to develop effective AI diplomacy rules and regulations. Doing so would mitigate the risks and threats that armed drones pose to IHL and international human rights standards, which are the foundations of the post-modern world.
