Abstract

It is a truism that, owing to human weaknesses, warfare waged by human soldiers has yet to be sufficiently ethical. Arguably, human soldiers are more likely to breach the Principle of Non-Combatant Immunity, for example, than smart soldiers that lack emotion. This paper therefore examines the possibility that integrating ethics into smart soldiers will help address moral challenges in modern warfare. The approach is to develop and deploy smart soldiers enhanced with ethical capabilities. Advocates of this approach hold that it is more realistic to make competent entities (i.e., smart soldiers) morally responsible than to enforce moral responsibility on human soldiers with inherent (moral) limitations. This view seeks a radical transition from the usual anthropocentric warfare to a robocentric warfare, in the belief that the transition carries moral advantages. The paper, however, defends the claim that, despite human limitations, the capacity of ethically enhanced smart soldiers for moral sensitivity is artificial and inauthentic. There are significant problems with the three models of programming ethics into smart soldiers, and further challenges arise from the absence of emotion as a moral gauge and from the difficulty of apportioning responsibility when mishaps result from the actions or omissions of smart soldiers. For these and other reasons, the paper takes the replacement of human soldiers to be an extreme approach to ethical warfare: it introduces ethical complications that outweigh the benefits of the exclusive use of smart soldiers.

Highlights

  • There is a paradigm shift from crude and medieval weapons to semi-autonomous weapons and, further, to autonomous weapons fully integrated with artificial intelligence

  • This paper focuses on the possibility that the integration of ethics into smart soldiers will help address moral challenges in modern warfare

  • Ethically enhanced smart soldiers (EeSS) as a replacement for human soldiers is an extreme solution to the problem of an unjust war

Introduction

In their historical assessment of weapons, DeVries and Smith (2007, p. vii) rightly comment that “[w]eapons have evolved over time to become both more lethal and more complex.” There is a paradigm shift from crude and medieval weapons to semi-autonomous weapons and, further, to autonomous weapons fully integrated with artificial intelligence. This paper defends the claim that, despite human limitations, the capacity of ethically enhanced smart soldiers (EeSS) for moral sensitivity is artificial and inauthentic. This poses limitations on the exclusive use of EeSS that I consider later on. In Section C, I argue that the inclusion of EeSS in modern warfare would complicate moral issues in war rather than produce more ethical warfare. In the concluding section of the paper, I suggest that while EeSS could further complicate moral issues in warfare, this does not diminish their relevance in achieving ethical warfare. It is better to employ EeSS and human soldiers collaboratively to achieve ethical warfare.

  • Human limitation and smart capabilities
  • Section B
  • EeSS and moral complications
  • Conclusion