Book contents
- Frontmatter
- Contents
- List of contributors
- Acknowledgements
- PART I Introduction
- PART II Meanings of autonomy and human cognition under automation
- PART III Autonomous weapons systems and human dignity
- 5 Are autonomous weapons systems a threat to human dignity?
- 6 On banning autonomous weapons systems: from deontological to wide consequentialist reasons
- PART IV Risk, transparency and legal compliance in the regulation of autonomous weapons systems
- PART V New frameworks for collective responsibility
- PART VI New frameworks for individual responsibility
- PART VII Conclusion
- Index
6 - On banning autonomous weapons systems: from deontological to wide consequentialist reasons
from PART III - Autonomous weapons systems and human dignity
Published online by Cambridge University Press: 05 August 2016
Summary
Introduction
This chapter examines the ethical reasons supporting a moratorium and, more stringently, a pre-emptive ban on autonomous weapons systems (AWS). Discussions of AWS presuppose a relatively clear idea of what it is that makes those systems autonomous. In this technological context, the relevant type of autonomy is task autonomy, as opposed to personal autonomy, which usually pervades ethical discourse. Accordingly, a weapons system is regarded here as autonomous if it is capable of carrying out the task of selecting and engaging military targets without any human intervention.
Since robotic and artificial intelligence technologies are crucially needed to achieve the required task autonomy in most battlefield scenarios, AWS are identified here with robotic systems of some sort. Ethical issues about AWS are therefore closely tied to technical and epistemological assessments of robotic technologies and systems, at least insofar as the operation of AWS must comply with the discrimination and proportionality requirements of international humanitarian law (IHL). A variety of environmental and internal control factors are advanced here as major impediments that prevent both present and foreseeable robotic technologies from meeting IHL discrimination and proportionality demands. These impediments provide overwhelming support for an AWS moratorium – that is, for a suspension of AWS development, production and deployment at least until the technology becomes sufficiently mature with respect to IHL. Discrimination and proportionality requirements, which are usually motivated on deontological grounds by appealing to the fundamental rights of the potential victims, also entail certain moral duties on the part of battlefield actors. Hence, a moratorium on AWS is additionally supported by a reflection on the proper exercise of these duties: military commanders ought to refuse AWS deployment until the risk of violating IHL is sufficiently low.
Public statements about AWS have often failed to take into account the technical and epistemological assessments of state-of-the-art robotics that support an AWS moratorium. Notably, some experts in military affairs have failed to convey in their public statements the crucial distinction between the expected short-term outcomes of research programmes on AWS and their more ambitious and distant goals. Ordinary citizens are therefore likely to mistake these public statements for well-founded expert opinions and, as a result, to develop unwarranted beliefs about the state of technological advancement and unrealistic expectations about IHL-compliant AWS.
- Type: Chapter
- Information: Autonomous Weapons Systems: Law, Ethics, Policy, pp. 122-142
- Publisher: Cambridge University Press
- Print publication year: 2016