Autonomous Weapons Systems, the Frame Problem and Computer Security

Journal of Military Ethics 14 (2):162-176 (2015)

Abstract

Unlike human soldiers, autonomous weapons systems (AWS) are unaffected by psychological factors that would cause them to act outside the chain of command. This is a compelling moral justification for their development and eventual deployment in war. To achieve this level of sophistication, the software that runs AWS will have to first solve two problems: the frame problem and the representation problem. Solutions to these problems will inevitably involve complex software. Complex software will create security risks and will make AWS critically vulnerable to hacking. I claim that the political and tactical consequences of hacked AWS far outweigh the purported advantages of AWS being unaffected by psychological factors and always following orders. Therefore, one of the moral justifications for the deployment of AWS is undermined.

Author's Profile

Michał Klincewicz
Tilburg University
