Autonomous Weapons Systems, the Frame Problem and Computer Security

Journal of Military Ethics 14 (2):162-176 (2015)
Abstract
Unlike human soldiers, autonomous weapons systems (AWS) are unaffected by psychological factors that would cause them to act outside the chain of command. This is a compelling moral justification for their development and eventual deployment in war. To achieve this level of sophistication, the software that runs AWS will have to first solve two problems: the frame problem and the representation problem. Solutions to these problems will inevitably involve complex software. Complex software will create security risks and will make AWS critically vulnerable to hacking. I claim that the political and tactical consequences of hacked AWS far outweigh the purported advantages of AWS not being affected by psychological factors and always following orders. Therefore, one of the moral justifications for the deployment of AWS is undermined.
PhilPapers/Archive ID
KLIAWS
Upload history
Archival date: 2018-08-15
Added to PP index
2015-08-26