Autonomous Weapons Systems, the Frame Problem and Computer Security

Journal of Military Ethics 14 (2):162-176 (2015)
Unlike human soldiers, autonomous weapons systems (AWS) are unaffected by psychological factors that would cause them to act outside the chain of command. This is a compelling moral justification for their development and eventual deployment in war. To achieve this level of sophistication, the software that runs AWS will first have to solve two problems: the frame problem and the representation problem. Solutions to these problems will inevitably involve complex software, which will create security risks and make AWS critically vulnerable to hacking. I claim that the political and tactical consequences of hacked AWS far outweigh the purported advantages of AWS being unaffected by psychological factors and always following orders. Therefore, one of the moral justifications for the deployment of AWS is undermined.
Archival date: 2018-08-15