  • Artificial Moral Responsibility: How We Can and Cannot Hold Machines Responsible. Daniel W. Tigard - 2021 - Cambridge Quarterly of Healthcare Ethics 30 (3): 435-447.
    Our ability to locate moral responsibility is often thought to be a necessary condition for conducting morally permissible medical practice, engaging in a just war, and other high-stakes endeavors. Yet, with increasing reliance upon artificially intelligent systems, we may be facing a widening responsibility gap, which, some argue, cannot be bridged by traditional concepts of responsibility. How then, if at all, can we make use of crucial emerging technologies? According to Colin Allen and Wendell Wallach, the advent of so-called ‘artificial moral (...)
  • Autonomous Vehicles and Ethical Settings: Who Should Decide? Paul Formosa - 2022 - In Ryan Jenkins, David Cerny & Tomas Hribek (eds.), Autonomous Vehicle Ethics: The Trolley Problem and Beyond. New York: Oxford University Press.
    While autonomous vehicles (AVs) are not designed to harm people, harming people is an inevitable by-product of their operation. How are AVs to deal ethically with situations where harming people is inevitable? Rather than focus on the much-discussed question of what choices AVs should make, we can also ask the much less discussed question of who gets to decide what AVs should do in such cases. Here there are two key options: AVs with a personal ethics setting (PES) or an (...)
  • The Responsibility Gap and LAWS: A Critical Mapping of the Debate. Ann-Katrien Oimann - 2023 - Philosophy and Technology 36 (1): 1-22.
    AI has numerous applications in various fields, including the military domain. The increase in the degree of autonomy in some decision-making systems leads to discussions on the possible future use of lethal autonomous weapons systems (LAWS). A central issue in these discussions is the assignment of moral responsibility for some AI-based outcomes. Several authors claim that the high autonomous capability of such systems leads to a so-called “responsibility gap.” In recent years, there has been a surge in philosophical literature (...)
  • Why Command Responsibility May (Not) Be a Solution to Address Responsibility Gaps in LAWS. Ann-Katrien Oimann - forthcoming - Criminal Law and Philosophy: 1-27.
    The possible future use of lethal autonomous weapons systems (LAWS) and the challenges associated with assigning moral responsibility lead to several debates. Some authors argue that the highly autonomous capability of such systems may lead to a so-called responsibility gap in situations where LAWS cause serious violations of international humanitarian law. One proposed solution is the doctrine of command responsibility. Despite the doctrine’s original development to govern human interactions on the battlefield, it is worth considering whether the doctrine of command (...)
  • Correction to: The Responsibility Gap and LAWS: A Critical Mapping of the Debate. Ann-Katrien Oimann - 2023 - Philosophy and Technology 36 (1): 1-2.