  1. Artificial intelligence and responsibility gaps: what is the problem? Peter Königs - 2022 - Ethics and Information Technology 24 (3):1-11.
    Recent decades have witnessed tremendous progress in artificial intelligence and in the development of autonomous systems that rely on artificial intelligence. Critics, however, have pointed to the difficulty of allocating responsibility for the actions of an autonomous system, especially when the autonomous system causes harm or damage. The highly autonomous behavior of such systems, for which neither the programmer, the manufacturer, nor the operator seems to be responsible, has been suspected to generate responsibility gaps. This has been the cause of (...)
  2. A Comparative Defense of Self-initiated Prospective Moral Answerability for Autonomous Robot Harm. Marc Champagne & Ryan Tonkens - 2023 - Science and Engineering Ethics 29 (4):1-26.
    As artificial intelligence becomes more sophisticated and robots approach autonomous decision-making, debates about how to assign moral responsibility have gained importance, urgency, and sophistication. Answering Stenseke’s (2022a) call for scaffolds that can help us classify views and commitments, we think the current debate space can be represented hierarchically, as answers to key questions. We use the resulting taxonomy of five stances to differentiate—and defend—what is known as the “blank check” proposal. According to this proposal, a person activating a robot could (...)
  3. A Moral Bind? — Autonomous Weapons, Moral Responsibility, and Institutional Reality. Bartlomiej Chomanski - 2023 - Philosophy and Technology 36 (2):1-14.
    In “Accepting Moral Responsibility for the Actions of Autonomous Weapons Systems—a Moral Gambit” (2022), Mariarosaria Taddeo and Alexander Blanchard answer one of the most vexing issues in current ethics of technology: how to close the so-called “responsibility gap”? Their solution is to require that autonomous weapons systems (AWSs) may only be used if there is some human being who accepts the ex ante responsibility for those actions of the AWS that could not have been predicted or intended (in such cases, (...)
  4. “All AIs are Psychopaths”? The Scope and Impact of a Popular Analogy. Elina Nerantzi - 2025 - Philosophy and Technology 38 (1):1-24.
    Artificial Intelligence (AI) Agents are often compared to psychopaths in popular news articles. The headlines are ‘eye-catching’, but the questions of what this analogy means or why it matters are hardly answered. The aim of this paper is to take this popular analogy ‘seriously’. By that, I mean two things. First, I aim to explore the scope of this analogy, i.e. to identify and analyse the shared properties of AI agents and psychopaths, namely, their lack of moral emotions and their (...)