  • From tech to tact: emotion dysregulation in online communication during the COVID-19 pandemic.Mark M. James - 2023 - Phenomenology and the Cognitive Sciences (5):1-32.
    Recent theorizing argues that online communication technologies provide powerful, although precarious, means of emotional regulation. We develop this understanding further. Drawing on subjective reports collected during periods of imposed social restrictions under COVID-19, we focus on how this precarity is a source of emotional dysregulation. We make our case by organizing responses into five distinct but intersecting dimensions wherein the precarity of this regulation is most relevant: infrastructure, functional use, mindful design (individual and social), and digital tact. Analyzing these reports, (...)
  • Affective Artificial Agents as sui generis Affective Artifacts.Marco Facchin & Giacomo Zanotti - forthcoming - Topoi.
    AI-based technologies are increasingly pervasive in a number of contexts. Our affective and emotional life makes no exception. In this article, we analyze one way in which AI-based technologies can affect them. In particular, our investigation will focus on affective artificial agents, namely AI-powered software or robotic agents designed to interact with us in affectively salient ways. We build upon the existing literature on affective artifacts with the aim of providing an original analysis of affective artificial agents and their distinctive (...)
  • Therapeutic Chatbots as Cognitive-Affective Artifacts.J. P. Grodniewicz & Mateusz Hohol - forthcoming - Topoi:1-13.
    Conversational Artificial Intelligence (CAI) systems (also known as AI “chatbots”) are among the most promising examples of the use of technology in mental health care. With already millions of users worldwide, CAI is likely to change the landscape of psychological help. Most researchers agree that existing CAIs are not “digital therapists” and using them is not a substitute for psychotherapy delivered by a human. But if they are not therapists, what are they, and what role can they play in mental (...)
  • Real feeling and fictional time in human-AI interactions.Joel Krueger & Tom Roberts - forthcoming - Topoi.
    As technology improves, artificial systems are increasingly able to behave in human-like ways: holding a conversation; providing information, advice, and support; or taking on the role of therapist, teacher, or counsellor. This enhanced behavioural complexity, we argue, encourages deeper forms of affective engagement on the part of the human user, with the artificial agent helping to stabilise, subdue, prolong, or intensify a person's emotional condition. Here, we defend a fictionalist account of human/AI interaction, according to which these encounters involve an (...)
  • Avatars as Proxies.Paula Sweeney - 2023 - Minds and Machines 33 (3):525-539.
    Avatars will represent us online, in virtual worlds, and in technologically supported hybrid environments. We and our avatars will stand not in an identity relation but in a proxy relation, an arrangement that is significant not least because our proxies’ actions can be counted as our own. However, this proxy relation between humans and avatars is not well understood and its consequences under-explored. In this paper I explore the relation and its potential ethical consequences.
  • The Ethics of ‘Deathbots’.Nora Freya Lindemann - 2022 - Science and Engineering Ethics 28 (6):1-15.
    Recent developments in AI programming allow for new applications: individualized chatbots which mimic the speaking and writing behaviour of one specific living or dead person. ‘Deathbots’, chatbots of the dead, have already been implemented and are currently under development by the first start-up companies. Thus, it is an urgent issue to consider the ethical implications of deathbots. While previous ethical theories of deathbots have always been based on considerations of the dignity of the deceased, I propose to shift the focus (...)
  • The Affective Scaffolding of Grief in the Digital Age: The Case of Deathbots.Regina E. Fabry & Mark Alfano - forthcoming - Topoi:1-13.
    Contemporary and emerging chatbots can be fine-tuned to imitate the style, tenor, and knowledge of a corpus, including the corpus of a particular individual. This makes it possible to build chatbots that imitate people who are no longer alive — deathbots. Such deathbots can be used in many ways, but one prominent way is to facilitate the process of grieving. In this paper, we present a framework that helps make sense of this process. In particular, we argue that deathbots can (...)
  • Embracing grief in the age of deathbots: a temporary tool, not a permanent solution.Aorigele Bao & Yi Zeng - 2024 - Ethics and Information Technology 26 (1):1-10.
    “Deathbots” are digital constructs that emulate the conversational patterns, demeanor, and knowledge of deceased individuals. Earlier moral discussions about deathbots centered on the dignity and autonomy of the deceased. This paper primarily examines the potential psychological and emotional dependencies that users might develop towards deathbots, considering approaches to prevent problematic dependence through temporary use. We adopt a hermeneutic method to argue that deathbots, as they currently exist, are unlikely to provide substantial comfort. Lacking the capacity to bear emotional burdens, they fall (...)