Computer says "No": The Case Against Empathetic Conversational AI

Findings of the Association for Computational Linguistics: ACL 2023 (2023)

Abstract

Emotions are an integral part of human cognition, and they guide not only our understanding of the world but also our actions within it. As such, whether we soothe or inflame an emotion is not inconsequential. Recent work in conversational AI has focused on responding empathetically to users, validating and soothing their emotions without a real basis. This AI-aided emotional regulation can have negative consequences for users and society, tending towards a one-note happiness defined only as the absence of "negative" emotions. We argue that we must carefully consider whether and how to respond to users' emotions.

Author's Profile

Alba Curry
University of Leeds
