  • Conversational Artificial Intelligence and the Potential for Epistemic Injustice. Michiel De Proost & Giorgia Pozzi - 2023 - American Journal of Bioethics 23 (5):51-53.
    In their article, Sedlakova and Trachsel (2023) propose a holistic, ethical, and epistemic analysis of conversational artificial intelligence (CAI) in psychotherapeutic settings. They mainly descri...
  • Detecting your depression with your smartphone? – An ethical analysis of epistemic injustice in passive self-tracking apps. Mirjam Faissner, Eva Kuhn, Regina Müller & Sebastian Laacke - 2024 - Ethics and Information Technology 26 (2):1-14.
    Smartphone apps might offer a low-threshold approach to the detection of mental health conditions, such as depression. Based on the gathering of ‘passive data,’ some apps generate a user’s ‘digital phenotype,’ compare it to those of users with clinically confirmed depression and issue a warning if a depressive episode is likely. These apps can, thus, serve as epistemic tools for affected users. From an ethical perspective, it is crucial to consider epistemic injustice to promote socially responsible innovations within digital mental (...)
  • From Model Performance to Claim: How a Change of Focus in Machine Learning Replicability Can Help Bridge the Responsibility Gap. Tianqi Kou - manuscript
    Two goals, improving the replicability and the accountability of Machine Learning research respectively, have attracted much attention from the AI ethics and Machine Learning communities. Although both goals share the measure of improving transparency, they are discussed in different registers: replicability registers with scientific reasoning, whereas accountability registers with ethical reasoning. Given the existing challenge of the Responsibility Gap, that is, the difficulty of holding Machine Learning scientists accountable for Machine Learning harms because they are far removed from sites of application, this paper (...)
  • Socially disruptive technologies and epistemic injustice. J. K. G. Hopster - 2024 - Ethics and Information Technology 26 (1):1-8.
    Recent scholarship on technology-induced ‘conceptual disruption’ has spotlighted the notion of a conceptual gap. Conceptual gaps have also been discussed in scholarship on epistemic injustice, yet up until now these bodies of work have remained disconnected. This article shows that ‘gaps’ of interest to both bodies of literature are closely related, and argues that a joint examination of conceptual disruption and epistemic injustice is fruitful for both fields. I argue that hermeneutical marginalization—a skewed division of hermeneutical resources, which serves to (...)
  • Further remarks on testimonial injustice in medical machine learning: a response to commentaries. Giorgia Pozzi - 2023 - Journal of Medical Ethics 49 (8):551-552.
    In my paper entitled ‘Testimonial injustice in medical machine learning’, I argued that machine learning (ML)-based Prediction Drug Monitoring Programmes (PDMPs) could infringe on patients’ epistemic and moral standing, inflicting a testimonial injustice. I am very grateful for all the comments the paper received, some of which expand on it while others take a more critical view. This response addresses two objections raised to my consideration of ML-induced testimonial injustice in order to clarify the position taken in the paper. The (...)
  • Dirty data labeled dirt cheap: epistemic injustice in machine learning systems. Gordon Hull - 2023 - Ethics and Information Technology 25 (3):1-14.
    Artificial intelligence (AI) and machine learning (ML) systems increasingly purport to deliver knowledge about people and the world. Unfortunately, they also seem to frequently present results that repeat or magnify biased treatment of racial and other vulnerable minorities. This paper proposes that at least some of the problems with AI’s treatment of minorities can be captured by the concept of epistemic injustice. To substantiate this claim, I argue that (1) pretrial detention and physiognomic AI systems commit testimonial injustice because their (...)
  • Who owns NATURE? Conceptual appropriation in discourses on climate and biotechnologies. Jeroen K. G. Hopster, Alessio Gerola, Ben Hofbauer, Guido Löhr, Julia Rijssenbeek & Paulan Korenhof - forthcoming - Environmental Values.
    Emerging technologies can have profound conceptual implications. Their emergence frequently calls for the articulation of new concepts, or for modifications and novel applications of concepts that are already entrenched in communication and thought. In this paper, we introduce the notion of “conceptual appropriation” to capture the dynamics between concepts and emerging technologies. By conceptual appropriation, we mean the novel application of a value-laden concept to lay a contestable claim on an underdetermined phenomenon. We illustrate the dynamics of conceptual appropriation by (...)