What We Informationally Owe Each Other

In Algorithms & Autonomy: The Ethics of Automated Decision Systems. Cambridge: Cambridge University Press, pp. 21-42 (forthcoming)

Abstract

One important criticism of algorithmic systems is that they lack transparency. Such systems can be opaque because they are complex, protected by patent or trade secret, or deliberately obscure. In the EU, there is a debate about whether the General Data Protection Regulation (GDPR) contains a “right to explanation,” and if so, what such a right entails. Our task in this chapter is to address this informational component of algorithmic systems. We argue that information access is integral to respecting autonomy, and that transparency policies should be tailored to advance autonomy. To make this argument, we distinguish two facets of agency (i.e., the capacity to act). The first is practical agency, or the ability to act effectively according to one’s values. The second is what we call cognitive agency, which is the ability to exercise what Pamela Hieronymi calls “evaluative control” (i.e., the ability to control our affective states, such as beliefs, desires, and attitudes). We argue that respecting autonomy requires providing persons sufficient information to exercise evaluative control and to properly interpret the world and their place in it. We draw this distinction out by considering algorithmic systems used in background checks, and we apply the view to key cases involving risk assessment in criminal justice decisions and K-12 teacher evaluation.

Author Profiles

Clinton Castro
University of Wisconsin, Madison
Alan Rubel
University of Wisconsin, Madison
