What We Informationally Owe Each Other

In Algorithms & Autonomy: The Ethics of Automated Decision Systems. Cambridge: Cambridge University Press. pp. 21-42 (forthcoming)
ABSTRACT: One important criticism of algorithmic systems is that they lack transparency. Such systems can be opaque because they are complex, protected by patent or trade secret, or deliberately obscure. In the EU, there is a debate about whether the General Data Protection Regulation (GDPR) contains a “right to explanation,” and if so, what such a right entails. Our task in this chapter is to address this informational component of algorithmic systems. We argue that information access is integral to respecting autonomy, and that transparency policies should be tailored to advance autonomy. To make this argument we distinguish two facets of agency (i.e., the capacity to act). The first is practical agency, or the ability to act effectively according to one’s values. The second is what we call cognitive agency, which is the ability to exercise what Pamela Hieronymi calls “evaluative control” (i.e., the ability to control our affective states, such as beliefs, desires, and attitudes). We argue that respecting autonomy requires providing persons sufficient information to exercise evaluative control and properly interpret the world and their place in it. We draw this distinction out by considering algorithmic systems used in background checks, and we apply the view to key cases involving risk assessment in criminal justice decisions and K-12 teacher evaluation.
Archival date: 2020-08-04