16 found
  1. What Should We Agree on about the Repugnant Conclusion? Stephane Zuber, Nikhil Venkatesh, Torbjörn Tännsjö, Christian Tarsney, H. Orri Stefánsson, Katie Steele, Dean Spears, Jeff Sebo, Marcus Pivato, Toby Ord, Yew-Kwang Ng, Michal Masny, William MacAskill, Nicholas Lawson, Kevin Kuruc, Michelle Hutchinson, Johan E. Gustafsson, Hilary Greaves, Lisa Forsberg, Marc Fleurbaey, Diane Coffey, Susumu Cato, Clinton Castro, Tim Campbell, Mark Budolfson, John Broome, Alexander Berger, Nick Beckstead & Geir B. Asheim - 2021 - Utilitas 33 (4):379-383.
    The Repugnant Conclusion served an important purpose in catalyzing and inspiring the pioneering stage of population ethics research. We believe, however, that the Repugnant Conclusion now receives too much focus. Avoiding the Repugnant Conclusion should no longer be the central goal driving population ethics research, despite its importance to the fundamental accomplishments of the existing literature.
  2. Is there a Duty to Be a Digital Minimalist? Timothy Aylsworth & Clinton Castro - 2021 - Journal of Applied Philosophy 38 (4):662-673.
    The harms associated with wireless mobile devices (e.g. smartphones) are well documented. They have been linked to anxiety, depression, diminished attention span, sleep disturbance, and decreased relationship satisfaction. Perhaps what is most worrying from a moral perspective, however, is the effect these devices can have on our autonomy. In this article, we argue that there is an obligation to foster and safeguard autonomy in ourselves, and we suggest that wireless mobile devices pose a serious threat to our capacity to fulfill (...)
  3. Egalitarian Machine Learning. Clinton Castro, David O’Brien & Ben Schwan - 2023 - Res Publica 29 (2):237–264.
    Prediction-based decisions, which are often made by utilizing the tools of machine learning, influence nearly all facets of modern life. Ethical concerns about this widespread practice have given rise to the field of fair machine learning and a number of fairness measures, mathematically precise definitions of fairness that purport to determine whether a given prediction-based decision system is fair. Following Reuben Binns (2017), we take ‘fairness’ in this context to be a placeholder for a variety of normative egalitarian considerations. We (...)
  4. Agency Laundering and Information Technologies. Alan Rubel, Clinton Castro & Adam Pham - 2019 - Ethical Theory and Moral Practice 22 (4):1017-1041.
    When agents insert technological systems into their decision-making processes, they can obscure moral responsibility for the results. This can give rise to a distinct moral wrong, which we call “agency laundering.” At root, agency laundering involves obfuscating one’s moral responsibility by enlisting a technology or process to take some action and letting it forestall others from demanding an account for bad outcomes that result. We argue that the concept of agency laundering helps in understanding important moral problems in a number (...)
  5. Just Machines. Clinton Castro - 2022 - Public Affairs Quarterly 36 (2):163-183.
    A number of findings in the field of machine learning have given rise to questions about what it means for automated scoring- or decision-making systems to be fair. One center of gravity in this discussion is whether such systems ought to satisfy classification parity (which requires parity in accuracy across groups, defined by protected attributes) or calibration (which requires similar predictions to have similar meanings across groups, defined by protected attributes). Central to this discussion are impossibility results, owed to Kleinberg (...)
  6. The Fair Chances in Algorithmic Fairness: A Response to Holm. Clinton Castro & Michele Loi - 2023 - Res Publica 29 (2):231–237.
    Holm (2022) argues that a class of algorithmic fairness measures, which he refers to as the ‘performance parity criteria’, can be understood as applications of John Broome’s Fairness Principle. We argue that the performance parity criteria cannot be read this way. This is because in the relevant context, the Fairness Principle requires the equalization of actual individuals’ individual-level chances of obtaining some good (such as an accurate prediction from a predictive system), but the performance parity criteria do not guarantee any (...)
  7. On the Duty to Be an Attention Ecologist. Tim Aylsworth & Clinton Castro - 2022 - Philosophy and Technology 35 (1):1-22.
    The attention economy — the market where consumers’ attention is exchanged for goods and services — poses a variety of threats to individuals’ autonomy, which, at minimum, involves the ability to set and pursue ends for oneself. It has been argued that the threat wireless mobile devices pose to autonomy gives rise to a duty to oneself to be a digital minimalist, one whose interactions with digital technologies are intentional such that they do not conflict with their ends. In this (...)
  8. Algorithms, Agency, and Respect for Persons. Alan Rubel, Clinton Castro & Adam Pham - 2020 - Social Theory and Practice 46 (3):547-572.
    Algorithmic systems and predictive analytics play an increasingly important role in various aspects of modern life. Scholarship on the moral ramifications of such systems is in its early stages, and much of it focuses on bias and harm. This paper argues that in understanding the moral salience of algorithmic systems it is essential to understand the relation between algorithms, autonomy, and agency. We draw on several recent cases in criminal sentencing and K–12 teacher evaluation to outline four key ways in (...)
  9. Democratic Obligations and Technological Threats to Legitimacy: PredPol, Cambridge Analytica, and Internet Research Agency. Alan Rubel, Clinton Castro & Adam Pham - 2021 - In Alan Rubel, Clinton Castro & Adam Pham (eds.), Algorithms and Autonomy: The Ethics of Automated Decision Systems. Cambridge University Press. pp. 163-183.
    So far in this book, we have examined algorithmic decision systems from three autonomy-based perspectives: in terms of what we owe autonomous agents (chapters 3 and 4), in terms of the conditions required for people to act autonomously (chapters 5 and 6), and in terms of the responsibilities of agents (chapter 7). In this chapter we turn to the ways in which autonomy underwrites democratic governance. Political authority, which is to say the ability of a government to exercise (...)
  10. Kantian Ethics and the Attention Economy. Timothy Aylsworth & Clinton Castro - 2024 - Palgrave Macmillan.
    In this open access book, Timothy Aylsworth and Clinton Castro draw on the deep well of Kantian ethics to argue that we have moral duties, both to ourselves and to others, to protect our autonomy from the threat posed by the problematic use of technology. The problematic use of technologies like smartphones threatens our autonomy in a variety of ways, and critics have only begun to appreciate the vast scope of this problem. In the last decade, we have seen a (...)
  11. Should I Use ChatGPT to Write My Papers? Timothy Aylsworth & Clinton Castro - 2024 - Philosophy and Technology 37 (117):1-28.
    We argue that students have moral reasons to refrain from using chatbots such as ChatGPT to write certain papers. We begin by showing why many putative reasons to refrain from using chatbots fail to generate compelling arguments against their use in the construction of these papers. Many of these reasons rest on implausible principles, hollowed out conceptions of education, or impoverished accounts of human agency. They also overextend to cases where it is permissible to rely on a machine for something (...)
  12. Epistemic Paternalism Online. Clinton Castro, Adam Pham & Alan Rubel - 2020 - In Guy Axtell & Amiel Bernal (eds.), Epistemic Paternalism: Conceptions, Justifications and Implications. Lanham, Md: Rowman & Littlefield International. pp. 29-44.
    New media (highly interactive digital technology for creating, sharing, and consuming information) affords users a great deal of control over their informational diets. As a result, many users of new media unwittingly encapsulate themselves in epistemic bubbles (epistemic structures, such as highly personalized news feeds, that leave relevant sources of information out (Nguyen forthcoming)). Epistemically paternalistic alterations to new media technologies could be made to pop at least some epistemic bubbles. We examine one such alteration that Facebook has made in (...)
  13. Social Media, Emergent Manipulation, and Political Legitimacy. Adam Pham, Alan Rubel & Clinton Castro - 2022 - In Michael Klenk & Fleur Jongepier (eds.), The Philosophy of Online Manipulation. Routledge. pp. 353-369.
    Psychometrics firms such as Cambridge Analytica (CA) and troll factories such as the Internet Research Agency (IRA) have had a significant effect on democratic politics, through narrow targeting of political advertising (CA) and concerted disinformation campaigns on social media (IRA) (U.S. Department of Justice 2019; Select Committee on Intelligence, United States Senate 2019; DiResta et al. 2019). It is natural to think that such activities manipulate individuals and, hence, are wrong. Yet, as some recent cases illustrate, the moral concerns with (...)
  14. The Duty to Promote Digital Minimalism in Group Agents. Timothy Aylsworth & Clinton Castro - 2024 - In Timothy Aylsworth & Clinton Castro (eds.), Kantian Ethics and the Attention Economy. Palgrave Macmillan.
    In this chapter, we turn our attention to the effects of the attention economy on our ability to act autonomously as a group. We begin by clarifying which sorts of groups we are concerned with, which are structured groups (groups sufficiently organized that it makes sense to attribute agency to the group itself). Drawing on recent work by Purves and Davis (2022), we describe the essential roles of trust (i.e., depending on groups to fulfill their commitments) and trustworthiness (i.e., the (...)
  15. Does Predictive Sentencing Make Sense? Clinton Castro, Alan Rubel & Lindsey Schwartz - forthcoming - Inquiry: An Interdisciplinary Journal of Philosophy.
    This paper examines the practice of using predictive systems to lengthen the prison sentences of convicted persons when the systems forecast a higher likelihood of re-offense or re-arrest. There has been much critical discussion of technologies used for sentencing, including questions of bias and opacity. However, there hasn’t been a discussion of whether this use of predictive systems makes sense in the first place. We argue that it does not by showing that there is no plausible theory of punishment that (...)
  16. What We Informationally Owe Each Other. Alan Rubel, Clinton Castro & Adam Pham - 2021 - In Alan Rubel, Clinton Castro & Adam Pham (eds.), Algorithms and Autonomy: The Ethics of Automated Decision Systems. Cambridge University Press. pp. 21-42.
    One important criticism of algorithmic systems is that they lack transparency. Such systems can be opaque because they are complex, protected by patent or trade secret, or deliberately obscure. In the EU, there is a debate about whether the General Data Protection Regulation (GDPR) contains a “right to explanation,” and if so what such a right entails. Our task in this chapter is to address this informational component of algorithmic systems. We argue that information access is integral for respecting (...)