Should longtermists recommend hastening extinction rather than delaying it?

The Monist 107 (2):130-145 (2024)

Abstract

Longtermism is the view that the most urgent global priorities, and those to which we should devote the largest portion of our resources, are those that focus on (i) ensuring a long future for humanity, and perhaps sentient or intelligent life more generally, and (ii) improving the quality of the lives that inhabit that long future. While it is by no means the only one, the argument most commonly given for this conclusion is that these interventions have greater expected goodness per unit of resource devoted to them than any of the other available interventions, including those that focus on the health and well-being of the current population. In this paper, I argue that, even if we grant the consequentialist ethics upon which this argument depends, and even if we grant one of the axiologies that are typically paired with that ethics to generate the argument, we are not morally required to choose an option that maximises expected utility; indeed, we might not even be permitted to do so. Instead, I will argue, if the argument's consequentialism is correct, we should choose using a decision theory that is sensitive to risk and allows us to give greater weight to worst-case outcomes than expected utility theory does. And, I will show, such decision theories do not always recommend longtermist interventions. Indeed, sometimes, they recommend exactly the opposite: sometimes, they recommend hastening human extinction. Many, though not all, will take this as a reductio of the consequentialism or the axiology of the argument. I remain agnostic on the conclusion we should draw.
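The abstract does not specify which risk-sensitive decision theory is in play; as an illustrative sketch only, and not the paper's own formalism, one well-known theory of this kind is Lara Buchak's risk-weighted expected utility. For a gamble whose outcomes, ordered from worst to best, have utilities $u_1 \le \dots \le u_n$ with probabilities $p_1, \dots, p_n$, and a risk function $r : [0,1] \to [0,1]$ that is increasing with $r(0) = 0$ and $r(1) = 1$, the risk-weighted expected utility is

\[
\mathrm{REU} \;=\; u_1 \,+\, \sum_{i=2}^{n} r\!\left(\sum_{j=i}^{n} p_j\right)\left(u_i - u_{i-1}\right).
\]

With $r(x) = x$ this reduces to ordinary expected utility; with a convex risk function such as $r(x) = x^2$, improvements over the worst case are discounted. For example, a 50/50 gamble between utilities $0$ and $100$ has expected utility $50$, but with $r(x) = x^2$ its risk-weighted expected utility is $0 + (0.5)^2 \cdot 100 = 25$, reflecting the extra weight such a rule gives to worst-case outcomes.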

Author's Profile

Richard Pettigrew
University of Bristol

Analytics

Added to PP
2021-12-14

Downloads
1,192 (#9,895)

6 months
460 (#3,644)
