Deepfakes and the epistemic apocalypse

Synthese 201 (3):1-23 (2023)

Abstract

[Author note: There is a video explainer of this paper on YouTube at the New Work in Philosophy channel (search for surname+deepfakes).] It is widely thought that deepfake videos are a significant and unprecedented threat to our epistemic practices. In some writing about deepfakes, manipulated videos appear as the harbingers of an unprecedented _epistemic apocalypse_. In this paper I want to take a critical look at some of the more catastrophic predictions about deepfake videos. I will argue for three claims: (1) that once we recognise the role of social norms in the epistemology of recordings, deepfakes are much less concerning, (2) that the history of photographic manipulation reveals some important precedents, correcting claims about the novelty of deepfakes, and (3) that proposed solutions to deepfakes have been overly focused on technological interventions. My overall goal is not so much to argue that deepfakes are not a problem, but to argue that behind concerns around deepfakes lies a more general class of social problems about the organisation of our epistemic practices.

Author's Profile

Joshua Habgood-Coote
University of Leeds

Analytics

Added to PP
2023-02-13
