Consciousness without biology: An argument from anticipating scientific progress

Abstract

I develop the anticipatory argument for the view that it is nomologically possible for some non-biological creatures, including conventional silicon-based AI systems, to be phenomenally conscious. The argument rests on the general idea that we should make our beliefs conform to the outcomes of an ideal scientific process, and that such a process would attribute consciousness to some possible AI systems. The relevant ideal scientific process is an ideal application of the iterative natural kind (INK) strategy, according to which one should investigate consciousness by treating it as a natural kind that iteratively explains observable patterns and correlations among potentially consciousness-relevant features. The relevant AI systems are psychological duplicates: hypothetical non-biological creatures which share the coarse-grained functional organization of humans. I argue that an ideal application of the INK strategy would attribute consciousness to psychological duplicates, because doing so yields a simpler and more unifying explanatory account of biological and non-biological cognition. If the argument is sound, then creatures made from the same material as conventional AI systems can be conscious. This removes one of the main uncertainties in assessing AI consciousness and suggests that AI consciousness may be a serious near-term concern.

Author's Profile

Leonard Dung
Ruhr-Universität Bochum
