The Chinese Room Fallacy and Eliminative Materialism: What Does It Mean to 'Understand'?

Abstract

What does it truly mean to “understand”? The Chinese Room Argument asserts that AI, no matter how advanced, merely manipulates symbols without grasping meaning, while human cognition is uniquely capable of genuine understanding. But if human intelligence is itself built on memorization, structured abstraction, and computational complexity, is understanding anything more than an emergent property of hierarchical information processing? This paper argues that Searle’s framework rests on anthropocentric assumptions that fail to account for how meaning structures vary between human and artificial cognition. The claim that syntax alone cannot generate semantics relies on an outdated view of cognition, one that ignores how meaning emerges differently across self-organizing systems. Furthermore, Searle’s demand that true understanding require human-like intentionality is a category error, conflating distinct computational architectures that operate at vastly different processing scales. By examining hierarchical abstraction, computational self-organization, and the mechanistic basis of understanding, this paper dismantles Searle’s framework and proposes that meaning is system-relative: not an exclusive product of human cognition, but a function of computational complexity.
