Abstract
De Neys (2021) argues that the debate between single- and dual-process theorists of thought has become both empirically intractable and scientifically inconsequential. I argue that this is true only under the traditional framing of the debate, on which single- and dual-process theories are understood as claims about whether thought processes share the same defining properties (e.g., making mathematical judgments) or have two different defining properties (e.g., making mathematical judgments autonomously versus via access to a central working memory capacity), respectively. But if single- and dual-process theories are instead understood in cognitive modeling terms, as claims about whether thought processes function to implement one or two broad types of algorithms, respectively, then the debate becomes scientifically consequential and, presumably, empirically tractable. So, I argue, the correct response to the current state of the debate is not to abandon it, as De Neys suggests, but to reframe it as a debate about cognitive models.