Synthese 199 (Suppl 1):177-219 (2019)
Abstract
I present a novel, collaborative methodology for linguistics: what I call the 'explanatory economy'. On this picture, multiple models/theories are evaluated by the extent to which they complement one another with respect to data coverage. I show how this model can resolve a long-standing worry about the methodology of generative linguistics: that by creating too much distance between data and theory, it tarnishes the empirical credentials of the research program. I provide justifications for such methodologically central distinctions as the competence/performance and core/periphery distinctions, and then show how the push for simplicity in the history of generative grammar can be understood in this light.