  • Worlds, Models and Descriptions. John F. Sowa - 2006 - Studia Logica 84 (2):323-360.
    Since the pioneering work by Kripke and Montague, the term possible world has appeared in most theories of formal semantics for modal logics, natural languages, and knowledge-based systems. Yet that term obscures many questions about the relationships between the real world, various models of the world, and descriptions of those models in either formal languages or natural languages. Each step in that progression is an abstraction from the overwhelming complexity of the world. At the end, nothing is left but a (...)
  • Neural classification maps for distinct word combinations in Broca’s area. Marianne Schell, Angela D. Friederici & Emiliano Zaccarella - 2022 - Frontiers in Human Neuroscience 16:930849.
    Humans are equipped with the remarkable ability to comprehend an infinite number of utterances. Relations between grammatical categories restrict the way words combine into phrases and sentences. How the brain recognizes different word combinations remains largely unknown, although this is a necessary condition for combinatorial unboundedness in language. Here, we used functional magnetic resonance imaging and multivariate pattern analysis to explore whether distinct neural populations of a known language network hub—Broca’s area—are specialized for recognizing distinct simple word combinations. The phrases (...)
  • Meaning and Formal Semantics in Generative Grammar. Stephen Schiffer - 2015 - Erkenntnis 80 (1):61-87.
    A generative grammar for a language L generates one or more syntactic structures for each sentence of L and interprets those structures both phonologically and semantically. A widely accepted assumption in generative linguistics dating from the mid-60s, the Generative Grammar Hypothesis, is that the ability of a speaker to understand sentences of her language requires her to have tacit knowledge of a generative grammar of it, and the task of linguistic semantics in those early days was taken to be (...)
  • Why Truth-Conditional Semantics in Generative Linguistics is Still the Better Bet. Toby Napoletano - 2017 - Erkenntnis 82 (3):673-692.
    In his “Meaning and Formal Semantics in Generative Grammar” (Erkenntnis 2015, 61–87), Stephen Schiffer argues that truth-conditional semantics is a poor fit with generative linguistics. In particular, he thinks that it fails to explain speakers’ abilities to understand the sentences of their language. In its place, he recommends his “Best Bet Theory”—a theory which aims to directly explain speakers’ abilities to mean things by their utterances and know what others mean by their utterances. I argue that Schiffer does not provide (...)
  • The Conversational Role of Centered Contents. Max Kölbel - 2013 - Inquiry: An Interdisciplinary Journal of Philosophy 56 (2-3):97-121.
    Some philosophers, for example David Lewis, have argued for the need to introduce de se contents or centered contents, i.e. contents of thought and speech the correctness of believing which depends not only on the possible world one inhabits, but also on the location one occupies. Independently, philosophers like Robert Stalnaker (and also David Lewis) have developed the conversational score model of linguistic communication. This conversational model usually relies on a more standard conception of content according to which the correctness (...)
  • Formal Models at the Core. Emmanuel Chemla, Isabelle Charnavel, Isabelle Dautriche, David Embick, Fred Lerdahl, Pritty Patel-Grosz, David Poeppel & Philippe Schlenker - 2023 - Cognitive Science 47 (3):e13267.
    The grammatical paradigm used to be a model for entire areas of cognitive science. Its primary tenet was that theories are axiomatic-like systems. A secondary tenet was that their predictions should be tested quickly and in great detail with introspective judgments. While the grammatical paradigm now often seems passé, we argue that in fact it continues to be as efficient as ever. Formal models are essential because they are explicit, highly predictive, and typically modular. They make numerous critical predictions, which (...)
  • De Morgan's laws and NEG-raising: a syntactic view. Diego Gabriel Krivochen - 2018 - Linguistic Frontiers 1 (2):112-121.
    In this paper, we will motivate the application of specific rules of inference from the propositional calculus to natural language sentences. Specifically, we will analyse De Morgan’s laws, which pertain to the interaction of two central topics in syntactic research: negation and coordination. We will argue that the applicability of De Morgan’s laws to natural language structures can be derived from independently motivated operations of grammar and principles restricting the application of these operations. This has direct empirical consequences for the (...)