  • Natural Language Semantics and Computability. Richard Moot & Christian Retoré - 2019 - Journal of Logic, Language and Information 28 (2):287-307.
    This paper is a reflection on the computability of natural language semantics. It does not contain a new model or new results in the formal semantics of natural language: it is rather a computational analysis, in the context of type-logical grammars, of the logical models and algorithms currently used in natural language semantics, defined as a function from a grammatical sentence to a set of logical formulas—because a statement can be ambiguous, it can correspond to multiple formulas, one for each (...)
  • Natural Recursion Doesn’t Work That Way: Automata in Planning and Syntax. Cem Bozsahin - 2016 - In Vincent C. Müller (ed.), Fundamental Issues of Artificial Intelligence. Cham: Springer. pp. 95-112.
    Natural recursion in syntax is recursion by linguistic value, which is not syntactic in nature but semantic. Syntax-specific recursion is not recursion by name as the term is understood in theoretical computer science. Recursion by name is probably not natural because of its infinite typeability. Natural recursion, or recursion by value, is not species-specific. Human recursion is not syntax-specific. The values on which it operates are most likely domain-specific, including those for syntax. Syntax seems to require no more (and no (...)
  • Artificial Grammar Learning Capabilities in an Abstract Visual Task Match Requirements for Linguistic Syntax. Gesche Westphal-Fitch, Beatrice Giustolisi, Carlo Cecchetto, Jordan S. Martin & W. Tecumseh Fitch - 2018 - Frontiers in Psychology 9.
  • From Computer Metaphor to Computational Modeling: The Evolution of Computationalism. Marcin Miłkowski - 2018 - Minds and Machines 28 (3):515-541.
    In this paper, I argue that computationalism is a progressive research tradition. Its metaphysical assumptions are that nervous systems are computational, and that information processing is necessary for cognition to occur. First, the primary reasons why information processing should explain cognition are reviewed. Then I argue that early formulations of these reasons are outdated. However, by relying on the mechanistic account of physical computation, they can be recast in a compelling way. Next, I contrast two computational models of working memory (...)
  • Cognition and Content. João Branquinho - 2005 - Lisboa, Portugal: Centro de Filosofia da Universidade de Lisboa.
    The philosophical topics and problems discussed in the volume are quite varied in nature: the nature of computational complexity in the processing of a natural language; the relation between linguistic meaning and Fregean sense; the connections between agency and power; the semantic content of fiction; the explanation of ethical impasses; the nature of sceptical arguments; the connections between cognitive dissociations and the modular character of the mind; the relation between reference and meaning. These topics lend themselves (...)
  • The emergence of language. Mark Steedman - 2017 - Mind and Language 32 (5):579-590.
    This paper argues that the faculty of language comes essentially for free in evolutionary terms, by grace of a capacity shared with some evolutionarily quite distantly related animals for deliberatively planning action in the world. The reason humans have language of a kind that animals do not is because of a qualitative difference in the nature of human plans rather than anything unique to language.
  • The Displacement Calculus. Glyn Morrill, Oriol Valentín & Mario Fadda - 2011 - Journal of Logic, Language and Information 20 (1):1-48.
    If all dependent expressions were adjacent some variety of immediate constituent analysis would suffice for grammar, but syntactic and semantic mismatches are characteristic of natural language; indeed this is a, or the, central problem in grammar. Logical categorial grammar reduces grammar to logic: an expression is well-formed if and only if an associated sequent is a theorem of a categorial logic. The paradigmatic categorial logic is the Lambek calculus, but being a logic of concatenation the Lambek calculus can only capture (...)
  • Do Humans Really Learn AnBn Artificial Grammars From Exemplars? Jean-Rémy Hochmann, Mahan Azadpour & Jacques Mehler - 2008 - Cognitive Science 32 (6):1021-1036.
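The AnBn pattern studied in the Hochmann et al. entry above is the textbook example of a context-free but non-regular language: accepting it requires matching an unbounded number of a's against an equal number of b's, which no finite-state device can do. A minimal recognizer sketch (the function name and the choice of n ≥ 1 are my own, for illustration):

```python
def is_anbn(s: str) -> bool:
    """Accept strings of the form a^n b^n (n >= 1): a block of a's
    followed by an equally long block of b's. The equal-length
    requirement is what pushes this language beyond regular power."""
    i = 0
    while i < len(s) and s[i] == "a":
        i += 1
    # the remainder must be exactly as many b's as there were a's
    return i > 0 and s[i:] == "b" * i
```

Artificial-grammar-learning experiments typically expose participants to strings like "aabb" and "aaabbb" and test whether they reject foils such as "abab".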
  • Statistical models of syntax learning and use. Mark Johnson & Stefan Riezler - 2002 - Cognitive Science 26 (3):239-253.
    This paper shows how to define probability distributions over linguistically realistic syntactic structures in a way that permits us to define language learning and language comprehension as statistical problems. We demonstrate our approach using lexical‐functional grammar (LFG), but our approach generalizes to virtually any linguistic theory. Our probabilistic models are maximum entropy models. In this paper we concentrate on statistical inference procedures for learning the parameters that define these probability distributions. We point out some of the practical problems that make (...)
  • Toward a Connectionist Model of Recursion in Human Linguistic Performance. Morten H. Christiansen & Nick Chater - 1999 - Cognitive Science 23 (2):157-205.
  • What Complexity Differences Reveal About Domains in Language. Jeffrey Heinz & William Idsardi - 2013 - Topics in Cognitive Science 5 (1):111-131.
    An important distinction between phonology and syntax has been overlooked. All phonological patterns belong to the regular region of the Chomsky Hierarchy, but not all syntactic patterns do. We argue that the hypothesis that humans employ distinct learning mechanisms for phonology and syntax currently offers the best explanation for this difference.
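Heinz and Idsardi's claim that all phonological patterns are regular can be made concrete with a toy finite-state acceptor. The particular pattern below (banning adjacent b's, loosely analogous to a phonotactic constraint against consonant clusters) is my own illustrative choice, not one drawn from the paper:

```python
def no_adjacent_bs(s: str) -> bool:
    """A two-state finite-state acceptor: reject any string containing
    the substring 'bb'. Patterns checkable with a fixed window like
    this sit comfortably inside the regular region of the Chomsky
    Hierarchy, where Heinz and Idsardi locate phonology."""
    state = 0  # 0: previous symbol was not 'b'; 1: previous symbol was 'b'
    for ch in s:
        if ch == "b":
            if state == 1:
                return False  # two b's in a row: constraint violated
            state = 1
        else:
            state = 0
    return True
```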
  • Geometric Representations for Minimalist Grammars. Peter Beim Graben & Sabrina Gerth - 2012 - Journal of Logic, Language and Information 21 (4):393-432.
    We reformulate minimalist grammars as partial functions on term algebras for strings and trees. Using filler/role bindings and tensor product representations, we construct homomorphisms for these data structures into geometric vector spaces. We prove that the structure-building functions as well as simple processors for minimalist languages can be realized by piecewise linear operators in representation space. We also propose harmony, i.e. the distance of an intermediate processing step from the final well-formed state in representation space, as a measure of processing (...)
  • Introduction to the Special Issue on the Mathematics of Language. Gerald Penn - 2011 - Journal of Logic, Language and Information 20 (3):273-275.
  • The Philosophy of Generative Linguistics. Peter Ludlow - 2011 - Oxford, GB: Oxford University Press.
    Peter Ludlow presents the first book on the philosophy of generative linguistics, including both Chomsky's government and binding theory and his minimalist ...
  • Contexts and the Concept of Mild Context-Sensitivity. M. Kudlek, C. Martín-Vide, A. Mateescu & V. Mitrana - 2003 - Linguistics and Philosophy 26 (6):703-725.
    We introduce and study a natural extension of Marcus external contextual grammars. This mathematically simple mechanism which generates a proper subclass of simple matrix languages, known to be mildly context-sensitive ones, is still mildly context-sensitive. Furthermore, we get an infinite hierarchy of mildly context-sensitive families of languages. Then we attempt to fill a gap regarding the linguistic relevance of these mechanisms which consists in defining a tree structure on the strings generated by many-dimensional external contextual grammars, and investigate some related (...)
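Mild context-sensitivity, the notion at stake in the Kudlek et al. entry above, is often introduced via the language a^n b^n c^n: three equal-length blocks, which no context-free grammar can generate but which mildly context-sensitive formalisms handle. A hedged recognizer sketch (function name my own):

```python
def is_anbncn(s: str) -> bool:
    """Accept a^n b^n c^n (n >= 1). Keeping three counts in lockstep
    exceeds context-free power, but stays within the mildly
    context-sensitive languages."""
    n, r = divmod(len(s), 3)
    return r == 0 and n > 0 and s == "a" * n + "b" * n + "c" * n
```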
  • Generalized quantifiers. Dag Westerståhl - 2008 - Stanford Encyclopedia of Philosophy.
  • Argument or no argument? Geoffrey K. Pullum & Kyle Rawlins - 2007 - Linguistics and Philosophy 30 (2):277-287.
    We examine an argument for the non-context-freeness of English that has received virtually no discussion in the literature. It is based on adjuncts of the form 'X or no X', where X is a nominal. The construction has been held to exemplify unbounded syntactic reduplication. We argue that although the argument can be made in a mathematically valid form, its empirical basis is not secure. First, the claimed unbounded syntactic identity between nominals does not always hold in attested cases, and (...)
  • Languages and Other Abstract Structures. Ryan Mark Nefdt - 2018 - In Martin Neef & Christina Behme (eds.), Essays on Linguistic Realism. Philadelphia: John Benjamins Publishing Company. pp. 139-184.
    My aim in this chapter is to extend the Realist account of the foundations of linguistics offered by Postal, Katz and others. I first argue against the idea that naive Platonism can capture the necessary requirements on what I call a ‘mixed realist’ view of linguistics, which takes aspects of Platonism, Nominalism and Mentalism into consideration. I then advocate three desiderata for an appropriate ‘mixed realist’ account of linguistic ontology and foundations, namely (1) linguistic creativity and infinity, (2) linguistics as (...)
  • Notational Variants and Cognition: The Case of Dependency Grammar. Ryan M. Nefdt & Giosué Baggio - forthcoming - Erkenntnis:1-31.
    In recent years, dependency grammars have established themselves as valuable tools in theoretical and computational linguistics. To many linguists, dependency grammars and the more standard constituency-based formalisms are notational variants. We argue that, beyond considerations of formal equivalence, cognition may also serve as a background for a genuine comparison between these different views of syntax. In this paper, we review and evaluate some of the most common arguments and evidence employed to advocate for the cognitive or neural reality of dependency (...)
  • Mixed computation: grammar up and down the Chomsky Hierarchy. Diego Gabriel Krivochen - 2021 - Evolutionary Linguistic Theory 3 (2):215-244.
    Proof-theoretic models of grammar are based on the view that an explicit characterization of a language comes in the form of the recursive enumeration of strings in that language. That recursive enumeration is carried out by a procedure which strongly generates a set of structural descriptions Σ and weakly generates a set of strings S; a grammar is thus a function that pairs an element of Σ with elements of S. Structural descriptions are obtained by means of Context-Free phrase structure (...)
  • The Multiplicative-Additive Lambek Calculus with Subexponential and Bracket Modalities. Max Kanovich, Stepan Kuznetsov & Andre Scedrov - 2020 - Journal of Logic, Language and Information 30 (1):31-88.
    We give a proof-theoretic and algorithmic complexity analysis for systems introduced by Morrill to serve as the core of the CatLog categorial grammar parser. We consider two recent versions of Morrill’s calculi, and focus on their fragments including multiplicative connectives, additive conjunction and disjunction, brackets and bracket modalities, and the ! subexponential modality. For both systems, we resolve issues connected with the cut rule and provide necessary modifications, after which we prove admissibility of cut. We also prove algorithmic undecidability for both (...)
  • Schwyzertuutsch, Bambara, and Context-Free Languages. Witold Kieraś - 2010 - Studia Semiotyczne—English Supplement 27:102-115.
    In his oft-cited, if rarely read book, Noam Chomsky ventured to propose a hierarchy of formal grammars. He defined them as rules for rewriting strings of terminal and nonterminal symbols into different strings of terminal and nonterminal symbols, thus giving birth to what is today known as the Chomsky hierarchy. The author himself claimed the theory applied exclusively to formal languages, but in his considerations he also happened to formulate a problem relating to natural languages, namely, to which formal grammar (...)
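The Swiss German (Schwyzertuutsch) and Bambara arguments in the Kieraś entry above turn on cross-serial dependencies whose formal skeleton is the copy language {ww}: a string followed by an exact duplicate of itself, which no context-free grammar captures. A minimal membership check, assuming the idealized string-copy formulation:

```python
def is_copy(s: str) -> bool:
    """Accept strings of the form ww: a nonempty string followed by an
    exact copy of itself. Cross-serial dependencies pair the i-th
    symbol of the first half with the i-th symbol of the second, unlike
    the nested (mirror) pairings context-free grammars can express."""
    half, r = divmod(len(s), 2)
    return r == 0 and half > 0 and s[:half] == s[half:]
```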
  • Phrase structure parameters. Janet Dean Fodor & Stephen Crain - 1990 - Linguistics and Philosophy 13 (6):619-659.
  • Why Philosophers should do Semantics: a Reply to Cappelen. Ryan M. Nefdt - 2019 - Review of Philosophy and Psychology 10 (1):243-256.
    In this paper, I address a series of arguments recently put forward by Cappelen (Review of Philosophy and Psychology 8: 743–762) to the effect that philosophers should not do formal semantics or be concerned with the “minutiae of natural language semantics”. He offers two paths for accessing his ideas. I argue that his arguments fail in favour of the first and cast some doubt on the second in so doing. I then proffer an alternative conception of why exactly philosophers should (...)
  • Dendrophobia in Bonobo Comprehension of Spoken English. Robert Truswell - 2017 - Mind and Language 32 (4):395-415.
    Data from a study by Savage-Rumbaugh and colleagues on comprehension of spoken English requests by a bonobo and a human infant supports Fitch's hypothesis that humans exhibit dendrophilia, or a propensity to manipulate tree structures, to a greater extent than other species. However, findings from language acquisition suggest that human infants do not show an initial preference for certain hierarchical syntactic structures. Infants are slow to acquire and generalize the structures in question, but they can eventually do so. Kanzi, in (...)
  • The language faculty that wasn't: a usage-based account of natural language recursion. Morten H. Christiansen & Nick Chater - 2015 - Frontiers in Psychology 6:150920.
    In the generative tradition, the language faculty has been shrinking—perhaps to include only the mechanism of recursion. This paper argues that even this view of the language faculty is too expansive. We first argue that a language faculty is difficult to reconcile with evolutionary considerations. We then focus on recursion as a detailed case study, arguing that our ability to process recursive structure does not rely on recursion as a property of the grammar, but instead emerges gradually by piggybacking on (...)
  • Labels, cognomes, and cyclic computation: an ethological perspective. Elliot Murphy - 2015 - Frontiers in Psychology 6:144329.
    For the past two decades, it has widely been assumed by linguists that there is a single computational operation, Merge, which is unique to language, distinguishing it from other cognitive domains. The intention of this paper is to progress the discussion of language evolution in two ways: (i) survey what the ethological record reveals about the uniqueness of the human computational system, and (ii) explore how syntactic theories account for what ethology may determine to be human-specific. It is shown that (...)
  • Expectation-based syntactic comprehension. Roger Levy - 2008 - Cognition 106 (3):1126-1177.
  • Paycheck Pronouns, Bach-Peters Sentences, and Variable-Free Semantics. Pauline Jacobson - 2000 - Natural Language Semantics 8 (2):77-155.
    This paper argues for the hypothesis of direct compositionality (as in, e.g., Montague 1974), according to which the combinatory syntactic rules specify a set of well-formed expressions while the semantic combinatory rules work in tandem to directly supply a model-theoretic interpretation to each expression as it is "built" in the syntax. (This thus obviates the need for any level like LF and, concomitantly, for any rules mapping surface structures to such a level.) I focus here on one related group of (...)
  • Restricting grammatical complexity. Robert Frank - 2004 - Cognitive Science 28 (5):669-697.
    Theories of natural language syntax often characterize grammatical knowledge as a form of abstract computation. This paper argues that such a characterization is correct, and that fundamental properties of grammar can and should be understood in terms of restrictions on the complexity of possible grammatical computation, when defined in terms of generative capacity. More specifically, the paper demonstrates that the computational restrictiveness imposed by Tree Adjoining Grammar provides important insights into the nature of human grammatical knowledge.
  • Varieties of crossing dependencies: structure dependence and mild context sensitivity. Edward P. Stabler - 2004 - Cognitive Science 28 (5):699-720.
    Four different kinds of grammars that can define crossing dependencies in human language are compared here: (i) context sensitive rewrite grammars with rules that depend on context, (ii) matching grammars with constraints that filter the generative structure of the language, (iii) copying grammars which can copy structures of unbounded size, and (iv) generating grammars in which crossing dependencies are generated from a finite lexical basis. Context sensitive rewrite grammars are syntactically, semantically and computationally unattractive. Generating grammars have a collection of (...)
  • The processing of extraposed structures in English. Roger Levy, Evelina Fedorenko, Mara Breen & Edward Gibson - 2012 - Cognition 122 (1):12-36.
  • Constructions and Grammatical Explanation: Comments on Goldberg. David Adger - 2013 - Mind and Language 28 (4):466-478.
  • The Central Question in Comparative Syntactic Metatheory. Geoffrey K. Pullum - 2013 - Mind and Language 28 (4):492-521.
    Two kinds of theoretical framework for syntax are encountered in current linguistics. One emerged from the mathematization of proof theory, and is referred to here as generative-enumerative syntax (GES). A less explored alternative stems from the semantic side of logic, and is here called model-theoretic syntax (MTS). I sketch the outlines of each, and give a capsule summary of some mathematical results pertaining to the latter. I then briefly survey some diverse types of evidence suggesting that in some ways MTS (...)
  • Implicit Acquisition of Grammars With Crossed and Nested Non-Adjacent Dependencies: Investigating the Push-Down Stack Model. Julia Uddén, Martin Ingvar, Peter Hagoort & Karl M. Petersson - 2012 - Cognitive Science 36 (6):1078-1101.
    A recent hypothesis in empirical brain research on language is that the fundamental difference between animal and human communication systems is captured by the distinction between finite-state and more complex phrase-structure grammars, such as context-free and context-sensitive grammars. However, the relevance of this distinction for the study of language as a neurobiological system has been questioned and it has been suggested that a more relevant and partly analogous distinction is that between non-adjacent and adjacent dependencies. Online memory resources are central (...)
  • Systematicity and Natural Language Syntax. Barbara C. Scholz - 2007 - Croatian Journal of Philosophy 7 (3):375-402.
    A lengthy debate in the philosophy of the cognitive sciences has turned on whether the phenomenon known as ‘systematicity’ of language and thought shows that connectionist explanatory aspirations are misguided. We investigate the issue of just which phenomenon ‘systematicity’ is supposed to be. The much-rehearsed examples always suggest that being systematic has something to do with ways in which some parts of expressions in natural languages (and, more conjecturally, some parts of thoughts) can be substituted for others without altering well-formedness. (...)
  • Gapping as constituent coordination. Mark J. Steedman - 1990 - Linguistics and Philosophy 13 (2):207-263.
  • A linear precedence account of cross-serial dependencies. Almerindo E. Ojeda - 1988 - Linguistics and Philosophy 11 (4):457-492.
  • The emergence of syntactic structure. Marcus Kracht - 2007 - Linguistics and Philosophy 30 (1):47-95.
    The present paper is the result of a long struggle to understand how the notion of compositionality can be used to motivate the structure of a sentence. While everyone seems to have intuitions about which proposals are compositional and which ones are not, these intuitions generally have no formal basis. What is needed to make such arguments work is a proper understanding of what meanings are and how they can be manipulated. In particular, we need a definition of meaning that (...)
  • Residuation, structural rules and context freeness. Gerhard Jäger - 2004 - Journal of Logic, Language and Information 13 (1):47-59.
    The article presents proofs of the context freeness of a family of type-logical grammars, namely all grammars that are based on a uni- or multimodal logic of pure residuation, possibly enriched with the structural rules of Permutation and Expansion for binary modes.
  • On the Necessity of U-Shaped Learning. Lorenzo Carlucci & John Case - 2013 - Topics in Cognitive Science 5 (1):56-88.
    A U-shaped curve in a cognitive-developmental trajectory refers to a three-step process: good performance followed by bad performance followed by good performance once again. U-shaped curves have been observed in a wide variety of cognitive-developmental and learning contexts. U-shaped learning seems to contradict the idea that learning is a monotonic, cumulative process and thus constitutes a challenge for competing theories of cognitive development and learning. U-shaped behavior in language learning (in particular in learning English past tense) has become a central (...)
  • Querying linguistic trees. Catherine Lai & Steven Bird - 2010 - Journal of Logic, Language and Information 19 (1):53-73.
    Large databases of linguistic annotations are used for testing linguistic hypotheses and for training language processing models. These linguistic annotations are often syntactic or prosodic in nature, and have a hierarchical structure. Query languages are used to select particular structures of interest, or to project out large slices of a corpus for external analysis. Existing languages suffer from a variety of problems in the areas of expressiveness, efficiency, and naturalness for linguistic query. We describe the domain of linguistic trees and (...)
  • Unbounded syntactic copying in Mandarin Chinese. Daniel Radzinski - 1990 - Linguistics and Philosophy 13 (1):113-127.
  • Dutch as a formal language. Alexis Manaster-Ramer - 1987 - Linguistics and Philosophy 10 (2):221-246.
  • Complex predicates and liberation in Dutch and English. Jack Hoeksema - 1991 - Linguistics and Philosophy 14 (6):661-710.
  • Language Learnability in the Limit: A Generalization of Gold’s Theorem. Fernando C. Alves - 2023 - Journal of Logic, Language and Information 32 (3):363-372.
    In his pioneering work in the field of inductive inference, Gold (Inf Control 10:447–474, 1967) proved that a set containing all finite languages and at least one infinite language over the same fixed alphabet is not identifiable in the limit (learnable in the exact sense) from complete texts. Gold’s work paved the way for computational learning theories of language and has implications for two linguistically relevant classes in the Chomsky hierarchy (cf. Chomsky in Inf Control 2:137–167, 1959, Chomsky in Knowledge (...)
  • Variation in mild context-sensitivity. Robert Frank & Tim Hunter - 2021 - Evolutionary Linguistic Theory 3 (2):181-214.
    Aravind Joshi famously hypothesized that natural language syntax was characterized (in part) by mildly context-sensitive generative power. Subsequent work in mathematical linguistics over the past three decades has revealed surprising convergences among a wide variety of grammatical formalisms, all of which can be said to be mildly context-sensitive. But this convergence is not absolute. Not all mildly context-sensitive formalisms can generate exactly the same stringsets (i.e. they are not all weakly equivalent), and even when two formalisms can both generate a (...)
  • Performance of Deaf Participants in an Abstract Visual Grammar Learning Task at Multiple Formal Levels: Evaluating the Auditory Scaffolding Hypothesis. Beatrice Giustolisi, Jordan S. Martin, Gesche Westphal-Fitch, W. Tecumseh Fitch & Carlo Cecchetto - 2022 - Cognitive Science 46 (2):e13114.
  • Infinitary action logic with exponentiation. Stepan L. Kuznetsov & Stanislav O. Speranski - 2022 - Annals of Pure and Applied Logic 173 (2):103057.