References
  • Rules and representations. Noam Chomsky - 1980 - Behavioral and Brain Sciences 3 (1):1-15.
    The book from which these sections are excerpted is concerned with the prospects for assimilating the study of human intelligence and its products to the natural sciences through the investigation of cognitive structures, understood as systems of rules and representations that can be regarded as “mental organs.” These mental structures serve as the vehicles for the exercise of various capacities. They develop in the mind on the basis of an innate endowment that permits the growth of rich and highly articulated (...)
  • Bigrams and the Richness of the Stimulus. Xuân-Nga Cao Kam, Iglika Stoyneshka, Lidiya Tornyova, Janet D. Fodor & William G. Sakas - 2008 - Cognitive Science 32 (4):771-787.
    Recent challenges to Chomsky's poverty of the stimulus thesis for language acquisition suggest that children's primary data may carry “indirect evidence” about linguistic constructions despite containing no instances of them. Indirect evidence is claimed to suffice for grammar acquisition, without need for innate knowledge. This article reports experiments based on those of Reali and Christiansen (2005), who demonstrated that a simple bigram language model can induce the correct form of auxiliary inversion in certain complex questions. This article investigates the nature of the indirect evidence (...)
  • Toward a Connectionist Model of Recursion in Human Linguistic Performance. Morten H. Christiansen & Nick Chater - 1999 - Cognitive Science 23 (2):157-205.
    Naturally occurring speech contains only a limited amount of complex recursive structure, and this is reflected in the empirically documented difficulties that people experience when processing such structures. We present a connectionist model of human performance in processing recursive language structures. The model is trained on simple artificial languages. We find that the qualitative performance profile of the model matches human behavior, both on the relative difficulty of center‐embedding and cross‐dependency, and between the processing of these complex recursive structures and (...)
  • From Exemplar to Grammar: A Probabilistic Analogy‐Based Model of Language Learning. Rens Bod - 2009 - Cognitive Science 33 (5):752-793.
    While rules and exemplars are usually viewed as opposites, this paper argues that they form end points of the same distribution. By representing both rules and exemplars as (partial) trees, we can take into account the fluid middle ground between the two extremes. This insight is the starting point for a new theory of language learning that is based on the following idea: If a language learner does not know which phrase‐structure trees should be assigned to initial sentences, s/he allows (...)
  • Connectionist Models and Their Properties. J. A. Feldman & D. H. Ballard - 1982 - Cognitive Science 6 (3):205-254.
    Much of the progress in the fields constituting cognitive science has been based upon the use of explicit information processing models, almost exclusively patterned after conventional serial computers. An extension of these ideas to massively parallel, connectionist models appears to offer a number of advantages. After a preliminary discussion, this paper introduces a general connectionist model and considers how it might be used in cognitive science. Among the issues addressed are: stability and noise‐sensitivity, distributed decision‐making, time and sequence problems, and (...)
  • Language as shaped by the brain. Morten H. Christiansen & Nick Chater - 2008 - Behavioral and Brain Sciences 31 (5):489-509.
    It is widely assumed that human learning and the structure of human languages are intimately related. This relationship is frequently suggested to derive from a language-specific biological endowment, which encodes universal, but communicatively arbitrary, principles of language structure (a Universal Grammar or UG). How might such a UG have evolved? We argue that UG could not have arisen either by biological adaptation or non-adaptationist genetic processes, resulting in a logical problem of language evolution. Specifically, as the processes of language change (...)
  • Empirical assessment of stimulus poverty arguments. Geoffrey K. Pullum - 2002 - Linguistic Review.
  • Finding Structure in Time. Jeffrey L. Elman - 1990 - Cognitive Science 14 (2):179-211.
    Time underlies many interesting human behaviors. Thus, the question of how to represent time in connectionist models is very important. One approach is to represent time implicitly by its effects on processing rather than explicitly (as in a spatial representation). The current report develops a proposal along these lines first described by Jordan (1986) which involves the use of recurrent links in order to provide networks with a dynamic memory. In this approach, hidden unit patterns are fed back to themselves: (...)
  • Rules and representations. Noam Chomsky - 1980 - New York: Columbia University Press.
    In Rules and Representations, first published in 1980, Noam Chomsky lays out many of the concepts that have made his approach to linguistics and human cognition so instrumental to our understanding of language. Chomsky arrives at his well-known position that there is a universal grammar, structured in the human mind and common to all human languages. Based on Chomsky's 1978 Woodbridge Lectures, this edition contains revised versions of the lectures and two new essays.
  • TRACX: A recognition-based connectionist framework for sequence segmentation and chunk extraction. Robert M. French, Caspar Addyman & Denis Mareschal - 2011 - Psychological Review 118 (4):614-636.
  • Modeling human performance in statistical word segmentation. Michael C. Frank, Sharon Goldwater, Thomas L. Griffiths & Joshua B. Tenenbaum - 2010 - Cognition 117 (2):107-125.
  • Uncovering the Richness of the Stimulus: Structure Dependence and Indirect Statistical Evidence. Florencia Reali & Morten H. Christiansen - 2005 - Cognitive Science 29 (6):1007-1028.
    The poverty of stimulus argument is one of the most controversial arguments in the study of language acquisition. Here we follow previous approaches challenging the assumption of impoverished primary linguistic data, focusing on the specific problem of auxiliary (AUX) fronting in complex polar interrogatives. We develop a series of corpus analyses of child‐directed speech showing that there is indirect statistical information useful for correct auxiliary fronting in polar interrogatives and that such information is sufficient for distinguishing between grammatical and ungrammatical (...)
  • The phonological loop as a language learning device. Alan Baddeley, Susan Gathercole & Costanza Papagno - 1998 - Psychological Review 105 (1):158-173.
  • Memory for serial order: A network model of the phonological loop and its timing. Neil Burgess & Graham J. Hitch - 1999 - Psychological Review 106 (3):551-581.
  • On the nature of minds, or: Truth and consequences. Shimon Edelman - 2008 - Journal of Experimental and Theoretical Artificial Intelligence 20:181-196.
    Are minds really dynamical or are they really symbolic? Because minds are bundles of computations, and because computation is always a matter of interpretation of one system by another, minds are necessarily symbolic. Because minds, along with everything else in the universe, are physical, and insofar as the laws of physics are dynamical, minds are necessarily dynamical systems. Thus, the short answer to the opening question is “yes.” It makes sense to ask further whether some of the computations that constitute (...)
  • Learn Locally, Act Globally: Learning Language from Variation Set Cues. Luca Onnis, Heidi R. Waterfall & Shimon Edelman - 2008 - Cognition 109 (3):423.
  • Distributional regularity and phonotactic constraints are useful for segmentation. Michael R. Brent & Timothy A. Cartwright - 1996 - Cognition 61 (1-2):93-125.
  • Syntactic Structures. J. F. Staal - 1966 - Journal of Symbolic Logic 31 (2):245-251.
  • A Memory‐Based Theory of Verbal Cognition. Simon Dennis - 2005 - Cognitive Science 29 (2):145-193.
    The syntagmatic paradigmatic model is a distributed, memory‐based account of verbal processing. Built on a Bayesian interpretation of string edit theory, it characterizes the control of verbal cognition as the retrieval of sets of syntagmatic and paradigmatic constraints from sequential and relational long‐term memory and the resolution of these constraints in working memory. Lexical information is extracted directly from text using a version of the expectation maximization algorithm. In this article, the model is described and then illustrated on a number (...)
  • Lexical and Sublexical Units in Speech Perception. Ibrahima Giroux & Arnaud Rey - 2009 - Cognitive Science 33 (2):260-272.
    Saffran, Newport, and Aslin (1996a) found that human infants are sensitive to statistical regularities corresponding to lexical units when hearing an artificial spoken language. Two sorts of segmentation strategies have been proposed to account for this early word‐segmentation ability: bracketing strategies, in which infants are assumed to insert boundaries into continuous speech, and clustering strategies, in which infants are assumed to group certain speech sequences together into units (Swingley, 2005). In the present study, we test the predictions of two computational (...)
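
Two of the entries above (Reali & Christiansen 2005; Kam et al. 2008) turn on whether a simple bigram model trained on child-directed speech assigns higher probability to grammatical than to ungrammatical auxiliary fronting. The sketch below only illustrates that kind of comparison; the toy corpus, the add-one smoothing, and the two test questions are assumptions made for demonstration, not the corpora or code used in either study.

```python
import math
from collections import Counter

# Toy stand-in for child-directed speech (an assumption for illustration,
# not the corpus data analyzed in the cited papers).
corpus = [
    "is the dog hungry",
    "the boy who is smiling is happy",
    "is the baby who is crying tired",
    "who is happy",
    "the dog is hungry",
]

def bigrams(tokens):
    padded = ["<s>"] + tokens + ["</s>"]
    return list(zip(padded, padded[1:]))

unigram_counts = Counter()
bigram_counts = Counter()
for sentence in corpus:
    tokens = sentence.split()
    unigram_counts.update(["<s>"] + tokens)
    bigram_counts.update(bigrams(tokens))

vocab_size = len(unigram_counts) + 1  # +1 leaves room for unseen words

def log_prob(sentence):
    """Add-one-smoothed bigram log-probability of a sentence."""
    total = 0.0
    for w1, w2 in bigrams(sentence.split()):
        numerator = bigram_counts[(w1, w2)] + 1
        denominator = unigram_counts[w1] + vocab_size
        total += math.log(numerator / denominator)
    return total

grammatical = "is the boy who is smiling happy"    # correct auxiliary fronting
ungrammatical = "is the boy who smiling is happy"  # structure-independent error
print(log_prob(grammatical), log_prob(ungrammatical))
# With this toy corpus the grammatical question scores higher, because its
# local word pairs ("who is", "is smiling") occur in the training sentences.
```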
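
Elman's "Finding Structure in Time" (cited above) represents time implicitly by copying the hidden layer back as a context input on the next step. The fragment below is a minimal sketch of that recurrence; the layer sizes, random weights, and one-hot inputs are arbitrary choices for illustration, not Elman's simulations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative layer sizes (assumptions, not Elman's original settings).
n_input, n_hidden, n_output = 5, 8, 5

W_ih = rng.normal(scale=0.1, size=(n_hidden, n_input))   # input -> hidden
W_ch = rng.normal(scale=0.1, size=(n_hidden, n_hidden))  # context -> hidden
W_ho = rng.normal(scale=0.1, size=(n_output, n_hidden))  # hidden -> output

def srn_forward(sequence):
    """Run one-hot input vectors through a simple recurrent network:
    the hidden pattern at step t becomes the context input at step t+1,
    giving the network a dynamic memory for the sequence so far."""
    context = np.zeros(n_hidden)
    outputs = []
    for x in sequence:
        hidden = np.tanh(W_ih @ x + W_ch @ context)
        outputs.append(W_ho @ hidden)  # e.g. a prediction of the next element
        context = hidden.copy()        # feed the hidden state back
    return outputs

# Example: a short sequence of one-hot "word" vectors.
sequence = [np.eye(n_input)[i] for i in (0, 2, 1)]
predictions = srn_forward(sequence)
```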