  • Public language, private language, and subsymbolic theories of mind.Gabe Dupre - 2023 - Mind and Language 38 (2):394-412.
    Language has long been a problem‐case for subsymbolic theories of mind. The reason for this is obvious: Language seems essentially symbolic. However, recent work has developed a potential solution to this problem, arguing that linguistic symbols are public objects which augment a fundamentally subsymbolic mind, rather than components of cognitive symbol‐processing. I shall argue that this strategy cannot work, on the grounds that human language acquisition consists in projecting linguistic structure onto environmental entities, rather than extracting this structure from them.
  • Dynamical Systems Implementation of Intrinsic Sentence Meaning.Hermann Moisl - 2022 - Minds and Machines 32 (4):627-653.
    This paper proposes a model for implementation of intrinsic natural language sentence meaning in a physical language understanding system, where 'intrinsic' is understood as 'independent of meaning ascription by system-external observers'. The proposal is that intrinsic meaning can be implemented as a point attractor in the state space of a nonlinear dynamical system with feedback which is generated by temporally sequenced inputs. It is motivated by John Searle's well known (Behavioral and Brain Sciences, 3: 417–57, 1980) critique of the then-standard (...)
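The attractor idea in the entry above can be illustrated with a minimal sketch (a toy example of my own, not Moisl's actual system): a one-dimensional nonlinear system with feedback is driven by a temporally sequenced input, and once the input ends its free-running dynamics settle on a point attractor.

```python
import math

def step(state, inp, w=0.5, u=0.3):
    # One update of a toy nonlinear system with feedback: the new state
    # depends on the previous state (feedback) and the current input.
    return math.tanh(w * state + u * inp)

state = 0.0
for inp in [1, 0, 1]:          # a temporally sequenced input
    state = step(state, inp)

for _ in range(100):           # free-running dynamics after the input ends
    state = step(state, 0.0)

# With w < 1, the only fixed point of x = tanh(w * x) is x = 0, so the
# state contracts toward that point attractor once the input stops.
```

In this contractive toy system every input trajectory relaxes to the same attractor; implementing distinct intrinsic meanings as distinct attractors, as the paper proposes, would require richer dynamics with multiple basins of attraction.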
  • Learning a Generative Probabilistic Grammar of Experience: A Process‐Level Model of Language Acquisition.Oren Kolodny, Arnon Lotem & Shimon Edelman - 2015 - Cognitive Science 39 (2):227-267.
    We introduce a set of biologically and computationally motivated design choices for modeling the learning of language, or of other types of sequential, hierarchically structured experience and behavior, and describe an implemented system that conforms to these choices and is capable of unsupervised learning from raw natural‐language corpora. Given a stream of linguistic input, our model incrementally learns a grammar that captures its statistical patterns, which can then be used to parse or generate new data. The grammar constructed in this (...)
  • Doing Without Schema Hierarchies: A Recurrent Connectionist Approach to Normal and Impaired Routine Sequential Action.Matthew Botvinick & David C. Plaut - 2004 - Psychological Review 111 (2):395-429.
  • The language faculty that wasn't: a usage-based account of natural language recursion.Morten H. Christiansen & Nick Chater - 2015 - Frontiers in Psychology 6:150920.
    In the generative tradition, the language faculty has been shrinking—perhaps to include only the mechanism of recursion. This paper argues that even this view of the language faculty is too expansive. We first argue that a language faculty is difficult to reconcile with evolutionary considerations. We then focus on recursion as a detailed case study, arguing that our ability to process recursive structure does not rely on recursion as a property of the grammar, but instead emerges gradually by piggybacking on (...)
  • The Now-or-Never bottleneck: A fundamental constraint on language.Morten H. Christiansen & Nick Chater - 2016 - Behavioral and Brain Sciences 39:e62.
    Memory is fleeting. New material rapidly obliterates previous material. How, then, can the brain deal successfully with the continual deluge of linguistic input? We argue that, to deal with this “Now-or-Never” bottleneck, the brain must compress and recode linguistic input as rapidly as possible. This observation has strong implications for the nature of language processing: (1) the language system must “eagerly” recode and compress linguistic input; (2) as the bottleneck recurs at each new representational level, the language system must build (...)
  • Grammatical pattern learning by human infants and cotton-top tamarin monkeys.Jenny Saffran, Marc Hauser, Rebecca Seibel, Joshua Kapfhamer, Fritz Tsao & Fiery Cushman - 2008 - Cognition 107 (2):479-500.
  • Sequential Expectations: The Role of Prediction‐Based Learning in Language.Jennifer B. Misyak, Morten H. Christiansen & J. Bruce Tomblin - 2010 - Topics in Cognitive Science 2 (1):138-153.
    Prediction‐based processes appear to play an important role in language. Few studies, however, have sought to test the relationship within individuals between prediction learning and natural language processing. This paper builds upon existing statistical learning work using a novel paradigm for studying the on‐line learning of predictive dependencies. Within this paradigm, a new “prediction task” is introduced that provides a sensitive index of individual differences for developing probabilistic sequential expectations. Across three interrelated experiments, the prediction task and results thereof are (...)
  • Learning non-local dependencies.Gustav Kuhn & Zoltán Dienes - 2008 - Cognition 106 (1):184-206.
  • Linguistic complexity: locality of syntactic dependencies.Edward Gibson - 1998 - Cognition 68 (1):1-76.
  • The nature of the memory buffer in implicit learning: Learning Chinese tonal symmetries.Feifei Li, Shan Jiang, Xiuyan Guo, Zhiliang Yang & Zoltan Dienes - 2013 - Consciousness and Cognition 22 (3):920-930.
    Previous research has established that people can implicitly learn chunks, which do not require a memory buffer to process. The present study explores the implicit learning of nonlocal dependencies generated by higher than finite-state grammars, specifically, Chinese tonal retrogrades and inversions, which do require buffers. People were asked to listen to and memorize artificial poetry instantiating one of the two grammars; after this training phase, people were informed of the existence of rules and asked to classify new poems, (...)
  • An alternative view of the mental lexicon.Jeffrey L. Elman - 2004 - Trends in Cognitive Sciences 8 (7):301-306.
    An essential aspect of knowing language is knowing the words of that language. This knowledge is usually thought to reside in the mental lexicon, a kind of dictionary that contains information regarding a word’s meaning, pronunciation, syntactic characteristics, and so on. In this article, a very different view is presented. In this view, words are understood as stimuli that operate directly on mental states. The phonological, syntactic and semantic properties of a word are revealed by the effects it has on (...)
  • Connectionist Models of Language Production: Lexical Access and Grammatical Encoding.Gary S. Dell, Franklin Chang & Zenzi M. Griffin - 1999 - Cognitive Science 23 (4):517-542.
    Theories of language production have long been expressed as connectionist models. We outline the issues and challenges that must be addressed by connectionist models of lexical access and grammatical encoding, and review three recent models. The models illustrate the value of an interactive activation approach to lexical access in production, the need for sequential output in both phonological and grammatical encoding, and the potential for accounting for structural effects on errors and structural priming from learning.
  • Connectionist Natural Language Processing: The State of the Art.Morten H. Christiansen & Nick Chater - 1999 - Cognitive Science 23 (4):417-437.
    This Special Issue on Connectionist Models of Human Language Processing provides an opportunity for an appraisal both of specific connectionist models and of the status and utility of connectionist models of language in general. This introduction provides the background for the papers in the Special Issue. The development of connectionist models of language is traced, from their intellectual origins, to the state of current research. Key themes that arise throughout different areas of connectionist psycholinguistics are highlighted, and recent developments in (...)
  • Distributional Information: A Powerful Cue for Acquiring Syntactic Categories.Martin Redington, Nick Chater & Steven Finch - 1998 - Cognitive Science 22 (4):425-469.
    Many theorists have dismissed a priori the idea that distributional information could play a significant role in syntactic category acquisition. We demonstrate empirically that such information provides a powerful cue to syntactic category membership, which can be exploited by a variety of simple, psychologically plausible mechanisms. We present a range of results using a large corpus of child‐directed speech and explore their psychological implications. While our results show that a considerable amount of information concerning the syntactic categories can be obtained (...)
  • The myth of language universals: Language diversity and its importance for cognitive science.Nicholas Evans & Stephen C. Levinson - 2009 - Behavioral and Brain Sciences 32 (5):429-448.
    Talk of linguistic universals has given cognitive scientists the impression that languages are all built to a common pattern. In fact, there are vanishingly few universals of language in the direct sense that all languages exhibit them. Instead, diversity can be found at almost every level of linguistic organization. This fundamentally changes the object of enquiry from a cognitive science perspective. This target article summarizes decades of cross-linguistic work by typologists and descriptive linguists, showing just how few and unprofound the (...)
  • Connectionism.James Garson & Cameron Buckner - 2019 - Stanford Encyclopedia of Philosophy.
  • Artificial syntactic violations activate Broca's region.K. Petersson - 2004 - Cognitive Science 28 (3):383-407.
    In the present study, using event-related functional magnetic resonance imaging, we investigated a group of participants on a grammaticality classification task after they had been exposed to well-formed consonant strings generated from an artificial regular grammar. We used an implicit acquisition paradigm in which the participants were exposed to positive examples. The objective of this study was to investigate whether brain regions related to language processing overlap with the brain regions activated by the grammaticality classification task used in the present study. (...)
  • Harmony in Linguistic Cognition.Paul Smolensky - 2006 - Cognitive Science 30 (5):779-801.
    In this article, I survey the integrated connectionist/symbolic (ICS) cognitive architecture in which higher cognition must be formally characterized on two levels of description. At the microlevel, parallel distributed processing (PDP) characterizes mental processing; this PDP system has special organization in virtue of which it can be characterized at the macrolevel as a kind of symbolic computational system. The symbolic system inherits certain properties from its PDP substrate; the symbolic functions computed constitute optimization of a well-formedness measure called Harmony. The (...)
  • The Linguistic Subversion of Mental Representation.Whit Schonbein - 2012 - Minds and Machines 22 (3):235-262.
    Embedded and embodied approaches to cognition urge that (1) complicated internal representations may be avoided by letting features of the environment drive behavior, and (2) environmental structures can play an enabling role in cognition, allowing prior cognitive processes to solve novel tasks. Such approaches are thus in a natural position to oppose the ‘thesis of linguistic structuring’: The claim that the ability to use language results in a wholesale recapitulation of linguistic structure in onboard mental representation. Prominent examples of researchers (...)
  • Connectionist Sentence Processing in Perspective.Mark Steedman - 1999 - Cognitive Science 23 (4):615-634.
    The emphasis in the connectionist sentence‐processing literature on distributed representation and emergence of grammar from such systems can easily obscure the often close relations between connectionist and symbolist systems. This paper argues that the Simple Recurrent Network (SRN) models proposed by Jordan (1989) and Elman (1990) are more directly related to stochastic Part‐of‐Speech (POS) Taggers than to parsers or grammars as such, while auto‐associative memory models of the kind pioneered by Longuet–Higgins, Willshaw, Pollack and others may be useful for grammar (...)
  • Specifically Human: Going Beyond Perceptual Syntax. [REVIEW]Piera Filippi - 2014 - Biosemiotics 7 (1):111-123.
    The aim of this paper is to help refine the definition of humans as “linguistic animals” in light of a comparative approach on nonhuman animals’ cognitive systems. As Uexküll & Kriszat (1934/1992) have theorized, the epistemic access to each species-specific environment (Umwelt) is driven by different biocognitive processes. Within this conceptual framework, I identify the salient cognitive process that distinguishes each species typical perception of the world as the faculty of language meant in the following operational definition: the ability to (...)
  • Recursion Isn’t Necessary for Human Language Processing: NEAR (Non-iterative Explicit Alternatives Rule) Grammars are Superior.Kenneth R. Paap & Derek Partridge - 2014 - Minds and Machines 24 (4):389-414.
    Language sciences have long maintained a close and supposedly necessary coupling between the infinite productivity of the human language faculty and recursive grammars. Because of the formal equivalence between recursion and non-recursive iteration, recursion in the technical sense is never a necessary component of a generative grammar. Contrary to some assertions, this equivalence extends to both center-embedded relative clauses and hierarchical parse trees. Inspection of language usage suggests that recursive rule components in fact contribute very little, and likely nothing significant, (...)
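The recursion/iteration equivalence that the entry above appeals to is easy to see for the canonical counting-recursion language aⁿbⁿ. Below is a sketch of a recognizer that uses a single counter and two loops, with no recursive calls (an illustration of the general point, not the paper's NEAR grammars):

```python
def accepts_anbn(s):
    # Recognize a^n b^n (n >= 1) iteratively: count the leading a's,
    # then count them back down against the b's. No recursion needed.
    i, count = 0, 0
    while i < len(s) and s[i] == 'a':
        i, count = i + 1, count + 1
    while i < len(s) and s[i] == 'b':
        i, count = i + 1, count - 1
    # Accept only if the whole string was consumed, the counts matched,
    # and the string was non-empty.
    return i == len(s) and count == 0 and len(s) > 0

# accepts_anbn("aabb") -> True; accepts_anbn("abab") -> False
```

The counter plays the role that a call stack would play in a recursive recognizer, which is exactly why the two formulations generate the same language.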
  • On the Meaning of Words and Dinosaur Bones: Lexical Knowledge Without a Lexicon.Jeffrey L. Elman - 2009 - Cognitive Science 33 (4):547-582.
    Although for many years a sharp distinction has been made in language research between rules and words—with primary interest on rules—this distinction is now blurred in many theories. If anything, the focus of attention has shifted in recent years in favor of words. Results from many different areas of language research suggest that the lexicon is representationally rich, that it is the source of much productive behavior, and that lexically specific information plays a critical and early role in the interpretation (...)
  • Consequences of the Serial Nature of Linguistic Input for Sentenial Complexity.Daniel Grodner & Edward Gibson - 2005 - Cognitive Science 29 (2):261-290.
    All other things being equal the parser favors attaching an ambiguous modifier to the most recent possible site. A plausible explanation is that locality preferences such as this arise in the service of minimizing memory costs—more distant sentential material is more difficult to reactivate than more recent material. Note that processing any sentence requires linking each new lexical item with material in the current parse. This often involves the construction of long‐distance dependencies. Under a resource‐limited view of language processing, lengthy (...)
  • Cooperation, psychological game theory, and limitations of rationality in social interaction.Andrew M. Colman - 2003 - Behavioral and Brain Sciences 26 (2):139-153.
    Rational choice theory enjoys unprecedented popularity and influence in the behavioral and social sciences, but it generates intractable problems when applied to socially interactive decisions. In individual decisions, instrumental rationality is defined in terms of expected utility maximization. This becomes problematic in interactive decisions, when individuals have only partial control over the outcomes, because expected utility maximization is undefined in the absence of assumptions about how the other participants will behave. Game theory therefore incorporates not only rationality but also common (...)
  • Psycholinguistics, computational.Richard L. Lewis - 2003 - In L. Nadel (ed.), Encyclopedia of Cognitive Science. Nature Publishing Group.
  • Expectation-based syntactic comprehension.Roger Levy - 2008 - Cognition 106 (3):1126-1177.
  • Centre-embedded structures are a by-product of associative learning and working memory constraints: Evidence from baboons (Papio papio).Arnaud Rey, Pierre Perruchet & Joël Fagot - 2012 - Cognition 123 (1):180-184.
  • Two ways of learning associations.Luke Boucher & Zoltán Dienes - 2003 - Cognitive Science 27 (6):807-842.
    How people learn chunks or associations between adjacent items in sequences was modelled. Two previously successful models of how people learn artificial grammars were contrasted: the CCN, a network version of the competitive chunker of Servan‐Schreiber and Anderson [J. Exp. Psychol.: Learn. Mem. Cogn. 16 (1990) 592], which produces local and compositionally‐structured chunk representations acquired incrementally; and the simple recurrent network (SRN) of Elman [Cogn. Sci. 14 (1990) 179], which acquires distributed representations through error correction. The models' susceptibility to two (...)
  • With diversity in mind: Freeing the language sciences from Universal Grammar.Nicholas Evans & Stephen C. Levinson - 2009 - Behavioral and Brain Sciences 32 (5):472-492.
    Our response takes advantage of the wide-ranging commentary to clarify some aspects of our original proposal and augment others. We argue against the generative critics of our coevolutionary program for the language sciences, defend the use of close-to-surface models as minimizing cross-linguistic data distortion, and stress the growing role of stochastic simulations in making generalized historical accounts testable. These methods lead the search for general principles away from idealized representations and towards selective processes. Putting cultural evolution central in understanding language (...)
  • Language learning in infancy: Does the empirical evidence support a domain specific language acquisition device?Christina Behme & Helene Deacon - 2008 - Philosophical Psychology 21 (5):641 – 671.
    Poverty of the Stimulus Arguments have convinced many linguists and philosophers of language that a domain specific language acquisition device (LAD) is necessary to account for language learning. Here we review empirical evidence that casts doubt on the necessity of this domain specific device. We suggest that more attention needs to be paid to the early stages of language acquisition. Many seemingly innate language-related abilities have to be learned over the course of several months. Further, the language input contains rich (...)
  • Input and Age‐Dependent Variation in Second Language Learning: A Connectionist Account.Marius Janciauskas & Franklin Chang - 2018 - Cognitive Science 42 (S2):519-554.
    Language learning requires linguistic input, but several studies have found that knowledge of second language rules does not seem to improve with more language exposure. One reason for this is that previous studies did not factor out variation due to the different rules tested. To examine this issue, we reanalyzed grammaticality judgment scores in Flege, Yeni-Komshian, and Liu's study of L2 learners using rule-related predictors and found that, in addition to the overall drop in performance due to a sensitive period, (...)
  • Discovery of a Recursive Principle: An Artificial Grammar Investigation of Human Learning of a Counting Recursion Language.Pyeong Whan Cho, Emily Szkudlarek & Whitney Tabor - 2016 - Frontiers in Psychology 7.
  • Syntactic structure assembly in human parsing: a computational model based on competitive inhibition and a lexicalist grammar.Theo Vosse & Gerard Kempen - 2000 - Cognition 75 (2):105-143.
  • Dynamical Models of Sentence Processing.Whitney Tabor & Michael K. Tanenhaus - 1999 - Cognitive Science 23 (4):491-515.
    We suggest that the theory of dynamical systems provides a revealing general framework for modeling the representations and mechanism underlying syntactic processing. We show how a particular dynamical model, the Visitation Set Gravitation model of Tabor, Juliano, and Tanenhaus (1997), develops syntactic representations and models a set of contingent frequency effects in parsing that are problematic for other models. We also present new simulations showing how the model accounts for semantic effects in parsing, and propose a new account of the (...)
  • Fractal Analysis Illuminates the Form of Connectionist Structural Gradualness.Whitney Tabor, Pyeong Whan Cho & Emily Szkudlarek - 2013 - Topics in Cognitive Science 5 (3):634-667.
    We examine two connectionist networks—a fractal learning neural network (FLNN) and a Simple Recurrent Network (SRN)—that are trained to process center-embedded symbol sequences. Previous work provides evidence that connectionist networks trained on infinite-state languages tend to form fractal encodings. Most such work focuses on simple counting recursion cases (e.g., anbn), which are not comparable to the complex recursive patterns seen in natural language syntax. Here, we consider exponential state growth cases (including mirror recursion), describe a new training scheme that seems (...)
  • Implicit Acquisition of Grammars With Crossed and Nested Non-Adjacent Dependencies: Investigating the Push-Down Stack Model.Julia Uddén, Martin Ingvar, Peter Hagoort & Karl M. Petersson - 2012 - Cognitive Science 36 (6):1078-1101.
    A recent hypothesis in empirical brain research on language is that the fundamental difference between animal and human communication systems is captured by the distinction between finite-state and more complex phrase-structure grammars, such as context-free and context-sensitive grammars. However, the relevance of this distinction for the study of language as a neurobiological system has been questioned and it has been suggested that a more relevant and partly analogous distinction is that between non-adjacent and adjacent dependencies. Online memory resources are central (...)
  • Word Order Typology Interacts With Linguistic Complexity: A Cross‐Linguistic Corpus Study.Himanshu Yadav, Ashwini Vaidya, Vishakha Shukla & Samar Husain - 2020 - Cognitive Science 44 (4):e12822.
    Much previous work has suggested that word order preferences across languages can be explained by the dependency distance minimization constraint (Ferrer‐i Cancho, 2008, 2015; Hawkins, 1994). Consistent with this claim, corpus studies have shown that the average distance between a head (e.g., verb) and its dependent (e.g., noun) tends to be short cross‐linguistically (Ferrer‐i Cancho, 2014; Futrell, Mahowald, & Gibson, 2015; Liu, Xu, & Liang, 2017). This implies that on average languages avoid inefficient or complex structures for simpler structures. But (...)
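The dependency distance measure discussed in the entry above can be sketched in a few lines (a toy illustration under my own encoding assumptions, not the authors' corpus pipeline): each word's distance to its head is the absolute difference of their linear positions, averaged over the sentence.

```python
def avg_dependency_distance(heads):
    # `heads` maps each word's 1-indexed position to its head's position;
    # 0 marks the root, which contributes no dependency distance.
    dists = [abs(dep - head) for dep, head in heads.items() if head != 0]
    return sum(dists) / len(dists)

# "the dog barked": the -> dog (1 -> 2), dog -> barked (2 -> 3), barked = root
# distances 1 and 1, so the average is 1.0
```

Dependency distance minimization predicts that such averages stay small across corpora; the paper's point is that word order typology modulates how strongly this holds.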
  • Representing Types as Neural Events.Robin Cooper - 2019 - Journal of Logic, Language and Information 28 (2):131-155.
    One of the claims of Type Theory with Records is that it can be used to model types learned by agents in order to classify objects and events in the world, including speech events. That is, the types can be represented by patterns of neural activation in the brain. This claim would be empty if it turns out that the types are in principle impossible to represent on a finite network of neurons. We will discuss how to represent types in (...)
  • Under What Conditions Can Recursion Be Learned? Effects of Starting Small in Artificial Grammar Learning of Center‐Embedded Structure.Fenna H. Poletiek, Christopher M. Conway, Michelle R. Ellefson, Jun Lai, Bruno R. Bocanegra & Morten H. Christiansen - 2018 - Cognitive Science 42 (8):2855-2889.
    It has been suggested that external and/or internal limitations paradoxically may lead to superior learning, that is, the concepts of starting small and less is more (Elman, 1993; Newport, 1990). In this paper, we explore the type of incremental ordering during training that might help learning, and what mechanism explains this facilitation. We report four artificial grammar learning experiments with human participants. In Experiments 1a and 1b we found a beneficial effect of starting small using two types of simple recursive (...)
  • Learning the unlearnable: the role of missing evidence.Terry Regier & Susanne Gahl - 2004 - Cognition 93 (2):147-155.
  • Sequential learning and the interaction between biological and linguistic adaptation in language evolution.Florencia Reali & Morten H. Christiansen - 2009 - Interaction Studies 10 (1):5-30.
    It is widely assumed that language in some form or other originated by piggybacking on pre-existing learning mechanism not dedicated to language. Using evolutionary connectionist simulations, we explore the implications of such assumptions by determining the effect of constraints derived from an earlier evolved mechanism for sequential learning on the interaction between biological and linguistic adaptation across generations of language learners. Artificial neural networks were initially allowed to evolve “biologically” to improve their sequential learning abilities, after which language was introduced (...)
  • A Memory‐Based Theory of Verbal Cognition.Simon Dennis - 2005 - Cognitive Science 29 (2):145-193.
    The syntagmatic paradigmatic model is a distributed, memory‐based account of verbal processing. Built on a Bayesian interpretation of string edit theory, it characterizes the control of verbal cognition as the retrieval of sets of syntagmatic and paradigmatic constraints from sequential and relational long‐term memory and the resolution of these constraints in working memory. Lexical information is extracted directly from text using a version of the expectation maximization algorithm. In this article, the model is described and then illustrated on a number (...)
  • Symbolically speaking: a connectionist model of sentence production.Franklin Chang - 2002 - Cognitive Science 26 (5):609-651.
    The ability to combine words into novel sentences has been used to argue that humans have symbolic language production abilities. Critiques of connectionist models of language often center on the inability of these models to generalize symbolically (Fodor & Pylyshyn, 1988; Marcus, 1998). To address these issues, a connectionist model of sentence production was developed. The model had variables (role‐concept bindings) that were inspired by spatial representations (Landau & Jackendoff, 1993). In order to take advantage of these variables, a novel (...)
  • A Probabilistic Constraints Approach to Language Acquisition and Processing.Mark S. Seidenberg & Maryellen C. MacDonald - 1999 - Cognitive Science 23 (4):569-588.
    This article provides an overview of a probabilistic constraints framework for thinking about language acquisition and processing. The generative approach attempts to characterize knowledge of language (i.e., competence grammar) and then asks how this knowledge is acquired and used. Our approach is performance oriented: the goal is to explain how people comprehend and produce utterances and how children acquire this skill. Use of language involves exploiting multiple probabilistic constraints over various types of linguistic and nonlinguistic information. Acquisition is the process (...)
  • The Concept of Recursion in Cognitive Studies. Part I: From Mathematics to Cognition.I. F. Mikhailov - 2024 - Philosophical Problems of IT and Cyberspace (PhilIT&C) 1:58-76.
    The paper discusses different approaches to the concept of recursion and its evolution from mathematics to cognitive studies, including self‑embedded structures, multiple hierarchical levels governed by the same rule, and the embedding of structures within structures. The paper also discusses the concept of meta‑recursion. Examining meta‑recursion may enable understanding of the ability to apply recursive processes to multilayered hierarchies, with recursive procedures acting as generators. These types of recursive processes could be the fundamental elements of general cognition. The paper (...)