  • Computers Aren’t Syntax All the Way Down or Content All the Way Up. Cem Bozşahin - 2018 - Minds and Machines 28 (3):543-567.
    This paper argues that the idea of a computer is unique. Calculators and analog computers are not different ideas about computers, and nature does not compute by itself. Computers, once clearly defined in all their terms and mechanisms, rather than enumerated by behavioral examples, can be more than instrumental tools in science, and more than a source of analogies and taxonomies in philosophy. They can help us understand semantic content and its relation to form. This can be achieved because they have (...)
  • Finding Structure in One Child's Linguistic Experience. Wentao Wang, Wai Keen Vong, Najoung Kim & Brenden M. Lake - 2023 - Cognitive Science 47 (6):e13305.
    Neural network models have recently made striking progress in natural language processing, but they are typically trained on orders of magnitude more language input than children receive. What can these neural networks, which are primarily distributional learners, learn from a naturalistic subset of a single child's experience? We examine this question using a recent longitudinal dataset collected from a single child, consisting of egocentric visual data paired with text transcripts. We train both language-only and vision-and-language neural networks and analyze the (...)
  • The Power of Ignoring: Filtering Input for Argument Structure Acquisition. Laurel Perkins, Naomi H. Feldman & Jeffrey Lidz - 2022 - Cognitive Science 46 (1):e13080.
  • A Single Paradigm for Implicit and Statistical Learning. Padraic Monaghan, Christine Schoetensack & Patrick Rebuschat - 2019 - Topics in Cognitive Science 11 (3):536-554.
    This article focuses on the implicit statistical learning of words and syntax. Monaghan, Schoetensack and Rebuschat introduce a novel paradigm that combines theoretical and methodological insights from two research traditions, implicit learning and statistical learning. Their cross‐situational learning paradigm has been used in the statistical learning literature, while their measures of awareness have been widely used in implicit learning research. They illustrate how the two literatures can be conjoined in a single paradigm to explore implicit statistical learning.
  • The emergence of language. Mark Steedman - 2017 - Mind and Language 32 (5):579-590.
    This paper argues that the faculty of language comes essentially for free in evolutionary terms, by grace of a capacity, shared with some evolutionarily quite distantly related animals, for deliberatively planning action in the world. Humans have language of a kind that animals do not because of a qualitative difference in the nature of human plans, rather than anything unique to language.
  • Cognitive science in the era of artificial intelligence: A roadmap for reverse-engineering the infant language-learner. Emmanuel Dupoux - 2018 - Cognition 173 (C):43-59.
  • A computational theory of child overextension. Renato Ferreira Pinto & Yang Xu - 2021 - Cognition 206:104472.