  • Chunk formation in immediate memory and how it relates to data compression. Mustapha Chekaf, Nelson Cowan & Fabien Mathy - 2016 - Cognition 155 (C):96-107.
  • Chunks, Schemata, and Retrieval Structures: Past and Current Computational Models. Fernand Gobet, Peter C. R. Lane & Martyn Lloyd-Kelly - 2015 - Frontiers in Psychology 6.
  • An integrated theory of language production and comprehension. Martin J. Pickering & Simon Garrod - 2013 - Behavioral and Brain Sciences 36 (4):329-347.
    Currently, production and comprehension are regarded as quite distinct in accounts of language processing. In rejecting this dichotomy, we instead assert that producing and understanding are interwoven, and that this interweaving is what enables people to predict themselves and each other. We start by noting that production and comprehension are forms of action and action perception. We then consider the evidence for interweaving in action, action perception, and joint action, and explain such evidence in terms of prediction. Specifically, we assume (...)
  • Finding Hierarchical Structure in Binary Sequences: Evidence from Lindenmayer Grammar Learning. Samuel Schmid, Douglas Saddy & Julie Franck - 2023 - Cognitive Science 47 (1):e13242.
    In this article, we explore the extraction of recursive nested structure in the processing of binary sequences. Our aim was to determine whether humans learn the higher-order regularities of a highly simplified input where only sequential-order information marks the hierarchical structure. To this end, we implemented a sequence generated by the Fibonacci grammar in a serial reaction time task. This deterministic grammar generates aperiodic but self-similar sequences. The combination of these two properties allowed us to evaluate hierarchical learning while controlling (...)
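
The Fibonacci grammar in the Schmid, Saddy and Franck entry above is a Lindenmayer system: on each generation, every symbol of the current string is rewritten in parallel. A minimal sketch in Python, assuming the rewriting rules commonly used in this literature (1 → 10, 0 → 1, axiom "1"); consult the paper for the exact grammar it implements:

```python
# Fibonacci-grammar generation as a parallel-rewriting L-system.
# The rules and axiom below are an assumption based on the standard
# formulation in this literature, not copied from the paper.

def fibonacci_word(generations: int, axiom: str = "1") -> str:
    """Rewrite every symbol in parallel once per generation."""
    rules = {"1": "10", "0": "1"}
    word = axiom
    for _ in range(generations):
        word = "".join(rules[symbol] for symbol in word)
    return word

for g in range(6):
    print(g, fibonacci_word(g))
# 0 1
# 1 10
# 2 101
# 3 10110
# 4 10110101
# 5 1011010110110
```

Each generation is a prefix of the next, and the generation lengths follow the Fibonacci numbers; this is the aperiodic self-similarity the abstract refers to.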
  • Learning a Generative Probabilistic Grammar of Experience: A Process‐Level Model of Language Acquisition. Oren Kolodny, Arnon Lotem & Shimon Edelman - 2015 - Cognitive Science 39 (2):227-267.
    We introduce a set of biologically and computationally motivated design choices for modeling the learning of language, or of other types of sequential, hierarchically structured experience and behavior, and describe an implemented system that conforms to these choices and is capable of unsupervised learning from raw natural‐language corpora. Given a stream of linguistic input, our model incrementally learns a grammar that captures its statistical patterns, which can then be used to parse or generate new data. The grammar constructed in this (...)
  • Predictive Movements and Human Reinforcement Learning of Sequential Action. Roy de Kleijn, George Kachergis & Bernhard Hommel - 2018 - Cognitive Science 42 (S3):783-808.
    Sequential action makes up the bulk of human daily activity, and yet much remains unknown about how people learn such actions. In one motor learning paradigm, the serial reaction time (SRT) task, people are taught a consistent sequence of button presses by cueing them with the next target response. However, the SRT task only records keypress response times to a cued target, and thus it cannot reveal the full time‐course of motion, including predictive movements. This paper describes a mouse movement (...)
  • The Temporal Dynamics of Regularity Extraction in Non‐Human Primates. Laure Minier, Joël Fagot & Arnaud Rey - 2016 - Cognitive Science 40 (4):1019-1030.
    Extracting the regularities of our environment is one of our core cognitive abilities. To study the fine-grained dynamics of the extraction of embedded regularities, a method combining the advantages of the artificial language paradigm and the serial response time task was used with a group of Guinea baboons in a new automatic experimental device. After a series of random trials, monkeys were exposed to language-like patterns. We found that the extraction of embedded patterns positioned at the end of larger patterns (...)
  • Parallel Distributed Processing at 25: Further Explorations in the Microstructure of Cognition. Timothy T. Rogers & James L. McClelland - 2014 - Cognitive Science 38 (6):1024-1077.
    This paper introduces a special issue of Cognitive Science initiated on the 25th anniversary of the publication of Parallel Distributed Processing (PDP), a two-volume work that introduced the use of neural network models as vehicles for understanding cognition. The collection surveys the core commitments of the PDP framework, the key issues the framework has addressed, and the debates the framework has spawned, and presents viewpoints on the current status of these issues. The articles focus on both historical roots and contemporary (...)
  • Five Ways in Which Computational Modeling Can Help Advance Cognitive Science: Lessons From Artificial Grammar Learning. Willem Zuidema, Robert M. French, Raquel G. Alhama, Kevin Ellis, Timothy J. O'Donnell, Tim Sainburg & Timothy Q. Gentner - 2020 - Topics in Cognitive Science 12 (3):925-941.
    Zuidema et al. illustrate how empirical AGL studies can benefit from computational models and techniques. Computational models can help clarify theories, and thus delineate research questions, but can also facilitate experimental design, stimulus generation, and data analysis. The authors show, with a series of examples, how computational modeling can be integrated with empirical AGL approaches, and how model selection techniques can indicate the most likely model to explain experimental outcomes.
  • Zipfian frequency distributions facilitate word segmentation in context. Chigusa Kurumada, Stephan C. Meylan & Michael C. Frank - 2013 - Cognition 127 (3):439-453.
  • Statistically Induced Chunking Recall: A Memory‐Based Approach to Statistical Learning. Erin S. Isbilen, Stewart M. McCauley, Evan Kidd & Morten H. Christiansen - 2020 - Cognitive Science 44 (7):e12848.
    The computations involved in statistical learning have long been debated. Here, we build on work suggesting that a basic memory process, chunking, may account for the processing of statistical regularities into larger units. Drawing on methods from the memory literature, we developed a novel paradigm to test statistical learning by leveraging a robust phenomenon observed in serial recall tasks: that short‐term memory is fundamentally shaped by long‐term distributional learning. In the statistically induced chunking recall (SICR) task, participants are exposed to (...)
  • Regularity Extraction Across Species: Associative Learning Mechanisms Shared by Human and Non‐Human Primates. Arnaud Rey, Laure Minier, Raphaëlle Malassis, Louisa Bogaerts & Joël Fagot - 2019 - Topics in Cognitive Science 11 (3):573-586.
    One of the themes that has been widely addressed in both the implicit learning and statistical learning literatures is that of rule learning. While it is widely agreed that the extraction of regularities from the environment is a fundamental facet of cognition, there is still debate about the nature of rule learning. Rey and colleagues show that the comparison between human and non‐human primates can contribute important insights to this debate.
  • The nature of the memory buffer in implicit learning: Learning Chinese tonal symmetries. Feifei Li, Shan Jiang, Xiuyan Guo, Zhiliang Yang & Zoltan Dienes - 2013 - Consciousness and Cognition 22 (3):920-930.
    Previous research has established that people can implicitly learn chunks, which do not require a memory buffer to process. The present study explores the implicit learning of nonlocal dependencies generated by higher than finite-state grammars, specifically, Chinese tonal retrogrades and inversions, which do require buffers. People were asked to listen to and memorize artificial poetry instantiating one of the two grammars; after this training phase, people were informed of the existence of rules and asked to classify new poems, (...)
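
To make the two symmetry types in the Li et al. entry concrete: a retrograde repeats the tones of the first half in reverse order (a mirror), while an inversion preserves serial order but replaces each tone with a counterpart. A minimal sketch, assuming tones coded as the integers 1-4 and a purely hypothetical tone-pairing for the inversion (the paper defines its own mapping):

```python
# Two symmetry transformations over tone sequences. The INVERT pairing is
# hypothetical, chosen only to illustrate the structure of an inversion.

INVERT = {1: 4, 2: 3, 3: 2, 4: 1}

def retrograde(tones: list[int]) -> list[int]:
    """Second half repeats the first half in reverse order (a mirror)."""
    return tones + tones[::-1]

def inversion(tones: list[int]) -> list[int]:
    """Second half maps each tone, in order, to its paired counterpart."""
    return tones + [INVERT[t] for t in tones]

first_half = [2, 1, 4, 1]
print(retrograde(first_half))  # [2, 1, 4, 1, 1, 4, 1, 2]
print(inversion(first_half))   # [2, 1, 4, 1, 3, 4, 1, 4]
```

Verifying either symmetry requires holding the entire first half in a buffer, which is why these dependencies exceed what a finite-state learner can capture.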
  • Prediction plays a key role in language development as well as processing. Matt A. Johnson, Nicholas B. Turk-Browne & Adele E. Goldberg - 2013 - Behavioral and Brain Sciences 36 (4):360-361.
    Although the target article emphasizes the important role of prediction in language use, prediction may well also play a key role in the initial formation of linguistic representations, that is, in language development. We outline the role of prediction in three relevant language-learning domains: transitional probabilities, statistical preemption, and construction learning.
  • Forward models and their implications for production, comprehension, and dialogue. Martin J. Pickering & Simon Garrod - 2013 - Behavioral and Brain Sciences 36 (4):377-392.
    Our target article proposed that language production and comprehension are interwoven, with speakers making predictions of their own utterances and comprehenders making predictions of other people's utterances at different linguistic levels. Here, we respond to comments about such issues as cognitive architecture and its neural basis, learning and development, monitoring, the nature of forward models, communicative intentions, and dialogue.
  • Computational Modeling in Cognitive Science: A Manifesto for Change. Caspar Addyman & Robert M. French - 2012 - Topics in Cognitive Science 4 (3):332-341.
    Computational modeling has long been one of the traditional pillars of cognitive science. Unfortunately, the computer models of cognition being developed today have not kept up with the enormous changes that have taken place in computer technology and, especially, in human-computer interfaces. For all intents and purposes, modeling is still done today as it was 25, or even 35, years ago. Everyone still programs in his or her own favorite programming language, source code is rarely made available, accessibility of models (...)
  • A Recurrent Connectionist Model of Melody Perception: An Exploration Using TRACX2. Daniel Defays, Robert M. French & Barbara Tillmann - 2023 - Cognitive Science 47 (4):e13283.
    Are similar, or even identical, mechanisms used in the computational modeling of speech segmentation, serial image processing, and music processing? We address this question by exploring how TRACX2, a recognition‐based, recursive connectionist autoencoder model of chunking and sequence segmentation, which has successfully simulated speech and serial‐image processing, might be applied to elementary melody perception. The model, a three‐layer autoencoder that recognizes “chunks” of short sequences of intervals that have been frequently encountered on input, is trained on the tone intervals of (...)
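
The chunk-recognition mechanism the Defays, French and Tillmann entry describes can be illustrated independently of the full model: an autoencoder trained to reproduce pairs of adjacent items comes to reconstruct frequently co-occurring pairs with low error, and that low error can serve as a familiarity (chunk) signal. The sketch below is a minimal illustration of this idea, not the published TRACX2 architecture; the item codes, layer sizes, learning rate, and training stream are all invented for the example:

```python
import numpy as np

# Recognition-based chunking with a three-layer autoencoder (illustrative
# sketch, not TRACX2 itself): pairs reconstructed with low error count as
# familiar chunks. All values below are arbitrary example choices.

rng = np.random.default_rng(0)
n_items, dim = 4, 8                        # 4 item types, each a random 8-d code
codes = {i: rng.normal(0.0, 0.5, dim) for i in range(n_items)}

W1 = rng.normal(0.0, 0.1, (2 * dim, dim))  # input -> hidden
W2 = rng.normal(0.0, 0.1, (dim, 2 * dim))  # hidden -> output

def forward(x):
    h = np.tanh(x @ W1)
    return h, h @ W2

def train_pair(x, lr=0.05):
    """One gradient step on the squared reconstruction error of one pair."""
    global W1, W2
    h, out = forward(x)
    err = out - x
    grad_h = (err @ W2.T) * (1.0 - h ** 2)  # backprop through tanh
    W2 -= lr * np.outer(h, err)
    W1 -= lr * np.outer(x, grad_h)

# A stream in which the pair (0, 1) is frequent and (2, 3) is rare.
stream = [0, 1] * 200 + [2, 3] * 10
for left, right in zip(stream, stream[1:]):
    train_pair(np.concatenate([codes[left], codes[right]]))

for left, right in [(0, 1), (2, 3)]:
    x = np.concatenate([codes[left], codes[right]])
    _, out = forward(x)
    print((left, right), round(float(((out - x) ** 2).mean()), 4))
# The frequent pair should show the lower reconstruction error.
```

In the TRACX family this recognition signal is used recursively: when a pair is recognized as a chunk, its internal representation can stand in for the pair in later input, letting larger chunks build on smaller ones.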
  • Input and Age‐Dependent Variation in Second Language Learning: A Connectionist Account. Marius Janciauskas & Franklin Chang - 2018 - Cognitive Science 42 (S2):519-554.
    Language learning requires linguistic input, but several studies have found that knowledge of second language rules does not seem to improve with more language exposure. One reason for this is that previous studies did not factor out variation due to the different rules tested. To examine this issue, we reanalyzed grammaticality judgment scores in Flege, Yeni-Komshian, and Liu's study of L2 learners using rule-related predictors and found that, in addition to the overall drop in performance due to a sensitive period, (...)
  • Grouping in working memory guides chunk formation in long-term memory: Evidence from the Hebb effect. Philipp Musfeld, Joscha Dutli, Klaus Oberauer & Lea M. Bartsch - 2024 - Cognition 248 (C):105795.
  • Chunking Versus Transitional Probabilities: Differentiating Between Theories of Statistical Learning. Samantha N. Emerson & Christopher M. Conway - 2023 - Cognitive Science 47 (5):e13284.
    There are two main approaches to how statistical patterns are extracted from sequences: The transitional probability approach proposes that statistical learning occurs through the computation of probabilities between items in a sequence. The chunking approach, including models such as PARSER and TRACX, proposes that units are extracted as chunks. Importantly, the chunking approach suggests that the extraction of full units weakens the processing of subunits while the transitional probability approach suggests that both units and subunits should strengthen. Previous findings using (...)
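
The transitional-probability computation that the Emerson and Conway entry contrasts with chunking is simple to state: the forward TP of a pair is the conditional probability of the second item given the first, count(xy) / count(x). A minimal sketch over a toy stream in the style of classic segmentation stimuli (the words and syllables are illustrative, not items from the paper):

```python
from collections import Counter

# Forward transitional probabilities over a syllabified toy stream.

stream = "tupiro golabu tupiro bidaku tupiro golabu".split()
syllables = [w[i:i + 2] for w in stream for i in (0, 2, 4)]  # 2-letter syllables

bigrams = Counter(zip(syllables, syllables[1:]))
unigrams = Counter(syllables[:-1])

def forward_tp(x: str, y: str) -> float:
    """P(y | x): high within words, diluted across word boundaries."""
    return bigrams[(x, y)] / unigrams[x]

print(forward_tp("tu", "pi"))  # within a word: 1.0
print(forward_tp("ro", "go"))  # across a word boundary: ~0.67
```

The theoretical contrast in the entry concerns subunits: chunking models predict that extracting a full unit weakens its parts, whereas a pure TP account predicts that units and subunits strengthen together.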
  • Modeling the Influence of Language Input Statistics on Children's Speech Production. Ingeborg Roete, Stefan L. Frank, Paula Fikkert & Marisa Casillas - 2020 - Cognitive Science 44 (12):e12924.
    We trained a computational model (the Chunk-Based Learner; CBL) on a longitudinal corpus of child–caregiver interactions in English to test whether one proposed statistical learning mechanism—backward transitional probability—is able to predict children's speech productions with stable accuracy throughout the first few years of development. We predicted that the model would reconstruct children's speech productions less accurately as they grow older, because children gradually begin to generate speech using abstracted forms rather than specific “chunks” from their speech environment. To test this idea, (...)
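
The backward transitional probability that the Chunk-Based Learner tracks is the reverse conditional, BTP(x, y) = P(x | y) = count(xy) / count(y). A minimal sketch of the boundary-placing idea, assuming the simplified rule "posit a chunk boundary when a pair's BTP falls below the running average of BTPs seen so far"; the published CBL is incremental and more elaborate than this:

```python
from collections import Counter

# Backward-TP chunking sketch: boundaries go where the BTP dips below the
# running mean. Computing BTPs over the whole utterance first is a
# simplification of the incremental model.

def backward_tps(words):
    bigrams = Counter(zip(words, words[1:]))
    successors = Counter(words[1:])
    return {(x, y): bigrams[(x, y)] / successors[y] for (x, y) in bigrams}

def chunk(words):
    """Start a new chunk whenever a pair's BTP is below the running mean."""
    tps = backward_tps(words)
    chunks, current, seen = [], [words[0]], []
    for x, y in zip(words, words[1:]):
        seen.append(tps[(x, y)])
        if tps[(x, y)] < sum(seen) / len(seen):
            chunks.append(current)
            current = []
        current.append(y)
    chunks.append(current)
    return chunks

utterance = "the dog saw the cat and the dog ran".split()
print(chunk(utterance))  # boundaries fall where the BTP dips, e.g. after "saw"
```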
  • Simultaneous segmentation and generalisation of non-adjacent dependencies from continuous speech. Rebecca L. A. Frost & Padraic Monaghan - 2016 - Cognition 147 (C):70-74.
  • Interactive Effects of Explicit Emergent Structure: A Major Challenge for Cognitive Computational Modeling. Robert M. French & Elizabeth Thomas - 2015 - Topics in Cognitive Science 7 (2):206-216.
    David Marr's (1982) three‐level analysis of computational cognition argues for three distinct levels of cognitive information processing—namely, the computational, representational, and implementational levels. But Marr's levels are—and were meant to be—descriptive, rather than interactive and dynamic. For this reason, we suggest that, had Marr been writing today, he might well have gone even farther in his analysis, including the emergence of structure—in particular, explicit structure at the conceptual level—from lower levels, and the effect of explicit emergent structures on the level (...)
  • Language experience changes subsequent learning. Luca Onnis & Erik Thiessen - 2013 - Cognition 126 (2):268-284.
  • The influence of children’s exposure to language from two to six years: The case of nonword repetition. Gary Jones - 2016 - Cognition 153 (C):79-88.
  • Learning Higher‐Order Transitional Probabilities in Nonhuman Primates. Arnaud Rey, Joël Fagot, Fabien Mathy, Laura Lazartigues, Laure Tosatto, Guillem Bonafos, Jean-Marc Freyermuth & Frédéric Lavigne - 2022 - Cognitive Science 46 (4):e13121.
  • What Mechanisms Underlie Implicit Statistical Learning? Transitional Probabilities Versus Chunks in Language Learning. Pierre Perruchet - 2019 - Topics in Cognitive Science 11 (3):520-535.
    Perruchet and Pacton (2006) asked whether implicit learning and statistical learning represent two approaches to the same phenomenon. This article represents an important follow‐up to their seminal review article. As in the previous paper, the focus is on the formation of elementary cognitive units. Both approaches favor different explanations of what these units consist of and how they are formed. Perruchet weighs up the evidence for different explanations and concludes with a helpful agenda for future research.