  • Chunking Versus Transitional Probabilities: Differentiating Between Theories of Statistical Learning. Samantha N. Emerson & Christopher M. Conway - 2023 - Cognitive Science 47 (5):e13284.
    There are two main approaches to how statistical patterns are extracted from sequences: The transitional probability approach proposes that statistical learning occurs through the computation of probabilities between items in a sequence. The chunking approach, including models such as PARSER and TRACX, proposes that units are extracted as chunks. Importantly, the chunking approach suggests that the extraction of full units weakens the processing of subunits while the transitional probability approach suggests that both units and subunits should strengthen. Previous findings using (...)
  • Finding Hierarchical Structure in Binary Sequences: Evidence from Lindenmayer Grammar Learning. Samuel Schmid, Douglas Saddy & Julie Franck - 2023 - Cognitive Science 47 (1):e13242.
    In this article, we explore the extraction of recursive nested structure in the processing of binary sequences. Our aim was to determine whether humans learn the higher-order regularities of a highly simplified input where only sequential-order information marks the hierarchical structure. To this end, we implemented a sequence generated by the Fibonacci grammar in a serial reaction time task. This deterministic grammar generates aperiodic but self-similar sequences. The combination of these two properties allowed us to evaluate hierarchical learning while controlling (...)
  • Statistically Induced Chunking Recall: A Memory‐Based Approach to Statistical Learning. Erin S. Isbilen, Stewart M. McCauley, Evan Kidd & Morten H. Christiansen - 2020 - Cognitive Science 44 (7):e12848.
    The computations involved in statistical learning have long been debated. Here, we build on work suggesting that a basic memory process, chunking, may account for the processing of statistical regularities into larger units. Drawing on methods from the memory literature, we developed a novel paradigm to test statistical learning by leveraging a robust phenomenon observed in serial recall tasks: that short‐term memory is fundamentally shaped by long‐term distributional learning. In the statistically induced chunking recall (SICR) task, participants are exposed to (...)
  • Statistical learning and memory. Ansgar D. Endress, Lauren K. Slone & Scott P. Johnson - 2020 - Cognition 204 (C):104346.
  • What exactly is learned in visual statistical learning? Insights from Bayesian modeling. Noam Siegelman, Louisa Bogaerts, Blair C. Armstrong & Ram Frost - 2019 - Cognition 192 (C):104002.
  • What Mechanisms Underlie Implicit Statistical Learning? Transitional Probabilities Versus Chunks in Language Learning. Pierre Perruchet - 2019 - Topics in Cognitive Science 11 (3):520-535.
    Perruchet and Pacton (2006) asked whether implicit learning and statistical learning represent two approaches to the same phenomenon. This article is an important follow-up to that seminal review. As in the earlier paper, the focus is on the formation of elementary cognitive units; the two approaches offer different explanations of what these units consist of and how they are formed. Perruchet weighs the evidence for each explanation and concludes with a helpful agenda for future research.
  • A Recurrent Connectionist Model of Melody Perception: An Exploration Using TRACX2. Daniel Defays, Robert M. French & Barbara Tillmann - 2023 - Cognitive Science 47 (4):e13283.
    Are similar, or even identical, mechanisms used in the computational modeling of speech segmentation, serial image processing, and music processing? We address this question by exploring how TRACX2, a recognition‐based, recursive connectionist autoencoder model of chunking and sequence segmentation, which has successfully simulated speech and serial‐image processing, might be applied to elementary melody perception. The model, a three‐layer autoencoder that recognizes “chunks” of short sequences of intervals that have been frequently encountered on input, is trained on the tone intervals of (...)