  • Visual statistical learning is facilitated in Zipfian distributions. Ori Lavi-Rotbain & Inbal Arnon - 2021 - Cognition 206 (C):104492.
  • Cross-situational learning in a Zipfian environment. Andrew T. Hendrickson & Amy Perfors - 2019 - Cognition 189 (C):11-22.
  • What Mechanisms Underlie Implicit Statistical Learning? Transitional Probabilities Versus Chunks in Language Learning. Pierre Perruchet - 2019 - Topics in Cognitive Science 11 (3):520-535.
    In 2006, Perruchet and Pacton (2006) asked whether implicit learning and statistical learning represent two approaches to the same phenomenon. This article is an important follow-up to their seminal review. As in the previous paper, the focus is on the formation of elementary cognitive units. The two approaches favor different explanations of what these units consist of and how they are formed. Perruchet weighs the evidence for each explanation and concludes with a helpful agenda for future research.
  • Throwing out the Bayesian baby with the optimal bathwater: Response to Endress. Michael C. Frank - 2013 - Cognition 128 (3):417-423.
  • The learnability consequences of Zipfian distributions in language. Ori Lavi-Rotbain & Inbal Arnon - 2022 - Cognition 223 (C):105038.
  • Semantic Coherence Facilitates Distributional Learning. Long Ouyang, Lera Boroditsky & Michael C. Frank - 2017 - Cognitive Science 41 (S4):855-884.
    Computational models have shown that purely statistical knowledge about words' linguistic contexts is sufficient to learn many properties of words, including syntactic and semantic category. For example, models can infer that "postman" and "mailman" are semantically similar because they have quantitatively similar patterns of association with other words. In contrast to these computational results, artificial language learning experiments suggest that distributional statistics alone do not facilitate learning of linguistic categories. However, experiments in this paradigm expose participants to entirely novel words, (...)
  • Five Ways in Which Computational Modeling Can Help Advance Cognitive Science: Lessons From Artificial Grammar Learning. Willem Zuidema, Robert M. French, Raquel G. Alhama, Kevin Ellis, Timothy J. O'Donnell, Tim Sainburg & Timothy Q. Gentner - 2020 - Topics in Cognitive Science 12 (3):925-941.
    Zuidema et al. illustrate how empirical AGL studies can benefit from computational models and techniques. Computational models can help clarify theories, and thus delineate research questions, and can also facilitate experimental design, stimulus generation, and data analysis. The authors show, with a series of examples, how computational modeling can be integrated with empirical AGL approaches, and how model selection techniques can indicate the model most likely to explain experimental outcomes.
  • Inherent and probabilistic naturalness. Luca Gasparri - 2024 - Philosophical Studies 181 (2):369-385.
    Standard accounts hold that regularities of behavior must be arbitrary to constitute a convention. Yet, there is growing consensus that conventionality is a graded phenomenon, and that conventions can be more or less natural. I develop an account of natural conventions that distinguishes two basic dimensions of conventional naturalness: a probabilistic dimension and an inherent one. A convention is probabilistically natural if it is likely to emerge in a population of agents, and inherently natural if its content is a regularity (...)
  • Zipfian distributions facilitate children's learning of novel word-referent mappings. Lucie Wolters, Ori Lavi-Rotbain & Inbal Arnon - 2024 - Cognition 253 (C):105932.