  • Pitches that Wire Together Fire Together: Scale Degree Associations Across Time Predict Melodic Expectations. Niels J. Verosky & Emily Morgan - 2021 - Cognitive Science 45 (10):e13037.
    The ongoing generation of expectations is fundamental to listeners’ experience of music, but research into types of statistical information that listeners extract from musical melodies has tended to emphasize transition probabilities and n‐grams, with limited consideration given to other types of statistical learning that may be relevant. Temporal associations between scale degrees represent a different type of information present in musical melodies that can be learned from musical corpora using expectation networks, a computationally simple method based on activation and decay. (...)
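    The mechanism named in the abstract, temporal associations accumulated through activation and decay, can be sketched roughly as follows. This is a minimal sketch under generic assumptions (exponential decay, additive association updates, illustrative parameter values); it is not Verosky and Morgan's exact formulation, and all names below are invented.

```python
# Illustrative activation-and-decay "expectation network" over scale degrees.
# Assumption: exponential decay and additive association updates; not the
# authors' exact model, just the general mechanism the abstract describes.
import numpy as np

N_DEGREES = 7          # diatonic scale degrees 1..7 (0-based indices 0..6)
DECAY = 0.6            # per-event decay factor (illustrative value)

def train_expectation_network(melodies):
    """Accumulate temporal associations from melodies (lists of 0-based degrees)."""
    assoc = np.zeros((N_DEGREES, N_DEGREES))
    for melody in melodies:
        activation = np.zeros(N_DEGREES)
        for degree in melody:
            # Every currently active degree becomes associated with the new one.
            assoc[:, degree] += activation
            # Decay old activations, then fully activate the new degree.
            activation *= DECAY
            activation[degree] = 1.0
    return assoc

def expectations(assoc, context):
    """Predict a distribution over the next degree from a recent context."""
    activation = np.zeros(N_DEGREES)
    for degree in context:
        activation *= DECAY
        activation[degree] = 1.0
    scores = activation @ assoc
    total = scores.sum()
    return scores / total if total > 0 else np.full(N_DEGREES, 1 / N_DEGREES)

corpus = [[0, 2, 4, 2, 0], [4, 3, 2, 1, 0]]      # toy melodies as degree indices
net = train_expectation_network(corpus)
print(expectations(net, [0, 2]))                 # expectation profile after degrees 1 and 3
```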
  • Do Humans Really Learn AⁿBⁿ Artificial Grammars From Exemplars? Jean-Rémy Hochmann, Mahan Azadpour & Jacques Mehler - 2008 - Cognitive Science 32 (6):1021-1036.
    An important topic in the evolution of language is the kinds of grammars that can be computed by humans and other animals. Fitch and Hauser (2004) approached this question by assessing the ability of different species to learn 2 grammars, (AB)ⁿ and AⁿBⁿ. AⁿBⁿ was taken to indicate a phrase structure grammar, eliciting a center‐embedded pattern. (AB)ⁿ indicates a grammar whose strings entail only local relations between the categories of constituents. F&H's data suggest that humans, but not tamarin (...)
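    For readers unfamiliar with the two grammars, a small sketch of what they generate may help: (AB)ⁿ only requires tracking adjacent A-B pairs, while recognising AⁿBⁿ requires matching the counts of As and Bs. The syllable inventories below are invented placeholders, not the stimuli from either study.

```python
# Toy generators for the two grammars compared by Fitch and Hauser (2004)
# and re-examined by Hochmann et al. Category members are placeholders.
import random

A_SYLLABLES = ["ba", "di", "yo"]   # class-A items (illustrative)
B_SYLLABLES = ["pa", "li", "mo"]   # class-B items (illustrative)

def generate_ab_n(n):
    """(AB)^n: n adjacent A-B pairs, e.g. A B A B for n = 2."""
    out = []
    for _ in range(n):
        out.append(random.choice(A_SYLLABLES))
        out.append(random.choice(B_SYLLABLES))
    return out

def generate_a_n_b_n(n):
    """A^n B^n: n As followed by n Bs, e.g. A A B B for n = 2."""
    return ([random.choice(A_SYLLABLES) for _ in range(n)] +
            [random.choice(B_SYLLABLES) for _ in range(n)])

def accepts_a_n_b_n(string, a_set=frozenset(A_SYLLABLES), b_set=frozenset(B_SYLLABLES)):
    """Recognising A^n B^n requires counting: equal numbers of As, then Bs."""
    n = len(string) // 2
    return (len(string) == 2 * n and
            all(s in a_set for s in string[:n]) and
            all(s in b_set for s in string[n:]))

print(generate_ab_n(3))                        # e.g. ['ba', 'li', 'yo', 'pa', 'di', 'mo']
print(generate_a_n_b_n(3))                     # e.g. ['ba', 'yo', 'di', 'pa', 'pa', 'li']
print(accepts_a_n_b_n(generate_a_n_b_n(4)))    # True
```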
  • iMinerva: A Mathematical Model of Distributional Statistical Learning. Erik D. Thiessen & Philip I. Pavlik - 2013 - Cognitive Science 37 (2):310-343.
    Statistical learning refers to the ability to identify structure in the input based on its statistical properties. For many linguistic structures, the relevant statistical features are distributional: They are related to the frequency and variability of exemplars in the input. These distributional regularities have been suggested to play a role in many different aspects of language learning, including phonetic categories, using phonemic distinctions in word learning, and discovering non-adjacent relations. On the surface, these different aspects share few commonalities. Despite this, (...)
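    As background on the model family iMinerva extends, the sketch below implements a bare MINERVA-2-style exemplar memory (Hintzman's standard similarity-cubed activation and activation-weighted echo). It is offered as context under that assumption, not as the iMinerva equations themselves, and the toy feature vectors are invented.

```python
# MINERVA-2-style exemplar memory (the model family iMinerva extends).
# Standard similarity/activation/echo computation; not the exact iMinerva
# equations, and the toy feature vectors below are illustrative only.
import numpy as np

class ExemplarMemory:
    def __init__(self):
        self.traces = []                       # each trace: a feature vector

    def store(self, exemplar):
        self.traces.append(np.asarray(exemplar, dtype=float))

    def echo(self, probe):
        probe = np.asarray(probe, dtype=float)
        traces = np.stack(self.traces)
        # Similarity: normalised dot product over features where either is nonzero.
        relevant = (traces != 0) | (probe != 0)
        n_rel = relevant.sum(axis=1).clip(min=1)
        similarity = (traces * probe).sum(axis=1) / n_rel
        activation = similarity ** 3           # cubing sharpens strong matches
        return activation @ traces             # activation-weighted echo content

mem = ExemplarMemory()
# Store noisy exemplars of two "categories" that share a context feature.
for _ in range(20):
    mem.store([1, 1, 0, 0] if np.random.rand() < 0.5 else [1, 0, 1, 0])
print(mem.echo([1, 0, 0, 0]))   # echo reflects what co-occurred with feature 0
```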
  • Non‐adjacent Dependency Learning in Humans and Other Animals. Benjamin Wilson, Michelle Spierings, Andrea Ravignani, Jutta L. Mueller, Toben H. Mintz, Frank Wijnen, Anne van der Kant, Kenny Smith & Arnaud Rey - 2020 - Topics in Cognitive Science 12 (3):843-858.
    Wilson et al. focus on one class of AGL tasks: the cognitively demanding task of detecting non‐adjacent dependencies (NADs) among items. They provide a typology of the different types of NADs in natural languages and in AGL tasks. A range of cues affect NAD learning, ranging from the variability and number of intervening elements to the presence of shared prosodic cues between the dependent items. These cues, important for humans to discover non‐adjacent dependencies, are also found to facilitate NAD learning (...)
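    To make the target structure concrete, the sketch below generates a toy A_i ... X ... B_i language in which the dependency holds between the first and last elements while the middle element varies freely; the size of the middle-item set is one of the variability cues the review discusses. All item names are invented.

```python
# Toy A_i ... X ... B_i language for non-adjacent-dependency (NAD) learning.
# The dependent pairs and filler items are invented for illustration.
import random

NAD_PAIRS = {"pel": "rud", "vot": "jic", "dak": "tood"}    # A_i -> its B_i

def make_fillers(set_size):
    return [f"x{i}" for i in range(set_size)]              # variable middle items

def generate_nad_string(fillers):
    a = random.choice(list(NAD_PAIRS))
    return [a, random.choice(fillers), NAD_PAIRS[a]]

def is_grammatical(string):
    a, _, b = string
    return NAD_PAIRS.get(a) == b                           # only the ends must match

# Larger middle-item variability tends to make the A_i ... B_i frame stand out.
fillers = make_fillers(set_size=24)
familiarisation = [generate_nad_string(fillers) for _ in range(100)]
test_item = ["pel", "x3", "jic"]                           # violates the dependency
print(is_grammatical(test_item))                           # False
```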
  • Non‐adjacent Dependencies Processing in Human and Non‐human Primates. Raphaëlle Malassis, Arnaud Rey & Joël Fagot - 2018 - Cognitive Science 42 (5):1677-1699.
  • Does complexity matter? Meta-analysis of learner performance in artificial grammar tasks. Rachel Schiff & Pesia Katan - 2014 - Frontiers in Psychology 5.
  • Of words and whistles: Statistical learning operates similarly for identical sounds perceived as speech and non-speech. Sierra J. Sweet, Stephen C. Van Hedger & Laura J. Batterink - 2024 - Cognition 242 (C):105649.
  • A Comparative Perspective on the Role of Acoustic Cues in Detecting Language Structure. Jutta L. Mueller, Carel ten Cate & Juan M. Toro - 2020 - Topics in Cognitive Science 12 (3):859-874.
    Mueller et al. discuss the role of acoustic cues in detecting language structure more generally. Across languages, there are clear links between acoustic cues and syntactic structure. They show that AGL experiments implementing analogous links demonstrate that prosodic cues, as well as various auditory biases, facilitate the learning of structural rules. Some of these biases, e.g. for auditory grouping, are also present in other species.
  • When forgetting fosters learning: A neural network model for statistical learning. Ansgar D. Endress & Scott P. Johnson - 2021 - Cognition 213 (C):104621.
  • Simultaneous segmentation and generalisation of non-adjacent dependencies from continuous speech. Rebecca L. A. Frost & Padraic Monaghan - 2016 - Cognition 147 (C):70-74.
  • Neurocognitive mechanisms of statistical-sequential learning: what do event-related potentials tell us? Jerome Daltrozzo & Christopher M. Conway - 2014 - Frontiers in Human Neuroscience 8.
  • Birth of an Abstraction: A Dynamical Systems Account of the Discovery of an Elsewhere Principle in a Category Learning Task. Whitney Tabor, Pyeong W. Cho & Harry Dankowicz - 2013 - Cognitive Science 37 (7):1193-1227.
    Human participants and recurrent (“connectionist”) neural networks were both trained on a categorization system abstractly similar to natural language systems involving irregular (“strong”) classes and a default class. Both the humans and the networks exhibited staged learning and a generalization pattern reminiscent of the Elsewhere Condition (Kiparsky, 1973). Previous connectionist accounts of related phenomena have often been vague about the nature of the networks’ encoding systems. We analyzed our network using dynamical systems theory, revealing topological and geometric properties that can (...)
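    For readers who do not know the Elsewhere Condition by name: when a specific rule and a general rule could both apply, the specific rule wins and the general (default) rule applies "elsewhere". The toy illustration below uses invented past-tense-style items rather than the category-learning materials of the study.

```python
# Elsewhere-Condition-style rule application: specific (irregular) rules block
# the general default. Items and rules are invented for illustration.
IRREGULARS = {"sing": "sang", "ring": "rang", "go": "went"}   # specific rules

def past_tense(verb):
    if verb in IRREGULARS:            # the specific rule applies where it can...
        return IRREGULARS[verb]
    return verb + "ed"                # ...and the default applies elsewhere

print(past_tense("sing"))    # 'sang'    (irregular class blocks the default)
print(past_tense("blick"))   # 'blicked' (novel item falls to the default)
```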
  • Attentional effects on rule extraction and consolidation from speech. Diana López-Barroso, David Cucurell, Antoni Rodríguez-Fornells & Ruth de Diego-Balaguer - 2016 - Cognition 152:61-69.
  • 8-Month-Old Infants' Ability to Process Word Order Is Shaped by the Amount of Exposure. Caterina Marino & Judit Gervain - 2021 - Cognition 213 (C):104717.
  • All Together Now: Concurrent Learning of Multiple Structures in an Artificial Language. Alexa R. Romberg & Jenny R. Saffran - 2013 - Cognitive Science 37 (7):1290-1320.
    Natural languages contain many layers of sequential structure, from the distribution of phonemes within words to the distribution of phrases within utterances. However, most research modeling language acquisition using artificial languages has focused on only one type of distributional structure at a time. In two experiments, we investigated adult learning of an artificial language that contains dependencies between both adjacent and non-adjacent words. We found that learners rapidly acquired both types of regularities and that the strength of the adjacent statistics (...)
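    A minimal sketch of how adjacent and non-adjacent statistics can be read off the same exposure stream: count bigrams at lag 1 and lag 2 over one token sequence and convert the counts to conditional probabilities. The toy stream and item names are illustrative, not the authors' artificial language.

```python
# Adjacent vs. non-adjacent transitional statistics from one token stream.
# The toy stream is illustrative; it is not Romberg & Saffran's language.
from collections import Counter

def transition_probs(tokens, gap):
    """P(next | current), where 'next' is `gap` positions ahead (gap=1: adjacent)."""
    pairs = Counter(zip(tokens, tokens[gap:]))
    context_totals = Counter(tokens[:len(tokens) - gap])
    return {(a, b): n / context_totals[a] for (a, b), n in pairs.items()}

stream = ["pel", "wadim", "rud", "vot", "kicey", "jic", "pel", "loga", "rud"]
adjacent = transition_probs(stream, gap=1)       # e.g. P(wadim | pel)
nonadjacent = transition_probs(stream, gap=2)    # e.g. P(rud | pel), skipping one item

print(adjacent[("pel", "wadim")])                # 0.5 in this toy stream
print(nonadjacent[("pel", "rud")])               # 1.0: the non-adjacent dependency
```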
  • How Many Mechanisms Are Needed to Analyze Speech? A Connectionist Simulation of Structural Rule Learning in Artificial Language Acquisition. Aarre Laakso & Paco Calvo - 2011 - Cognitive Science 35 (7):1243-1281.
    Some empirical evidence in the artificial language acquisition literature has been taken to suggest that statistical learning mechanisms are insufficient for extracting structural information from an artificial language. According to the more than one mechanism (MOM) hypothesis, at least two mechanisms are required in order to acquire language from speech: (a) a statistical mechanism for speech segmentation; and (b) an additional rule-following mechanism in order to induce grammatical regularities. In this article, we present a set of neural network studies demonstrating (...)
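    The networks at issue are simple recurrent (Elman-style) networks: the previous hidden state is copied back in as context alongside the current input, and the network is trained to predict the next element. The sketch below shows that architecture with generic sizes and a deliberately simplified output-only learning rule; it is not the specific simulation setup reported in the article.

```python
# Bare-bones Elman-style simple recurrent network (SRN) for next-element
# prediction. Sizes, nonlinearity, and the single-layer training rule are
# generic assumptions, not the paper's reported setup.
import numpy as np

rng = np.random.default_rng(0)

class SRN:
    def __init__(self, n_in, n_hidden, n_out, lr=0.1):
        self.W_xh = rng.normal(0, 0.1, (n_in, n_hidden))
        self.W_hh = rng.normal(0, 0.1, (n_hidden, n_hidden))   # context weights
        self.W_hy = rng.normal(0, 0.1, (n_hidden, n_out))
        self.h = np.zeros(n_hidden)                            # context layer
        self.lr = lr

    def step(self, x):
        """One forward step: current input plus the copied-back context."""
        self.h = np.tanh(x @ self.W_xh + self.h @ self.W_hh)
        y = self.h @ self.W_hy
        exp = np.exp(y - y.max())
        return exp / exp.sum()                                 # softmax prediction

    def train_step(self, x, target):
        """Simplified delta-rule update of the output weights only (illustrative)."""
        p = self.step(x)
        self.W_hy += self.lr * np.outer(self.h, target - p)
        return p

# Toy usage: one-hot syllables, predict the next syllable in a repeating sequence.
vocab = ["ba", "di", "pa", "li"]
onehot = {s: np.eye(len(vocab))[i] for i, s in enumerate(vocab)}
net = SRN(n_in=len(vocab), n_hidden=8, n_out=len(vocab))
sequence = ["ba", "di", "pa", "li"] * 200
for prev, nxt in zip(sequence, sequence[1:]):
    net.train_step(onehot[prev], onehot[nxt])
print(np.round(net.step(onehot["ba"]), 2))   # should favour "di" after training
```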
  • Frequency-based organization of speech sequences in a nonhuman animal. Juan M. Toro, Marina Nespor & Judit Gervain - 2016 - Cognition 146 (C):1-7.