  • Quantifying Structural and Non‐structural Expectations in Relative Clause Processing. Zhong Chen & John T. Hale - 2021 - Cognitive Science 45 (1):e12927.
    Information‐theoretic complexity metrics, such as Surprisal (Hale, 2001; Levy, 2008) and Entropy Reduction (Hale, 2003), are linking hypotheses that bridge theorized expectations about sentences and observed processing difficulty in comprehension. These expectations can be viewed as syntactic derivations constrained by a grammar. However, this expectation‐based view is not limited to syntactic information alone. The present study combines structural and non‐structural information in unified models of word‐by‐word sentence processing difficulty. Using probabilistic minimalist grammars (Stabler, 1997), we extend expectation‐based models to include (...)
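The two complexity metrics named in this entry have standard information-theoretic definitions; the sketch below is illustrative only (the probability values are invented, not from the paper), assuming the usual formulations of Surprisal (Hale, 2001) and Entropy Reduction (Hale, 2003):

```python
import math

def surprisal(prob: float) -> float:
    """Surprisal in bits: the negative log probability of the observed word."""
    return -math.log2(prob)

def entropy(dist: list[float]) -> float:
    """Shannon entropy in bits of a distribution over possible continuations."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

def entropy_reduction(before: list[float], after: list[float]) -> float:
    """Uncertainty removed by a word, floored at zero per Hale (2003)."""
    return max(0.0, entropy(before) - entropy(after))

# A word the grammar assigns probability 0.25 carries 2 bits of surprisal.
print(surprisal(0.25))                            # → 2.0
# Collapsing a uniform 4-way uncertainty to a 2-way one removes 1 bit.
print(entropy_reduction([0.25] * 4, [0.5, 0.5]))  # → 1.0
```

Both metrics are linking hypotheses in the sense of the abstract: the probabilities would come from a grammar or language model, and the resulting bit values are compared against word-by-word reading difficulty.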
  • Single‐Stage Prediction Models Do Not Explain the Magnitude of Syntactic Disambiguation Difficulty. Marten van Schijndel & Tal Linzen - 2021 - Cognitive Science 45 (6):e12988.
    The disambiguation of a syntactically ambiguous sentence in favor of a less preferred parse can lead to slower reading at the disambiguation point. This phenomenon, referred to as a garden‐path effect, has motivated models in which readers initially maintain only a subset of the possible parses of the sentence, and subsequently require time‐consuming reanalysis to reconstruct a discarded parse. A more recent proposal argues that the garden‐path effect can be reduced to surprisal arising in a fully parallel parser: words consistent (...)
  • Is the Mind Inherently Predicting? Exploring Forward and Backward Looking in Language Processing. Luca Onnis, Alfred Lim, Shirley Cheung & Falk Huettig - 2022 - Cognitive Science 46 (10):e13201.
  • Bayesian Surprise Predicts Human Event Segmentation in Story Listening. Manoj Kumar, Ariel Goldstein, Sebastian Michelmann, Jeffrey M. Zacks, Uri Hasson & Kenneth A. Norman - 2023 - Cognitive Science 47 (10):e13343.
    Event segmentation theory posits that people segment continuous experience into discrete events and that event boundaries occur when there are large transient increases in prediction error. Here, we set out to test this theory in the context of story listening, by using a deep learning language model (GPT‐2) to compute the predicted probability distribution of the next word, at each point in the story. For three stories, we used the probability distributions generated by GPT‐2 to compute the time series of (...)
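Bayesian surprise in this line of work is standardly the Kullback–Leibler divergence between the predictive distribution before and after an observation; a minimal sketch, with made-up three-word distributions rather than the paper's GPT‐2 vocabulary:

```python
import math

def kl_divergence(p: list[float], q: list[float]) -> float:
    """KL(p || q) in bits for two discrete distributions over the same outcomes."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Hypothetical next-word beliefs before and after a surprising word: a large
# shift in the distribution yields a large KL value, the kind of transient
# prediction-error spike the theory associates with an event boundary.
prior = [0.7, 0.2, 0.1]
posterior = [0.1, 0.2, 0.7]
print(kl_divergence(posterior, prior))
```

In the study itself the distributions are produced by GPT‐2 at each word of the story; the sketch only shows the divergence computation on toy inputs.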
  • Greater entropy leads to more explicit referential forms during language production. Hossein Karimi - 2022 - Cognition 225 (C):105093.