References
  • The sausage machine: A new two-stage parsing model. Lyn Frazier & Janet Dean Fodor - 1978 - Cognition 6 (4): 291-325.
  • Quantifying Structural and Non‐structural Expectations in Relative Clause Processing. Zhong Chen & John T. Hale - 2021 - Cognitive Science 45 (1): e12927.
    Information‐theoretic complexity metrics, such as Surprisal (Hale, 2001; Levy, 2008) and Entropy Reduction (Hale, 2003), are linking hypotheses that bridge theorized expectations about sentences and observed processing difficulty in comprehension. These expectations can be viewed as syntactic derivations constrained by a grammar. However, this expectation‐based view is not limited to syntactic information alone. The present study combines structural and non‐structural information in unified models of word‐by‐word sentence processing difficulty. Using probabilistic minimalist grammars (Stabler, 1997), we extend expectation‐based models to include (...)
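    (An illustrative sketch of computing surprisal and entropy reduction over a toy language appears after this reference list.)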
  • Uncertainty About the Rest of the Sentence. John Hale - 2006 - Cognitive Science 30 (4): 643-672.
    A word-by-word human sentence processing complexity metric is presented. This metric formalizes the intuition that comprehenders have more trouble on words contributing larger amounts of information about the syntactic structure of the sentence as a whole. The formalization is in terms of the conditional entropy of grammatical continuations, given the words that have been heard so far. To calculate the predictions of this metric, Wilson and Carroll's (1954) original entropy reduction idea is extended to infinite languages. This is demonstrated with (...)
  • What a Rational Parser Would Do. John T. Hale - 2011 - Cognitive Science 35 (3): 399-443.
    This article examines cognitive process models of human sentence comprehension based on the idea of informed search. These models are rational in the sense that they strive to find a good syntactic analysis quickly. Informed search derives a new account of garden pathing that handles traditional counterexamples. It supports a symbolic explanation for local coherence as well as an algorithmic account of entropy reduction. The models are expressed in a broad framework for theories of human sentence comprehension.
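    (A generic best-first search sketch illustrating the informed-search idea appears after this reference list.)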
  • The competence-performance distinction in mental philosophy. Raymond J. Nelson - 1978 - Synthese 39 (November): 337-382.
  • Syntactic structure assembly in human parsing: a computational model based on competitive inhibition and a lexicalist grammar. Theo Vosse & Gerard Kempen - 2000 - Cognition 75 (2): 105-143.
  • Determinants of Scanpath Regularity in Reading. Titus von der Malsburg, Reinhold Kliegl & Shravan Vasishth - 2015 - Cognitive Science 39 (7): 1675-1703.
    Scanpaths have played an important role in classic research on reading behavior. Nevertheless, they have largely been neglected in later research, perhaps due to a lack of suitable analytical tools. Recently, von der Malsburg and Vasishth proposed a new measure for quantifying differences between scanpaths and demonstrated that this measure can recover effects that were missed with the traditional eyetracking measures. However, the sentences used in that study were difficult to process, and scanpath effects were accordingly strong. The purpose of the (...)
  • Modeling Structure‐Building in the Brain With CCG Parsing and Large Language Models. Miloš Stanojević, Jonathan R. Brennan, Donald Dunagan, Mark Steedman & John T. Hale - 2023 - Cognitive Science 47 (7): e13312.
    To model behavioral and neural correlates of language comprehension in naturalistic environments, researchers have turned to broad‐coverage tools from natural‐language processing and machine learning. Where syntactic structure is explicitly modeled, prior work has relied predominantly on context‐free grammars (CFGs), yet such formalisms are not sufficiently expressive for human languages. Combinatory categorial grammars (CCGs) are sufficiently expressive, directly compositional models of grammar with flexible constituency that affords incremental interpretation. In this work, we evaluate whether a more expressive CCG provides a better (...)
  • Parsing as a Cue-Based Retrieval Model. Jakub Dotlačil - 2021 - Cognitive Science 45 (8): e13020.
    This paper develops a novel psycholinguistic parser and tests it against experimental and corpus reading data. The parser builds on the recent research into memory structures, which argues that memory retrieval is content‐addressable and cue‐based. It is shown that the theory of cue‐based memory systems can be combined with transition‐based parsing to produce a parser that, when combined with the cognitive architecture ACT‐R, can model reading and predict online behavioral measures (reading times and regressions). The parser's modeling capacities are tested (...)
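    (A minimal cue-based retrieval sketch appears after this reference list.)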
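
Illustrative sketches

The entries above repeatedly appeal to a small set of computable notions, so a few editorial sketches follow. They illustrate the general ideas only and are not implementations from the cited papers. The first sketch shows how surprisal and entropy reduction, the two linking hypotheses discussed in Hale (2006) and Chen & Hale (2021), can be computed once one has a conditional distribution over complete continuations of a sentence prefix. The toy three-sentence language and all names in the code (LANGUAGE, prefix_prob, continuation_entropy, and so on) are assumptions invented for the demonstration.

```python
# Illustrative only: surprisal and entropy reduction over a toy finite
# language in which each "derivation" is a complete sentence with a
# probability.  Real models derive these quantities from a probabilistic
# grammar; the tiny dictionary below is an assumption for demonstration.

from math import log2

# Toy language: complete sentences and their probabilities (they sum to 1.0).
LANGUAGE = {
    ("the", "horse", "raced", "past", "the", "barn"): 0.6,
    ("the", "horse", "raced", "past", "the", "barn", "fell"): 0.1,
    ("the", "horse", "fell"): 0.3,
}

def prefix_prob(prefix):
    """Total probability of all sentences beginning with `prefix`."""
    return sum(p for s, p in LANGUAGE.items() if s[:len(prefix)] == tuple(prefix))

def continuation_entropy(prefix):
    """Conditional entropy (bits) over complete sentences given `prefix`."""
    z = prefix_prob(prefix)
    ent = 0.0
    for s, p in LANGUAGE.items():
        if s[:len(prefix)] == tuple(prefix):
            q = p / z
            ent -= q * log2(q)
    return ent

def surprisal(prefix, word):
    """-log2 P(word | prefix): how unexpected the next word is."""
    return -log2(prefix_prob(list(prefix) + [word]) / prefix_prob(prefix))

def entropy_reduction(prefix, word):
    """Drop in continuation entropy caused by `word`, floored at zero."""
    return max(0.0, continuation_entropy(prefix) - continuation_entropy(list(prefix) + [word]))

if __name__ == "__main__":
    prefix = ["the", "horse"]
    for w in ("raced", "fell"):
        print(w, round(surprisal(prefix, w), 3), "bits surprisal,",
              round(entropy_reduction(prefix, w), 3), "bits entropy reduction")
```

On the prefix "the horse", this toy distribution assigns "fell" both higher surprisal and a larger entropy reduction than "raced", which is the kind of word-by-word contrast these metrics are meant to capture.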
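
The second sketch illustrates informed search in the generic best-first sense invoked by Hale (2011): partial analyses wait in a priority queue, the cheapest one is expanded next, and a temporarily attractive but ultimately doomed analysis (a garden path) shows up as extra expansions before the goal is reached. The successor table, state labels, and step costs are invented for the demonstration and do not reproduce the article's grammar or its specific search heuristics.

```python
# Illustrative only: a generic best-first ("informed") search over parser
# states, expanding whichever partial analysis currently looks cheapest.
# The states and costs below are invented for the demo.

import heapq

# Each state is (words covered, analysis label); SUCCESSORS maps a state to
# alternative next states with step costs (lower cost = preferred analysis).
SUCCESSORS = {
    ("", "start"): [
        (1.0, ("the horse raced", "main-verb")),
        (2.3, ("the horse raced", "reduced-relative")),
    ],
    ("the horse raced", "main-verb"): [
        (1.0, ("the horse raced past the barn", "main-verb")),
    ],
    ("the horse raced", "reduced-relative"): [
        (1.2, ("the horse raced past the barn", "reduced-relative")),
    ],
    ("the horse raced past the barn", "reduced-relative"): [
        (1.0, ("the horse raced past the barn fell", "done")),
    ],
    # The main-verb analysis dead-ends when "fell" arrives: no successors.
    ("the horse raced past the barn", "main-verb"): [],
}

def best_first(start, is_goal):
    """Expand states in order of accumulated cost; count expansions as a
    rough index of processing effort (garden paths cost extra expansions)."""
    frontier = [(0.0, start)]
    seen, steps = set(), 0
    while frontier:
        cost, state = heapq.heappop(frontier)
        if state in seen:
            continue
        seen.add(state)
        steps += 1
        if is_goal(state):
            return cost, steps
        for step_cost, nxt in SUCCESSORS.get(state, []):
            heapq.heappush(frontier, (cost + step_cost, nxt))
    return None, steps

if __name__ == "__main__":
    cost, steps = best_first(("", "start"), lambda s: s[1] == "done")
    print(f"goal reached with cost {cost:.1f} after {steps} expansions")
```

Counting expansions is only a rough stand-in for the article's notion of processing effort, but it makes the search-order intuition concrete: the dead-end main-verb analysis is explored first and adds expansions before the reduced-relative analysis completes.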
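
The last sketch is a minimal, ACT-R-flavoured cue-based retrieval step of the general kind the parser in Dotlačil (2021) builds on: the current word sets retrieval cues, chunks in memory gain activation for matching cues and lose it for mismatches, and the most active chunk is retrieved with a latency that shrinks as activation grows. The feature names, memory contents, and parameter values are assumptions for illustration; only the general shape of the activation and latency computations follows standard ACT-R practice.

```python
# Illustrative only: a simplified cue-based retrieval step.  Chunk contents,
# feature names, and parameter values are assumptions for the demo, not the
# settings of Dotlačil's (2021) parser.

from math import exp

# Chunks in memory: each is a dict of feature -> value (e.g., earlier NPs).
MEMORY = [
    {"cat": "NP", "number": "sg", "animate": "yes"},   # "the dog"
    {"cat": "NP", "number": "pl", "animate": "no"},    # "the boxes"
]

def activation(chunk, cues, base=0.0, W=1.0, mismatch_penalty=1.5):
    """Base-level activation plus spreading activation from matching cues,
    minus a penalty for each cue the chunk fails to match."""
    weight = W / len(cues)          # source activation split across the cues
    a = base
    for feature, value in cues.items():
        if chunk.get(feature) == value:
            a += weight
        else:
            a -= mismatch_penalty * weight
    return a

def retrieve(cues, latency_factor=0.2):
    """Return the most active chunk and an ACT-R-style retrieval latency
    (latency_factor * exp(-activation)); higher activation = faster retrieval."""
    scored = [(activation(c, cues), c) for c in MEMORY]
    best_a, best_c = max(scored, key=lambda x: x[0])
    return best_c, latency_factor * exp(-best_a)

if __name__ == "__main__":
    # A verb needing a singular, animate subject sets these retrieval cues.
    chunk, latency = retrieve({"cat": "NP", "number": "sg", "animate": "yes"})
    print("retrieved:", chunk, f"latency: {latency * 1000:.0f} ms")
```

Swapping in different cues changes both which chunk wins the retrieval and how quickly it comes back, which is the basic mechanism the paper links to reading times and regressions.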