  • Evaluating models of robust word recognition with serial reproduction. Stephan C. Meylan, Sathvik Nair & Thomas L. Griffiths - 2021 - Cognition 210 (C):104553.
    Spoken communication occurs in a “noisy channel” characterized by high levels of environmental noise, variability within and between speakers, and lexical and syntactic ambiguity. Given these properties of the received linguistic input, robust spoken word recognition—and language processing more generally—relies heavily on listeners' prior knowledge to evaluate whether candidate interpretations of that input are more or less likely. Here we compare several broad-coverage probabilistic generative language models in their ability to capture human linguistic expectations. Serial reproduction, an experimental paradigm where (...)
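
As a rough illustration of the noisy-channel inference this abstract describes, here is a minimal Bayesian word-recognition sketch: candidate interpretations of a noisy percept are scored by combining prior expectations with a noise model. The candidate words, prior, and likelihoods below are invented for illustration and are not from the paper.

```python
# Toy noisy-channel word recognition: score candidates by
# P(word | percept) proportional to P(percept | word) * P(word).
# All probabilities here are invented for illustration.

prior = {"wade": 0.2, "weighed": 0.5, "wait": 0.3}  # P(word): listener's prior
likelihood = {                                       # P(percept | word): noise model
    "wade": 0.6,
    "weighed": 0.5,
    "wait": 0.1,
}

def posterior(prior, likelihood):
    """Bayesian inversion: normalize prior * likelihood over candidates."""
    unnorm = {w: prior[w] * likelihood[w] for w in prior}
    z = sum(unnorm.values())
    return {w: p / z for w, p in unnorm.items()}

post = posterior(prior, likelihood)
for word, p in sorted(post.items(), key=lambda kv: -kv[1]):
    print(f"{word}: {p:.3f}")
```

Even with a moderate acoustic match, the high-prior candidate ("weighed") wins, which is the sense in which prior knowledge dominates recognition in a noisy channel.
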
  • Geometric Representations for Minimalist Grammars. Peter Beim Graben & Sabrina Gerth - 2012 - Journal of Logic, Language and Information 21 (4):393-432.
    We reformulate minimalist grammars as partial functions on term algebras for strings and trees. Using filler/role bindings and tensor product representations, we construct homomorphisms for these data structures into geometric vector spaces. We prove that the structure-building functions as well as simple processors for minimalist languages can be realized by piecewise linear operators in representation space. We also propose harmony, i.e. the distance of an intermediate processing step from the final well-formed state in representation space, as a measure of processing (...)
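
The filler/role binding construction can be sketched concretely. The toy example below uses one-hot filler and role vectors bound by outer products; it illustrates the general tensor-product idea, not the paper's specific homomorphisms for minimalist grammars.

```python
import numpy as np

# Toy tensor product representation (filler/role binding).
# Fillers are word vectors, roles are string positions; both one-hot
# for simplicity, which is our assumption, not the paper's construction.

fillers = {"the": np.array([1., 0., 0.]),
           "cat": np.array([0., 1., 0.]),
           "sleeps": np.array([0., 0., 1.])}
roles = {0: np.array([1., 0., 0.]),   # orthonormal position roles
         1: np.array([0., 1., 0.]),
         2: np.array([0., 0., 1.])}

# Encode "the cat sleeps" as a sum of filler (x) role outer products.
sentence = ["the", "cat", "sleeps"]
T = sum(np.outer(fillers[w], roles[i]) for i, w in enumerate(sentence))

# Unbinding: with orthonormal roles, multiplying the representation by
# a role vector recovers the filler bound to that position.
recovered = T @ roles[1]
print(recovered)  # -> [0. 1. 0.], the vector for "cat"
```

Because the whole structure lives in one vector space, structure-building operations and distances (such as the harmony measure the paper proposes) become ordinary linear-algebraic objects.
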
  • Uncertainty About the Rest of the Sentence. John Hale - 2006 - Cognitive Science 30 (4):643-672.
    A word-by-word human sentence processing complexity metric is presented. This metric formalizes the intuition that comprehenders have more trouble on words contributing larger amounts of information about the syntactic structure of the sentence as a whole. The formalization is in terms of the conditional entropy of grammatical continuations, given the words that have been heard so far. To calculate the predictions of this metric, Wilson and Carroll's (1954) original entropy reduction idea is extended to infinite languages. This is demonstrated with (...)
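
The entropy-reduction metric is easy to work through on a toy example. The sketch below computes, for each word, the drop in entropy over the sentences consistent with the prefix heard so far. The four-sentence language and its probabilities are invented, and restricting attention to a finite language sidesteps the infinite-language extension the paper actually develops.

```python
import math

# Toy entropy-reduction calculation over a small finite language.
# The language and probabilities are invented for illustration.

language = {                       # P(sentence) for a four-sentence language
    ("the", "dog", "barked"): 0.4,
    ("the", "dog", "slept"): 0.2,
    ("the", "cat", "slept"): 0.3,
    ("a", "cat", "slept"): 0.1,
}

def entropy_given_prefix(prefix):
    """Entropy (bits) over full sentences consistent with the prefix."""
    consistent = {s: p for s, p in language.items() if s[:len(prefix)] == prefix}
    z = sum(consistent.values())
    return -sum((p / z) * math.log2(p / z) for p in consistent.values())

# Entropy reduction at each word: max(0, H(prefix) - H(prefix + word)).
sentence = ("the", "dog", "barked")
for i, word in enumerate(sentence):
    before = entropy_given_prefix(sentence[:i])
    after = entropy_given_prefix(sentence[:i + 1])
    print(f"{word}: ER = {max(0.0, before - after):.3f} bits")
```

Words that rule out many continuations produce large entropy reductions, which is the metric's formalization of word-by-word processing difficulty.
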
  • Artificial Grammar Learning Capabilities in an Abstract Visual Task Match Requirements for Linguistic Syntax. Gesche Westphal-Fitch, Beatrice Giustolisi, Carlo Cecchetto, Jordan S. Martin & W. Tecumseh Fitch - 2018 - Frontiers in Psychology 9:387357.
    Whether pattern-parsing mechanisms are specific to language or apply across multiple cognitive domains remains unresolved. Formal language theory provides a mathematical framework for classifying pattern-generating rule sets (or “grammars”) according to complexity. This framework applies to patterns at any level of complexity, stretching from simple sequences, to highly complex tree-like or net-like structures, to any Turing-computable set of strings. Here, we explored human pattern-processing capabilities in the visual domain by generating abstract visual sequences made up of abstract tiles differing in (...)
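
To make the complexity hierarchy concrete, the toy generators below contrast a regular pattern, (AB)^n, with the context-free pattern A^nB^n, which requires counting or nesting. The letters stand in for visual tiles; these examples are our illustration, not the stimuli used in the study.

```python
# Toy generators for pattern families at two levels of the
# formal-language complexity hierarchy. "A" and "B" stand in for
# the abstract visual tiles; the examples are illustrative only.

def ab_n(n):
    """(AB)^n: a regular (finite-state) pattern."""
    return "AB" * n

def a_n_b_n(n):
    """A^n B^n: a context-free pattern needing counting/nesting."""
    return "A" * n + "B" * n

for n in range(1, 4):
    print(ab_n(n), a_n_b_n(n))
```
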