  • Conditionals in context: Brain signatures of prediction in discourse processing. Mathias Barthel, Rosario Tomasello & Mingya Liu - 2024 - Cognition 242 (C):105635.
  • Overrated gaps: Inter-speaker gaps provide limited information about the timing of turns in conversation. Ruth E. Corps, Birgit Knudsen & Antje S. Meyer - 2022 - Cognition 223 (C):105037.
  • Considerations in Audio-Visual Interaction Models: An ERP Study of Music Perception by Musicians and Non-musicians. Marzieh Sorati & Dawn M. Behne - 2021 - Frontiers in Psychology 11.
    Previous research with speech and non-speech stimuli suggested that in audiovisual perception, visual information starting prior to the onset of corresponding sound can provide visual cues, and form a prediction about the upcoming auditory sound. This prediction leads to audiovisual interaction. Auditory and visual perception interact and induce suppression and speeding up of the early auditory event-related potentials such as N1 and P2. To investigate AV interaction, previous research examined N1 and P2 amplitudes and latencies in response to audio only, (...)
  • Surface and Contextual Linguistic Cues in Dialog Act Classification: A Cognitive Science View. Guido M. Linders & Max M. Louwerse - 2023 - Cognitive Science 47 (10):e13367.
    What role do linguistic cues on a surface and contextual level have in identifying the intention behind an utterance? Drawing on the wealth of studies and corpora from the computational task of dialog act classification, we studied this question from a cognitive science perspective. We first reviewed the role of linguistic cues in dialog act classification studies that evaluated model performance on three of the most commonly used English dialog act corpora. Findings show that frequency‐based, machine learning, and deep learning (...)
  • The mental representation and social aspect of expressives. Stanley A. Donahoo & Vicky Tzuyin Lai - 2020 - Cognition and Emotion 34 (7):1423-1438.
    Despite increased focus on emotional language, research lacks for the most emotional language: Swearing. We used event-related potentials to investigate whether swear words have content dist...
  • Audiovisual Modulation in Music Perception for Musicians and Non-musicians. Marzieh Sorati & Dawn Marie Behne - 2020 - Frontiers in Psychology 11.
  • Conversational Eyebrow Frowns Facilitate Question Identification: An Online Study Using Virtual Avatars. Naomi Nota, James P. Trujillo & Judith Holler - 2023 - Cognitive Science 47 (12):e13392.
    Conversation is a time-pressured environment. Recognizing a social action (the "speech act," such as a question requesting information) early is crucial in conversation to quickly understand the intended message and plan a timely response. Fast turns between interlocutors are especially relevant for responses to questions since a long gap may be meaningful by itself. Human language is multimodal, involving speech as well as visual signals from the body, including the face. But little is known about how conversational facial signals contribute (...)
  • Musical Expertise Affects Audiovisual Speech Perception: Findings From Event-Related Potentials and Inter-trial Phase Coherence. Marzieh Sorati & Dawn Marie Behne - 2019 - Frontiers in Psychology 10.