  • Prediction in Joint Action: What, When, and Where. Natalie Sebanz & Guenther Knoblich - 2009 - Topics in Cognitive Science 1 (2):353-367.
    Drawing on recent findings in the cognitive and neurosciences, this article discusses how people manage to predict each other’s actions, which is fundamental for joint action. We explore how a common coding of perceived and performed actions may allow actors to predict the what, when, and where of others’ actions. The “what” aspect refers to predictions about the kind of action the other will perform and to the intention that drives the action. The “when” aspect is critical for all joint (...)
    (67 citations)
  • The functional role of cross-frequency coupling. Ryan T. Canolty & Robert T. Knight - 2010 - Trends in Cognitive Sciences 14 (11):506-515.
    (59 citations)
  • Cortical oscillations and sensory predictions. Luc H. Arnal & Anne-Lise Giraud - 2012 - Trends in Cognitive Sciences 16 (7):390-398.
    (45 citations)
  • Effects of Practice and Experience on the Arcuate Fasciculus: Comparing Singers, Instrumentalists, and Non-Musicians. Gus F. Halwani, Psyche Loui, Theodor Rüber & Gottfried Schlaug - 2011 - Frontiers in Psychology 2.
    (25 citations)
  • Oscillatory Brain Responses Reflect Anticipation during Comprehension of Speech Acts in Spoken Dialog. Rosa S. Gisladottir, Sara Bögels & Stephen C. Levinson - 2018 - Frontiers in Human Neuroscience 12:309932.
    Everyday conversation requires listeners to quickly recognize verbal actions, so-called speech acts, from the underspecified linguistic code and prepare a relevant response within the tight time constraints of turn-taking. The goal of this study was to determine the time-course of speech act recognition by investigating oscillatory EEG activity during comprehension of spoken dialog. Participants listened to short, spoken dialogs with target utterances that delivered three distinct speech acts (Answers, Declinations, Pre-offers). The targets were identical across conditions at lexico-syntactic and phonetic/prosodic (...)
    (8 citations)
  • Seeing to hear better: evidence for early audio-visual interactions in speech identification. Jean-Luc Schwartz, Frédéric Berthommier & Christophe Savariaux - 2004 - Cognition 93 (2):69-78.
    Lip reading is the ability to partially understand speech by looking at the speaker's lips. It improves the intelligibility of speech in noise when audio-visual perception is compared with audio-only perception. A recent set of experiments showed that seeing the speaker's lips also enhances sensitivity to acoustic information, decreasing the auditory detection threshold of speech embedded in noise [J. Acoust. Soc. Am. 109 (2001) 2272; J. Acoust. Soc. Am. 108 (2000) 1197]. However, detection is different from comprehension, and it remains (...)
    (17 citations)
  • Cortical Oscillations in Auditory Perception and Speech: Evidence for Two Temporal Windows in Human Auditory Cortex. Huan Luo & David Poeppel - 2012 - Frontiers in Psychology 3.
    (11 citations)
  • When knowing can replace seeing in audiovisual integration of actions. Karin Petrini, Melanie Russell & Frank Pollick - 2009 - Cognition 110 (3):432-439.
    (12 citations)
  • Seeing music performance: Visual influences on perception and experience. William Forde Thompson, Phil Graham & Frank A. Russo - 2005 - Semiotica 2005 (156):203-227.
    Drawing from ethnographic, empirical, and historical/cultural perspectives, we examine the extent to which visual aspects of music contribute to the communication that takes place between performers and their listeners. First, we introduce a framework for understanding how media and genres shape aural and visual experiences of music. Second, we present case studies of two performances, and describe the relation between visual and aural aspects of performance. Third, we report empirical evidence that visual aspects of performance reliably influence perceptions (...)
    (13 citations)
  • Synchronization by the hand: the sight of gestures modulates low-frequency activity in brain responses to continuous speech. Emmanuel Biau & Salvador Soto-Faraco - 2015 - Frontiers in Human Neuroscience 9.
    (4 citations)
  • Musical Expertise Affects Audiovisual Speech Perception: Findings From Event-Related Potentials and Inter-trial Phase Coherence. Marzieh Sorati & Dawn Marie Behne - 2019 - Frontiers in Psychology 10.
    (2 citations)