
Wednesday, March 14, 2018

Multisensory integration of speech sounds with letters versus visual speech: Only visual speech induces the mismatch negativity

Abstract

Numerous studies have demonstrated that seeing lip movements can alter the perception of auditory speech syllables (the McGurk effect). While there is ample evidence for integration of text and auditory speech, there are only a few studies on the orthographic equivalent of the McGurk effect. Here, we examined whether written text, like visual speech, can induce an illusory change in the perception of speech sounds at both the behavioral and neural levels. In a sound categorization task, we found that both text and visual speech changed the perceived identity of speech sounds from an /aba/-/ada/ continuum, but the size of this audiovisual effect was considerably smaller for text than for visual speech. To examine at which level of the information processing hierarchy these multisensory interactions occur, we recorded electroencephalography (EEG) in an audiovisual mismatch negativity (MMN, a component of the event-related potential [ERP] reflecting pre-attentive auditory change detection) paradigm in which deviant text or visual speech was used to induce an illusory change in a sequence of ambiguous sounds halfway between /aba/ and /ada/. We found that only deviant visual speech induced an MMN; deviant text instead induced a late P3-like positive potential. These results demonstrate that text has much weaker effects on sound processing than visual speech does, possibly because text has different biological roots than visual speech.

This article is protected by copyright. All rights reserved.



