An fMRI Study of Audiovisual Speech Perception Reveals Multisensory Interactions in Auditory Cortex

Research on the neural basis of speech-reading implicates a network of auditory language regions involving inferior frontal cortex, premotor cortex, and sites along superior temporal cortex. In audiovisual speech studies, neural activity is consistently reported in the posterior superior temporal sulcus...


Bibliographic Details
Main Authors: Okada, Kayoko, Venezia, Jonathan H., Matchin, William, Saberi, Kourosh, Hickok, Gregory
Format: Online Article Text
Language: English
Published: Public Library of Science 2013
Subjects: Research Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3689691/
https://www.ncbi.nlm.nih.gov/pubmed/23805332
http://dx.doi.org/10.1371/journal.pone.0068959
author Okada, Kayoko
Venezia, Jonathan H.
Matchin, William
Saberi, Kourosh
Hickok, Gregory
collection PubMed
description Research on the neural basis of speech-reading implicates a network of auditory language regions involving inferior frontal cortex, premotor cortex, and sites along superior temporal cortex. In audiovisual speech studies, neural activity is consistently reported in the posterior superior temporal sulcus (pSTS), and this site has been implicated in multimodal integration. Traditionally, multisensory interactions are considered high-level processing that engages heteromodal association cortices (such as STS). Recent work, however, challenges this notion and suggests that multisensory interactions may occur in low-level unimodal sensory cortices. While previous audiovisual speech studies demonstrate that high-level multisensory interactions occur in pSTS, it remains unclear how early in the processing hierarchy these multisensory interactions may occur. The goal of the present fMRI experiment is to investigate how visual speech can influence activity in auditory cortex above and beyond its response to auditory speech. In an audiovisual speech experiment, subjects were presented with auditory speech with and without congruent visual input. Holding the auditory stimulus constant across the experiment, we investigated how the addition of visual speech influences activity in auditory cortex. We demonstrate that congruent visual speech increases activity in auditory cortex.
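The design described above reduces to a within-subject contrast: the auditory signal is identical across conditions, so any extra auditory-cortex response in the audiovisual (AV) condition relative to the auditory-only (A) condition is attributable to the added visual speech. As a minimal, purely illustrative sketch of such a contrast (this is not the authors' analysis pipeline; the per-subject ROI values, subject count, and effect size below are hypothetical), a paired comparison of ROI-averaged response estimates might look like:

```python
import numpy as np
from scipy import stats

# Hypothetical ROI-averaged response estimates (e.g., GLM betas) for an
# auditory-cortex region, one value per subject per condition. In a real
# study these would come from each subject's first-level fMRI model.
rng = np.random.default_rng(0)
n_subjects = 20
a_only = rng.normal(loc=1.0, scale=0.3, size=n_subjects)        # auditory-only
av = a_only + rng.normal(loc=0.15, scale=0.2, size=n_subjects)  # audiovisual

# Paired comparison: with the auditory input held constant, does adding
# congruent visual speech increase the auditory-cortex response?
t_stat, p_value = stats.ttest_rel(av, a_only)
print(f"mean AV - A difference: {np.mean(av - a_only):.3f}")
print(f"paired t({n_subjects - 1}) = {t_stat:.2f}, p = {p_value:.4f}")
```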
format Online
Article
Text
id pubmed-3689691
institution National Center for Biotechnology Information
language English
publishDate 2013
publisher Public Library of Science
record_format MEDLINE/PubMed
spelling pubmed-3689691 2013-06-26 PLoS One Research Article Public Library of Science 2013-06-21 /pmc/articles/PMC3689691/ /pubmed/23805332 http://dx.doi.org/10.1371/journal.pone.0068959 Text en © 2013 Okada et al http://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are properly credited.
title An fMRI Study of Audiovisual Speech Perception Reveals Multisensory Interactions in Auditory Cortex
topic Research Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3689691/
https://www.ncbi.nlm.nih.gov/pubmed/23805332
http://dx.doi.org/10.1371/journal.pone.0068959
work_keys_str_mv AT okadakayoko anfmristudyofaudiovisualspeechperceptionrevealsmultisensoryinteractionsinauditorycortex
AT veneziajonathanh anfmristudyofaudiovisualspeechperceptionrevealsmultisensoryinteractionsinauditorycortex
AT matchinwilliam anfmristudyofaudiovisualspeechperceptionrevealsmultisensoryinteractionsinauditorycortex
AT saberikourosh anfmristudyofaudiovisualspeechperceptionrevealsmultisensoryinteractionsinauditorycortex
AT hickokgregory anfmristudyofaudiovisualspeechperceptionrevealsmultisensoryinteractionsinauditorycortex
AT okadakayoko fmristudyofaudiovisualspeechperceptionrevealsmultisensoryinteractionsinauditorycortex
AT veneziajonathanh fmristudyofaudiovisualspeechperceptionrevealsmultisensoryinteractionsinauditorycortex
AT matchinwilliam fmristudyofaudiovisualspeechperceptionrevealsmultisensoryinteractionsinauditorycortex
AT saberikourosh fmristudyofaudiovisualspeechperceptionrevealsmultisensoryinteractionsinauditorycortex
AT hickokgregory fmristudyofaudiovisualspeechperceptionrevealsmultisensoryinteractionsinauditorycortex