MEG Activity in Visual and Auditory Cortices Represents Acoustic Speech-Related Information during Silent Lip Reading


Bibliographic Details
Main Authors: Bröhl, Felix; Keitel, Anne; Kayser, Christoph
Format: Online Article Text
Language: English
Published: Society for Neuroscience, 2022
Subjects: Research Article: New Research
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9239847/
https://www.ncbi.nlm.nih.gov/pubmed/35728955
http://dx.doi.org/10.1523/ENEURO.0209-22.2022
author Bröhl, Felix
Keitel, Anne
Kayser, Christoph
collection PubMed
description Speech is an intrinsically multisensory signal, and seeing the speaker’s lips forms a cornerstone of communication in acoustically impoverished environments. Still, it remains unclear how the brain exploits visual speech for comprehension. Previous work debated whether lip signals are mainly processed along the auditory pathways or whether the visual system directly implements speech-related processes. To probe this, we systematically characterized dynamic representations of multiple acoustic and visual speech-derived features in source localized MEG recordings that were obtained while participants listened to speech or viewed silent speech. Using a mutual-information framework we provide a comprehensive assessment of how well temporal and occipital cortices reflect the physically presented signals and unique aspects of acoustic features that were physically absent but may be critical for comprehension. Our results demonstrate that both cortices feature a functionally specific form of multisensory restoration: during lip reading, they reflect unheard acoustic features, independent of co-existing representations of the visible lip movements. This restoration emphasizes the unheard pitch signature in occipital cortex and the speech envelope in temporal cortex and is predictive of lip-reading performance. These findings suggest that when seeing the speaker’s lips, the brain engages both visual and auditory pathways to support comprehension by exploiting multisensory correspondences between lip movements and spectro-temporal acoustic cues.
format Online
Article
Text
id pubmed-9239847
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher Society for Neuroscience
record_format MEDLINE/PubMed
spelling pubmed-9239847 2022-06-29
MEG Activity in Visual and Auditory Cortices Represents Acoustic Speech-Related Information during Silent Lip Reading
Bröhl, Felix
Keitel, Anne
Kayser, Christoph
eNeuro, Research Article: New Research
Speech is an intrinsically multisensory signal, and seeing the speaker’s lips forms a cornerstone of communication in acoustically impoverished environments. Still, it remains unclear how the brain exploits visual speech for comprehension. Previous work debated whether lip signals are mainly processed along the auditory pathways or whether the visual system directly implements speech-related processes. To probe this, we systematically characterized dynamic representations of multiple acoustic and visual speech-derived features in source localized MEG recordings that were obtained while participants listened to speech or viewed silent speech. Using a mutual-information framework we provide a comprehensive assessment of how well temporal and occipital cortices reflect the physically presented signals and unique aspects of acoustic features that were physically absent but may be critical for comprehension. Our results demonstrate that both cortices feature a functionally specific form of multisensory restoration: during lip reading, they reflect unheard acoustic features, independent of co-existing representations of the visible lip movements. This restoration emphasizes the unheard pitch signature in occipital cortex and the speech envelope in temporal cortex and is predictive of lip-reading performance. These findings suggest that when seeing the speaker’s lips, the brain engages both visual and auditory pathways to support comprehension by exploiting multisensory correspondences between lip movements and spectro-temporal acoustic cues.
Society for Neuroscience 2022-06-27
/pmc/articles/PMC9239847/
/pubmed/35728955
http://dx.doi.org/10.1523/ENEURO.0209-22.2022
Text en
Copyright © 2022 Bröhl et al. This is an open-access article distributed under the terms of the Creative Commons Attribution 4.0 International license (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution and reproduction in any medium provided that the original work is properly attributed.
title MEG Activity in Visual and Auditory Cortices Represents Acoustic Speech-Related Information during Silent Lip Reading
topic Research Article: New Research
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9239847/
https://www.ncbi.nlm.nih.gov/pubmed/35728955
http://dx.doi.org/10.1523/ENEURO.0209-22.2022