Differential Auditory and Visual Phase-Locking Are Observed during Audio-Visual Benefit and Silent Lip-Reading for Speech Perception
Speech perception in noisy environments is enhanced by seeing facial movements of communication partners. However, the neural mechanisms by which audio and visual speech are combined are not fully understood. We explore MEG phase-locking to auditory and visual signals in MEG recordings from 14 human...
| Main Authors: | Aller, Máté; Økland, Heidi Solberg; MacGregor, Lucy J.; Blank, Helen; Davis, Matthew H. |
|---|---|
| Format: | Online Article Text |
| Language: | English |
| Published: | Society for Neuroscience, 2022 |
| Subjects: | Research Articles |
| Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9351641/ https://www.ncbi.nlm.nih.gov/pubmed/35760528 http://dx.doi.org/10.1523/JNEUROSCI.2476-21.2022 |
_version_ | 1784762478646788096 |
---|---|
author | Aller, Máté; Økland, Heidi Solberg; MacGregor, Lucy J.; Blank, Helen; Davis, Matthew H. |
author_facet | Aller, Máté; Økland, Heidi Solberg; MacGregor, Lucy J.; Blank, Helen; Davis, Matthew H. |
author_sort | Aller, Máté |
collection | PubMed |
description | Speech perception in noisy environments is enhanced by seeing facial movements of communication partners. However, the neural mechanisms by which audio and visual speech are combined are not fully understood. We explored MEG phase-locking to auditory and visual signals in MEG recordings from 14 human participants (6 females, 8 males) who reported words from single spoken sentences. We manipulated the acoustic clarity and visual speech signals such that critical speech information was present in auditory, visual, or both modalities. MEG coherence analysis revealed that both auditory and visual speech envelopes (auditory amplitude modulations and lip aperture changes) were phase-locked to 2-6 Hz brain responses in auditory and visual cortex, consistent with entrainment to syllable-rate components. Partial coherence analysis was used to separate neural responses to correlated audio-visual signals and showed non-zero phase-locking to the auditory envelope in occipital cortex during audio-visual (AV) speech (see the coherence sketch after this record). Furthermore, phase-locking to auditory signals in visual cortex was enhanced for AV speech compared with audio-only speech that was matched for intelligibility. Conversely, auditory regions of the superior temporal gyrus did not show above-chance partial coherence with visual speech signals during AV conditions but did show partial coherence in visual-only conditions. Hence, visual speech enabled stronger phase-locking to auditory signals in visual areas, whereas phase-locking to visual speech in auditory regions occurred only during silent lip-reading. These differences in cross-modal interactions between auditory and visual speech signals are interpreted in line with cross-modal predictive mechanisms during speech perception. SIGNIFICANCE STATEMENT Verbal communication in noisy environments is challenging, especially for hearing-impaired individuals. Seeing facial movements of communication partners improves speech perception when auditory signals are degraded or absent. The neural mechanisms supporting lip-reading or audio-visual benefit are not fully understood. Using MEG recordings and partial coherence analysis, we show that speech information is used differently in brain regions that respond to auditory and visual speech. While visual areas use visual speech to improve phase-locking to auditory speech signals, auditory areas do not show phase-locking to visual speech unless auditory speech is absent and visual speech is used to substitute for missing auditory signals. These findings highlight brain processes that combine visual and auditory signals to support speech understanding. |
format | Online Article Text |
id | pubmed-9351641 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2022 |
publisher | Society for Neuroscience |
record_format | MEDLINE/PubMed |
spelling | pubmed-9351641 2022-08-05 Differential Auditory and Visual Phase-Locking Are Observed during Audio-Visual Benefit and Silent Lip-Reading for Speech Perception. Aller, Máté; Økland, Heidi Solberg; MacGregor, Lucy J.; Blank, Helen; Davis, Matthew H. J Neurosci, Research Articles. Society for Neuroscience, 2022-08-03. /pmc/articles/PMC9351641/ /pubmed/35760528 http://dx.doi.org/10.1523/JNEUROSCI.2476-21.2022. Text en. Copyright © 2022 Aller et al. This is an open-access article distributed under the terms of the Creative Commons Attribution 4.0 International license (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium provided that the original work is properly attributed. |
spellingShingle | Research Articles; Aller, Máté; Økland, Heidi Solberg; MacGregor, Lucy J.; Blank, Helen; Davis, Matthew H.; Differential Auditory and Visual Phase-Locking Are Observed during Audio-Visual Benefit and Silent Lip-Reading for Speech Perception |
title | Differential Auditory and Visual Phase-Locking Are Observed during Audio-Visual Benefit and Silent Lip-Reading for Speech Perception |
title_full | Differential Auditory and Visual Phase-Locking Are Observed during Audio-Visual Benefit and Silent Lip-Reading for Speech Perception |
title_fullStr | Differential Auditory and Visual Phase-Locking Are Observed during Audio-Visual Benefit and Silent Lip-Reading for Speech Perception |
title_full_unstemmed | Differential Auditory and Visual Phase-Locking Are Observed during Audio-Visual Benefit and Silent Lip-Reading for Speech Perception |
title_short | Differential Auditory and Visual Phase-Locking Are Observed during Audio-Visual Benefit and Silent Lip-Reading for Speech Perception |
title_sort | differential auditory and visual phase-locking are observed during audio-visual benefit and silent lip-reading for speech perception |
topic | Research Articles |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9351641/ https://www.ncbi.nlm.nih.gov/pubmed/35760528 http://dx.doi.org/10.1523/JNEUROSCI.2476-21.2022 |
work_keys_str_mv | AT allermate differentialauditoryandvisualphaselockingareobservedduringaudiovisualbenefitandsilentlipreadingforspeechperception AT øklandheidisolberg differentialauditoryandvisualphaselockingareobservedduringaudiovisualbenefitandsilentlipreadingforspeechperception AT macgregorlucyj differentialauditoryandvisualphaselockingareobservedduringaudiovisualbenefitandsilentlipreadingforspeechperception AT blankhelen differentialauditoryandvisualphaselockingareobservedduringaudiovisualbenefitandsilentlipreadingforspeechperception AT davismatthewh differentialauditoryandvisualphaselockingareobservedduringaudiovisualbenefitandsilentlipreadingforspeechperception |
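The description field above reports coherence and partial coherence between MEG responses and speech signals in the 2-6 Hz band. As a minimal, self-contained sketch of that kind of computation (not the authors' actual source-space MEG pipeline), the Python below estimates magnitude-squared coherence and partial coherence from Welch auto- and cross-spectra; the synthetic signals, sampling rate, and function names are illustrative assumptions only.

```python
import numpy as np
from scipy.signal import csd, welch

def coherence(x, y, fs, nperseg=1024):
    """Magnitude-squared coherence from Welch auto-/cross-spectra."""
    f, sxy = csd(x, y, fs=fs, nperseg=nperseg)
    _, sxx = welch(x, fs=fs, nperseg=nperseg)
    _, syy = welch(y, fs=fs, nperseg=nperseg)
    return f, np.abs(sxy) ** 2 / (sxx * syy)

def partial_coherence(x, y, z, fs, nperseg=1024):
    """Coherence between x and y after removing the linear contribution of z."""
    f, sxy = csd(x, y, fs=fs, nperseg=nperseg)
    _, sxz = csd(x, z, fs=fs, nperseg=nperseg)
    _, szy = csd(z, y, fs=fs, nperseg=nperseg)
    _, sxx = welch(x, fs=fs, nperseg=nperseg)
    _, syy = welch(y, fs=fs, nperseg=nperseg)
    _, szz = welch(z, fs=fs, nperseg=nperseg)
    sxy_z = sxy - sxz * szy / szz             # partial cross-spectrum given z
    sxx_z = sxx - np.abs(sxz) ** 2 / szz      # partial auto-spectra given z
    syy_z = syy - np.abs(szy) ** 2 / szz
    return f, np.abs(sxy_z) ** 2 / (sxx_z * syy_z)

# Toy demonstration with correlated stand-ins for the three signals.
rng = np.random.default_rng(0)
fs = 250.0                                    # assumed sampling rate, Hz
n = int(60 * fs)                              # 60 s of data
envelope = rng.standard_normal(n)             # stand-in auditory speech envelope
lip = 0.6 * envelope + rng.standard_normal(n) # correlated "lip aperture" signal
meg = 0.5 * envelope + rng.standard_normal(n) # stand-in MEG source time course

f, coh = coherence(meg, envelope, fs)
_, pcoh = partial_coherence(meg, envelope, lip, fs)
band = (f >= 2) & (f <= 6)                    # syllable-rate band from the abstract
print(f"2-6 Hz coherence:         {coh[band].mean():.3f}")
print(f"2-6 Hz partial coherence: {pcoh[band].mean():.3f}")
```

Because the toy lip signal is built to share variance with the envelope, the 2-6 Hz partial coherence comes out lower than the raw coherence; that gap is the point of the method, since the partialled estimate indexes phase-locking to the auditory envelope that cannot be attributed to the correlated visual signal.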