
Cortical tracking of formant modulations derived from silently presented lip movements and its decline with age

The integration of visual and auditory cues is crucial for successful processing of speech, especially under adverse conditions. Recent reports have shown that when participants watch muted videos of speakers, the phonological information about the acoustic speech envelope, which is associated with but independent from the speakers’ lip movements, is tracked by the visual cortex. However, the speech signal also carries richer acoustic details, for example, about the fundamental frequency and the resonant frequencies, whose visuophonological transformation could aid speech processing. Here, we investigated the neural basis of the visuo-phonological transformation processes of these more fine-grained acoustic details and assessed how they change as a function of age. We recorded whole-head magnetoencephalographic (MEG) data while the participants watched silent normal (i.e., natural) and reversed videos of a speaker and paid attention to their lip movements. We found that the visual cortex is able to track the unheard natural modulations of resonant frequencies (or formants) and the pitch (or fundamental frequency) linked to lip movements. Importantly, only the processing of natural unheard formants decreases significantly with age in the visual and also in the cingulate cortex. This is not the case for the processing of the unheard speech envelope, the fundamental frequency, or the purely visual information carried by lip movements. These results show that unheard spectral fine details (along with the unheard acoustic envelope) are transformed from a mere visual to a phonological representation. Aging affects especially the ability to derive spectral dynamics at formant frequencies. As listening in noisy environments should capitalize on the ability to track spectral fine details, our results provide a novel focus on compensatory processes in such challenging situations.

Bibliographic Details

Main Authors: Suess, Nina; Hauswald, Anne; Reisinger, Patrick; Rösch, Sebastian; Keitel, Anne; Weisz, Nathan
Format: Online Article (Text)
Language: English
Published: Oxford University Press, 2022
Online Access:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9627034/
https://www.ncbi.nlm.nih.gov/pubmed/35062025
http://dx.doi.org/10.1093/cercor/bhab518
Journal: Cereb Cortex (Original Article)
Published Online: 2022-01-22
License: © The Author(s) 2022. Published by Oxford University Press. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted reuse, distribution, and reproduction in any medium, provided the original work is properly cited.