
Lip movements entrain the observers’ low-frequency brain oscillations to facilitate speech intelligibility


Bibliographic Details
Main Authors: Park, Hyojin; Kayser, Christoph; Thut, Gregor; Gross, Joachim
Format: Online Article (Text)
Language: English
Published: eLife Sciences Publications, Ltd, 2016
Subjects: Neuroscience
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4900800/
https://www.ncbi.nlm.nih.gov/pubmed/27146891
http://dx.doi.org/10.7554/eLife.14521
Collection: PubMed
Abstract: During continuous speech, lip movements provide visual temporal signals that facilitate speech processing. Here, using MEG, we directly investigated how these visual signals interact with rhythmic brain activity in participants listening to and seeing the speaker. First, we investigated coherence between oscillatory brain activity and the speaker’s lip movements and demonstrated significant entrainment in visual cortex. We then used partial coherence to remove contributions of the coherent auditory speech signal from the lip–brain coherence. Comparing this synchronization between different attention conditions revealed that attending to visual speech enhances the coherence between activity in visual cortex and the speaker’s lips. Further, we identified a significant partial coherence between left motor cortex and lip movements, and this partial coherence directly predicted comprehension accuracy. Our results emphasize the importance of visually entrained and attention-modulated rhythmic brain activity for the enhancement of audiovisual speech processing. DOI: http://dx.doi.org/10.7554/eLife.14521.001
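The partial-coherence step described in the abstract (removing the contribution of the coherent auditory speech signal from the lip–brain coherence) can be illustrated with a minimal sketch. This is not the authors' actual MEG pipeline; it is a toy implementation assuming Welch cross-spectral estimates from SciPy, and the signal names (`x` for a brain channel, `y` for lip aperture, `z` for the auditory speech envelope) are hypothetical placeholders.

```python
import numpy as np
from scipy.signal import coherence, csd

def partial_coherence(x, y, z, fs, nperseg=256):
    """Coherence between x and y after removing the linear contribution of z.

    Partializes the Welch cross-spectra, S_ab|z = S_ab - S_az * S_zb / S_zz,
    then forms |S_xy|z|^2 / (S_xx|z * S_yy|z) per frequency bin.
    """
    f, Sxy = csd(x, y, fs=fs, nperseg=nperseg)
    _, Sxx = csd(x, x, fs=fs, nperseg=nperseg)
    _, Syy = csd(y, y, fs=fs, nperseg=nperseg)
    _, Sxz = csd(x, z, fs=fs, nperseg=nperseg)
    _, Szy = csd(z, y, fs=fs, nperseg=nperseg)
    _, Szz = csd(z, z, fs=fs, nperseg=nperseg)
    Sxy_p = Sxy - Sxz * Szy / Szz
    Sxx_p = np.real(Sxx - Sxz * np.conj(Sxz) / Szz)
    Syy_p = np.real(Syy - np.conj(Szy) * Szy / Szz)
    return f, np.abs(Sxy_p) ** 2 / (Sxx_p * Syy_p)

# Toy demo: x ("brain") and y ("lips") share a common driver z ("auditory
# speech"); partialling out z should strip away the shared coherence.
rng = np.random.default_rng(0)
fs, n = 200.0, 20000
z = rng.standard_normal(n)
x = z + 0.5 * rng.standard_normal(n)  # hypothetical MEG-like signal
y = z + 0.5 * rng.standard_normal(n)  # hypothetical lip-aperture signal
_, coh = coherence(x, y, fs=fs, nperseg=256)
_, pcoh = partial_coherence(x, y, z, fs)
```

In this toy case the ordinary lip–brain coherence is substantial (the two signals share `z`), while the partial coherence collapses toward zero, mirroring the paper's logic: any lip–brain synchronization that survives the partialization cannot be explained by the auditory speech signal alone.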
Record ID: pubmed-4900800
Institution: National Center for Biotechnology Information
Record Format: MEDLINE/PubMed
Journal: eLife (Neuroscience)
Published online: 2016-05-05
© 2016, Park et al. This article is distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use and redistribution provided that the original author and source are credited.