
Lower Beta: A Central Coordinator of Temporal Prediction in Multimodal Speech

How the brain decomposes and integrates information in multimodal speech perception is linked to oscillatory dynamics. However, how speech takes advantage of redundancy between different sensory modalities, and how this translates into specific oscillatory patterns, remain unclear. We address the ro...

Full description

Bibliographic Details
Main Authors: Biau, Emmanuel, Kotz, Sonja A.
Format: Online Article Text
Language: English
Published: Frontiers Media S.A. 2018
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6207805/
https://www.ncbi.nlm.nih.gov/pubmed/30405383
http://dx.doi.org/10.3389/fnhum.2018.00434
_version_ 1783366588943040512
author Biau, Emmanuel
Kotz, Sonja A.
author_facet Biau, Emmanuel
Kotz, Sonja A.
author_sort Biau, Emmanuel
collection PubMed
description How the brain decomposes and integrates information in multimodal speech perception is linked to oscillatory dynamics. However, how speech takes advantage of redundancy between different sensory modalities, and how this translates into specific oscillatory patterns, remain unclear. We address the role of lower beta activity (~20 Hz), generally associated with motor functions, as an amodal central coordinator that receives bottom-up delta-theta copies from specific sensory areas and generates top-down temporal predictions for auditory entrainment. Dissociating temporal prediction from entrainment may explain how and why visual input benefits speech processing rather than adding cognitive load in multimodal speech perception. On the one hand, body movements convey prosodic and syllabic features at delta and theta rates (i.e., 1–3 Hz and 4–7 Hz, respectively). On the other hand, the natural precedence of visual input before auditory onsets may prepare the brain to anticipate and facilitate the integration of auditory delta-theta copies of the prosodic-syllabic structure. Here, we identify three fundamental criteria, based on recent evidence and hypotheses, which support the notion that lower motor beta frequency may play a central and generic role in temporal prediction during speech perception. First, beta activity must respond to rhythmic stimulation across modalities. Second, beta power must respond to biological motion and speech-related movements conveying temporal information in multimodal speech processing. Third, temporal prediction may recruit a communication loop between motor and primary auditory cortices (PACs) via delta-to-beta cross-frequency coupling. We discuss evidence related to each criterion and extend these concepts to a beta-motivated framework of multimodal speech processing.
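The abstract's third criterion hinges on delta-to-beta cross-frequency coupling, i.e., the phase of a slow delta rhythm modulating the amplitude of beta activity. A minimal sketch of how such phase-amplitude coupling is commonly quantified is shown below on synthetic data; this is not the authors' analysis pipeline, and the band limits (1–3 Hz delta, 15–25 Hz beta), sampling rate, and the mean-vector-length modulation index are assumptions chosen for illustration.

```python
# Illustrative sketch: delta-to-beta phase-amplitude coupling (PAC)
# on a synthetic signal, quantified with the mean-vector-length
# modulation index. All parameters here are illustrative assumptions.
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

fs = 500                       # sampling rate in Hz (assumed)
t = np.arange(0, 20, 1 / fs)   # 20 s of synthetic "EEG"

# Synthetic signal: a 2 Hz delta rhythm whose phase modulates the
# amplitude of a 20 Hz beta rhythm, plus white noise.
rng = np.random.default_rng(0)
delta_phase = 2 * np.pi * 2 * t
beta = (1 + np.cos(delta_phase)) * np.cos(2 * np.pi * 20 * t)
x = np.cos(delta_phase) + 0.5 * beta + 0.2 * rng.standard_normal(t.size)

def bandpass(sig, lo, hi, fs, order=4):
    """Zero-phase Butterworth band-pass filter (SOS form for stability)."""
    sos = butter(order, [lo, hi], btype="band", fs=fs, output="sos")
    return sosfiltfilt(sos, sig)

# Instantaneous delta phase (1-3 Hz) and beta amplitude envelope
# (15-25 Hz) via the Hilbert transform.
phase = np.angle(hilbert(bandpass(x, 1, 3, fs)))
amp = np.abs(hilbert(bandpass(x, 15, 25, fs)))

# Mean-vector-length modulation index: near 0 when beta amplitude is
# independent of delta phase, larger when the two are coupled.
mi = np.abs(np.mean(amp * np.exp(1j * phase)))
print(f"delta-to-beta modulation index: {mi:.3f}")
```

Because the synthetic beta envelope follows the delta phase by construction, the modulation index comes out well above zero; on phase-shuffled surrogate data the same measure would hover near zero, which is how coupling is typically tested for significance.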
format Online
Article
Text
id pubmed-6207805
institution National Center for Biotechnology Information
language English
publishDate 2018
publisher Frontiers Media S.A.
record_format MEDLINE/PubMed
spelling pubmed-6207805 2018-11-07 Lower Beta: A Central Coordinator of Temporal Prediction in Multimodal Speech Biau, Emmanuel Kotz, Sonja A. Front Hum Neurosci Neuroscience
Frontiers Media S.A. 2018-10-24 /pmc/articles/PMC6207805/ /pubmed/30405383 http://dx.doi.org/10.3389/fnhum.2018.00434 Text en Copyright © 2018 Biau and Kotz. http://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
spellingShingle Neuroscience
Biau, Emmanuel
Kotz, Sonja A.
Lower Beta: A Central Coordinator of Temporal Prediction in Multimodal Speech
title Lower Beta: A Central Coordinator of Temporal Prediction in Multimodal Speech
title_full Lower Beta: A Central Coordinator of Temporal Prediction in Multimodal Speech
title_fullStr Lower Beta: A Central Coordinator of Temporal Prediction in Multimodal Speech
title_full_unstemmed Lower Beta: A Central Coordinator of Temporal Prediction in Multimodal Speech
title_short Lower Beta: A Central Coordinator of Temporal Prediction in Multimodal Speech
title_sort lower beta: a central coordinator of temporal prediction in multimodal speech
topic Neuroscience
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6207805/
https://www.ncbi.nlm.nih.gov/pubmed/30405383
http://dx.doi.org/10.3389/fnhum.2018.00434
work_keys_str_mv AT biauemmanuel lowerbetaacentralcoordinatoroftemporalpredictioninmultimodalspeech
AT kotzsonjaa lowerbetaacentralcoordinatoroftemporalpredictioninmultimodalspeech