MEG Activity in Visual and Auditory Cortices Represents Acoustic Speech-Related Information during Silent Lip Reading
Speech is an intrinsically multisensory signal, and seeing the speaker’s lips forms a cornerstone of communication in acoustically impoverished environments. Still, it remains unclear how the brain exploits visual speech for comprehension. Previous work debated whether lip signals are mainly process...
Main authors: Bröhl, Felix; Keitel, Anne; Kayser, Christoph
Format: Online Article Text
Language: English
Published: Society for Neuroscience, 2022
Online access:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9239847/
https://www.ncbi.nlm.nih.gov/pubmed/35728955
http://dx.doi.org/10.1523/ENEURO.0209-22.2022
Similar items
- Differential Auditory and Visual Phase-Locking Are Observed during Audio-Visual Benefit and Silent Lip-Reading for Speech Perception
  by: Aller, Máté, et al.
  Published: (2022)
- Visual context due to speech-reading suppresses the auditory response to acoustic interruptions in speech
  by: Bhat, Jyoti, et al.
  Published: (2014)
- ERP data on auditory imagery of native and non-native English speech during silent reading
  by: Zhou, Peiyun, et al.
  Published: (2020)
- Cortical tracking of formant modulations derived from silently presented lip movements and its decline with age
  by: Suess, Nina, et al.
  Published: (2022)
- Perceptually relevant speech tracking in auditory and motor cortex reflects distinct linguistic features
  by: Keitel, Anne, et al.
  Published: (2018)