The Cortical Representation of Language Timescales is Shared between Reading and Listening

Bibliographic Details
Main Authors: Chen, Catherine; Dupré la Tour, Tom; Gallant, Jack; Klein, Dan; Deniz, Fatma
Format: Online Article Text
Language: English
Published: Cold Spring Harbor Laboratory, 2023
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10418083/
https://www.ncbi.nlm.nih.gov/pubmed/37577530
http://dx.doi.org/10.1101/2023.01.06.522601
Description
Summary: Language comprehension involves integrating low-level sensory inputs into a hierarchy of increasingly high-level features. Prior work has studied brain representations of different levels of the language hierarchy, but has not determined whether integration pathways in the brain are shared for written and spoken language. To address this issue, we analyzed fMRI BOLD data recorded while participants read and listened to the same narratives in each modality. Levels of the language hierarchy were operationalized as timescales, where each timescale refers to a set of spectral components of a language stimulus. Voxelwise encoding models were used to determine where different timescales are represented across the cerebral cortex, for each modality separately. These models reveal that timescale representations are organized similarly across the cortical surface in both modalities. Our results suggest that, after low-level sensory processing, language integration proceeds similarly regardless of stimulus modality.