The Cortical Representation of Language Timescales is Shared between Reading and Listening
Language comprehension involves integrating low-level sensory inputs into a hierarchy of increasingly high-level features. Prior work studied brain representations of different levels of the language hierarchy, but has not determined whether integration pathways in the brain are shared for written and spoken language. To address this issue, we analyzed fMRI BOLD data recorded while participants read and listened to the same narratives in each modality. Levels of the language hierarchy were operationalized as timescales, where each timescale refers to a set of spectral components of a language stimulus. Voxelwise encoding models were used to determine where different timescales are represented across the cerebral cortex, for each modality separately. These models reveal that between the two modalities timescale representations are organized similarly across the cortical surface. Our results suggest that, after low-level sensory processing, language integration proceeds similarly regardless of stimulus modality.
Main Authors: Chen, Catherine; Dupré la Tour, Tom; Gallant, Jack; Klein, Dan; Deniz, Fatma
Format: Online Article Text
Language: English
Published: Cold Spring Harbor Laboratory, 2023
Subjects: Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10418083/ https://www.ncbi.nlm.nih.gov/pubmed/37577530 http://dx.doi.org/10.1101/2023.01.06.522601
_version_ | 1785088191819153408 |
author | Chen, Catherine; Dupré la Tour, Tom; Gallant, Jack; Klein, Dan; Deniz, Fatma |
author_facet | Chen, Catherine; Dupré la Tour, Tom; Gallant, Jack; Klein, Dan; Deniz, Fatma |
author_sort | Chen, Catherine |
collection | PubMed |
description | Language comprehension involves integrating low-level sensory inputs into a hierarchy of increasingly high-level features. Prior work studied brain representations of different levels of the language hierarchy, but has not determined whether integration pathways in the brain are shared for written and spoken language. To address this issue, we analyzed fMRI BOLD data recorded while participants read and listened to the same narratives in each modality. Levels of the language hierarchy were operationalized as timescales, where each timescale refers to a set of spectral components of a language stimulus. Voxelwise encoding models were used to determine where different timescales are represented across the cerebral cortex, for each modality separately. These models reveal that between the two modalities timescale representations are organized similarly across the cortical surface. Our results suggest that, after low-level sensory processing, language integration proceeds similarly regardless of stimulus modality. |
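To make the method in the abstract concrete, the following is a minimal sketch of a voxelwise encoding model. It uses ridge regression to map stimulus features to simulated BOLD responses and evaluates prediction accuracy on held-out data per voxel. All array shapes, the ridge penalty, the train/test split, and the simulated data are illustrative assumptions, not the authors' actual pipeline.

```python
import numpy as np

# Simulated dimensions: fMRI time points (TRs), stimulus features, voxels.
# These sizes are arbitrary choices for illustration.
rng = np.random.default_rng(0)
n_trs, n_features, n_voxels = 300, 20, 50

# Stimulus feature matrix (one row per TR) and simulated BOLD responses
# generated from known weights plus noise.
X = rng.standard_normal((n_trs, n_features))
true_w = rng.standard_normal((n_features, n_voxels))
Y = X @ true_w + 0.5 * rng.standard_normal((n_trs, n_voxels))

# Hold out the final TRs as a test set.
X_tr, X_te = X[:240], X[240:]
Y_tr, Y_te = Y[:240], Y[240:]

# Ridge regression solved in closed form for all voxels at once:
# w = (X'X + alpha*I)^{-1} X'Y
alpha = 10.0
w = np.linalg.solve(X_tr.T @ X_tr + alpha * np.eye(n_features), X_tr.T @ Y_tr)

# Encoding-model performance: correlation between predicted and
# actual held-out BOLD, computed separately for each voxel.
pred = X_te @ w
r = np.array([np.corrcoef(pred[:, v], Y_te[:, v])[0, 1] for v in range(n_voxels)])
print(f"mean held-out correlation across voxels: {r.mean():.2f}")
```

In the study, such per-voxel prediction accuracies (fit separately for reading and listening, with timescale-specific feature spaces) are what get compared across the cortical surface.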
format | Online Article Text |
id | pubmed-10418083 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2023 |
publisher | Cold Spring Harbor Laboratory |
record_format | MEDLINE/PubMed |
spelling | pubmed-10418083 2023-08-12 The Cortical Representation of Language Timescales is Shared between Reading and Listening Chen, Catherine; Dupré la Tour, Tom; Gallant, Jack; Klein, Dan; Deniz, Fatma bioRxiv Article Language comprehension involves integrating low-level sensory inputs into a hierarchy of increasingly high-level features. Prior work studied brain representations of different levels of the language hierarchy, but has not determined whether integration pathways in the brain are shared for written and spoken language. To address this issue, we analyzed fMRI BOLD data recorded while participants read and listened to the same narratives in each modality. Levels of the language hierarchy were operationalized as timescales, where each timescale refers to a set of spectral components of a language stimulus. Voxelwise encoding models were used to determine where different timescales are represented across the cerebral cortex, for each modality separately. These models reveal that between the two modalities timescale representations are organized similarly across the cortical surface. Our results suggest that, after low-level sensory processing, language integration proceeds similarly regardless of stimulus modality. Cold Spring Harbor Laboratory 2023-08-04 /pmc/articles/PMC10418083/ /pubmed/37577530 http://dx.doi.org/10.1101/2023.01.06.522601 Text en https://creativecommons.org/licenses/by-nc-nd/4.0/ This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License (https://creativecommons.org/licenses/by-nc-nd/4.0/), which allows reusers to copy and distribute the material in any medium or format in unadapted form only, for noncommercial purposes only, and only so long as attribution is given to the creator. |
spellingShingle | Article Chen, Catherine Dupré la Tour, Tom Gallant, Jack Klein, Dan Deniz, Fatma The Cortical Representation of Language Timescales is Shared between Reading and Listening |
title | The Cortical Representation of Language Timescales is Shared between Reading and Listening |
title_full | The Cortical Representation of Language Timescales is Shared between Reading and Listening |
title_fullStr | The Cortical Representation of Language Timescales is Shared between Reading and Listening |
title_full_unstemmed | The Cortical Representation of Language Timescales is Shared between Reading and Listening |
title_short | The Cortical Representation of Language Timescales is Shared between Reading and Listening |
title_sort | cortical representation of language timescales is shared between reading and listening |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10418083/ https://www.ncbi.nlm.nih.gov/pubmed/37577530 http://dx.doi.org/10.1101/2023.01.06.522601 |
work_keys_str_mv | AT chencatherine thecorticalrepresentationoflanguagetimescalesissharedbetweenreadingandlistening AT duprelatourtom thecorticalrepresentationoflanguagetimescalesissharedbetweenreadingandlistening AT gallantjack thecorticalrepresentationoflanguagetimescalesissharedbetweenreadingandlistening AT kleindan thecorticalrepresentationoflanguagetimescalesissharedbetweenreadingandlistening AT denizfatma thecorticalrepresentationoflanguagetimescalesissharedbetweenreadingandlistening AT chencatherine corticalrepresentationoflanguagetimescalesissharedbetweenreadingandlistening AT duprelatourtom corticalrepresentationoflanguagetimescalesissharedbetweenreadingandlistening AT gallantjack corticalrepresentationoflanguagetimescalesissharedbetweenreadingandlistening AT kleindan corticalrepresentationoflanguagetimescalesissharedbetweenreadingandlistening AT denizfatma corticalrepresentationoflanguagetimescalesissharedbetweenreadingandlistening |