Humans parsimoniously represent auditory sequences by pruning and completing the underlying network structure
Successive auditory inputs are rarely independent, their relationships ranging from local transitions between elements to hierarchical and nested representations. In many situations, humans retrieve these dependencies even from limited datasets. However, this learning at multiple scale levels is poorly understood.
Main Authors: | Benjamin, Lucas; Fló, Ana; Al Roumi, Fosca; Dehaene-Lambertz, Ghislaine |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | eLife Sciences Publications, Ltd, 2023 |
Subjects: | Neuroscience |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10241517/ https://www.ncbi.nlm.nih.gov/pubmed/37129367 http://dx.doi.org/10.7554/eLife.86430 |
_version_ | 1785054001506549760 |
author | Benjamin, Lucas; Fló, Ana; Al Roumi, Fosca; Dehaene-Lambertz, Ghislaine |
author_facet | Benjamin, Lucas; Fló, Ana; Al Roumi, Fosca; Dehaene-Lambertz, Ghislaine |
author_sort | Benjamin, Lucas |
collection | PubMed |
description | Successive auditory inputs are rarely independent, their relationships ranging from local transitions between elements to hierarchical and nested representations. In many situations, humans retrieve these dependencies even from limited datasets. However, this learning at multiple scale levels is poorly understood. Here, we used the formalism proposed by network science to study the representation of local and higher-order structures and their interaction in auditory sequences. We show that human adults exhibited biases in their perception of local transitions between elements, which made them sensitive to high-order network structures such as communities. This behavior is consistent with the creation of a parsimonious simplified model from the evidence they receive, achieved by pruning and completing relationships between network elements. This observation suggests that the brain does not rely on exact memories but on a parsimonious representation of the world. Moreover, this bias can be analytically modeled by a memory/efficiency trade-off. This model correctly accounts for previous findings, including local transition probabilities as well as high-order network structures, unifying sequence learning across scales. We finally propose putative brain implementations of such bias. |
format | Online Article Text |
id | pubmed-10241517 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2023 |
publisher | eLife Sciences Publications, Ltd |
record_format | MEDLINE/PubMed |
spelling | pubmed-10241517 2023-06-06 Humans parsimoniously represent auditory sequences by pruning and completing the underlying network structure Benjamin, Lucas; Fló, Ana; Al Roumi, Fosca; Dehaene-Lambertz, Ghislaine eLife Neuroscience eLife Sciences Publications, Ltd 2023-05-02 /pmc/articles/PMC10241517/ /pubmed/37129367 http://dx.doi.org/10.7554/eLife.86430 Text en © 2023, Benjamin et al. This article is distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use and redistribution provided that the original author and source are credited. |
spellingShingle | Neuroscience Benjamin, Lucas Fló, Ana Al Roumi, Fosca Dehaene-Lambertz, Ghislaine Humans parsimoniously represent auditory sequences by pruning and completing the underlying network structure |
title | Humans parsimoniously represent auditory sequences by pruning and completing the underlying network structure |
title_full | Humans parsimoniously represent auditory sequences by pruning and completing the underlying network structure |
title_fullStr | Humans parsimoniously represent auditory sequences by pruning and completing the underlying network structure |
title_full_unstemmed | Humans parsimoniously represent auditory sequences by pruning and completing the underlying network structure |
title_short | Humans parsimoniously represent auditory sequences by pruning and completing the underlying network structure |
title_sort | humans parsimoniously represent auditory sequences by pruning and completing the underlying network structure |
topic | Neuroscience |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10241517/ https://www.ncbi.nlm.nih.gov/pubmed/37129367 http://dx.doi.org/10.7554/eLife.86430 |
work_keys_str_mv | AT benjaminlucas humansparsimoniouslyrepresentauditorysequencesbypruningandcompletingtheunderlyingnetworkstructure AT floana humansparsimoniouslyrepresentauditorysequencesbypruningandcompletingtheunderlyingnetworkstructure AT alroumifosca humansparsimoniouslyrepresentauditorysequencesbypruningandcompletingtheunderlyingnetworkstructure AT dehaenelambertzghislaine humansparsimoniouslyrepresentauditorysequencesbypruningandcompletingtheunderlyingnetworkstructure |