Hierarchical growth in neural networks structure: Organizing inputs by Order of Hierarchical Complexity
Main Authors: | Leite, Sofia; Mota, Bruno; Silva, António Ramos; Commons, Michael Lamport; Miller, Patrice Marie; Rodrigues, Pedro Pereira |
Format: | Online Article Text |
Language: | English |
Published: | Public Library of Science, 2023 |
Subjects: | |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10470958/ https://www.ncbi.nlm.nih.gov/pubmed/37651418 http://dx.doi.org/10.1371/journal.pone.0290743 |
_version_ | 1785099798840344576 |
author | Leite, Sofia; Mota, Bruno; Silva, António Ramos; Commons, Michael Lamport; Miller, Patrice Marie; Rodrigues, Pedro Pereira |
author_facet | Leite, Sofia; Mota, Bruno; Silva, António Ramos; Commons, Michael Lamport; Miller, Patrice Marie; Rodrigues, Pedro Pereira |
author_sort | Leite, Sofia |
collection | PubMed |
description | Several studies demonstrate that the structure of the brain increases in hierarchical complexity throughout development. We tested whether the structure of artificial neural networks also increases in hierarchical complexity while learning a developmental task called the balance beam problem. Previous simulations of this developmental task do not reflect a necessary premise underlying development: a more complex structure can be built out of less complex ones, while ensuring that the more complex structure does not replace the less complex ones. To address this requirement, we segregated the input set into subsets of increasing Orders of Hierarchical Complexity. This is a complexity measure that has been extensively shown to underlie the complexity of behavior and hypothesized to underlie the complexity of the neural structure of the brain. After segregating the input set, minimal neural network models were trained separately on each input subset, and models of adjacent complexity were analyzed sequentially to observe whether there was a structural progression. Three different network structural progressions emerged, all performing with similar accuracy, pointing towards self-organization. Moreover, more complex structures could be built out of less complex ones without substituting them, avoiding catastrophic forgetting and improving on the performance of previous models in the literature. Furthermore, the model structures trained on the two highest-complexity subsets performed better than existing simulations of the balance beam. As a major contribution, this work successfully modeled hierarchically complex structural growth in neural networks and is the first to segregate inputs by Order of Hierarchical Complexity. Since this measure applies to data from all domains, the present method can be used in future simulations, systematizing the simulation of developmental and evolutionary structural growth in neural networks. |
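The description outlines a training scheme: partition the inputs by complexity, train a minimal model on each subset, and let each higher-complexity model build on the frozen output of the lower one rather than overwrite it. A minimal sketch of that idea follows; the input encoding, the single logistic units, and all function names are hypothetical illustrations under simplified assumptions, not the authors' actual models.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_balance_beam(n, order):
    """Toy balance-beam items: weight and distance on each side of a fulcrum.
    At order 0 only the weights vary (distances fixed at 1); at order 1 both
    vary, so torque (weight x distance) decides which side tips."""
    w_l, w_r = rng.integers(1, 5, size=(2, n))
    if order == 0:
        d_l = d_r = np.ones(n)
    else:
        d_l, d_r = rng.integers(1, 5, size=(2, n))
    X = np.stack([w_l, d_l, w_r, d_r], axis=1).astype(float)
    y = (w_l * d_l > w_r * d_r).astype(float)  # 1 = left side tips down
    return X, y

def train_logistic(X, y, frozen_feats=None, epochs=2000, lr=0.1):
    """One minimal logistic unit. If frozen_feats is given, the output of a
    lower-complexity model is appended as a fixed input feature, so the new
    model builds on the old one without modifying its weights."""
    if frozen_feats is not None:
        X = np.hstack([X, frozen_feats])
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # sigmoid output
        grad = p - y                            # cross-entropy gradient
        w -= lr * X.T @ grad / len(y)
        b -= lr * grad.mean()
    return w, b

def accuracy(X, y, w, b):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    return ((p > 0.5) == y).mean()

# Stage 1: train a minimal model on the lowest-complexity subset.
X0, y0 = make_balance_beam(400, order=0)
w0, b0 = train_logistic(X0, y0)

# Stage 2: train on the next subset, feeding in the frozen stage-1 output
# as an extra feature instead of replacing the stage-1 model.
X1, y1 = make_balance_beam(400, order=1)
frozen = (X1 @ w0 + b0).reshape(-1, 1)  # stage-1 logit, never retrained
w1, b1 = train_logistic(X1, y1, frozen_feats=frozen)
```

Because stage 1's weights are held fixed, the lower-complexity behavior survives intact inside the stage-2 model, which is the forgetting-avoidance property the abstract emphasizes.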
format | Online Article Text |
id | pubmed-10470958 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2023 |
publisher | Public Library of Science |
record_format | MEDLINE/PubMed |
spelling | pubmed-104709582023-09-01 Hierarchical growth in neural networks structure: Organizing inputs by Order of Hierarchical Complexity Leite, Sofia; Mota, Bruno; Silva, António Ramos; Commons, Michael Lamport; Miller, Patrice Marie; Rodrigues, Pedro Pereira PLoS One Research Article Public Library of Science 2023-08-31 /pmc/articles/PMC10470958/ /pubmed/37651418 http://dx.doi.org/10.1371/journal.pone.0290743 Text en © 2023 Leite et al. This is an open access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited. |
spellingShingle | Research Article Leite, Sofia; Mota, Bruno; Silva, António Ramos; Commons, Michael Lamport; Miller, Patrice Marie; Rodrigues, Pedro Pereira Hierarchical growth in neural networks structure: Organizing inputs by Order of Hierarchical Complexity |
title | Hierarchical growth in neural networks structure: Organizing inputs by Order of Hierarchical Complexity |
title_full | Hierarchical growth in neural networks structure: Organizing inputs by Order of Hierarchical Complexity |
title_fullStr | Hierarchical growth in neural networks structure: Organizing inputs by Order of Hierarchical Complexity |
title_full_unstemmed | Hierarchical growth in neural networks structure: Organizing inputs by Order of Hierarchical Complexity |
title_short | Hierarchical growth in neural networks structure: Organizing inputs by Order of Hierarchical Complexity |
title_sort | hierarchical growth in neural networks structure: organizing inputs by order of hierarchical complexity |
topic | Research Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10470958/ https://www.ncbi.nlm.nih.gov/pubmed/37651418 http://dx.doi.org/10.1371/journal.pone.0290743 |
work_keys_str_mv | AT leitesofia hierarchicalgrowthinneuralnetworksstructureorganizinginputsbyorderofhierarchicalcomplexity AT motabruno hierarchicalgrowthinneuralnetworksstructureorganizinginputsbyorderofhierarchicalcomplexity AT silvaantonioramos hierarchicalgrowthinneuralnetworksstructureorganizinginputsbyorderofhierarchicalcomplexity AT commonsmichaellamport hierarchicalgrowthinneuralnetworksstructureorganizinginputsbyorderofhierarchicalcomplexity AT millerpatricemarie hierarchicalgrowthinneuralnetworksstructureorganizinginputsbyorderofhierarchicalcomplexity AT rodriguespedropereira hierarchicalgrowthinneuralnetworksstructureorganizinginputsbyorderofhierarchicalcomplexity |