Adult neurogenesis acts as a neural regularizer
Main Authors: | Tran, Lina M.; Santoro, Adam; Liu, Lulu; Josselyn, Sheena A.; Richards, Blake A.; Frankland, Paul W. |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | National Academy of Sciences, 2022 |
Subjects: | Biological Sciences |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9659416/ https://www.ncbi.nlm.nih.gov/pubmed/36322739 http://dx.doi.org/10.1073/pnas.2206704119 |
author | Tran, Lina M.; Santoro, Adam; Liu, Lulu; Josselyn, Sheena A.; Richards, Blake A.; Frankland, Paul W. |
---|---|
collection | PubMed |
description | New neurons are continuously generated in the subgranular zone of the dentate gyrus throughout adulthood. These new neurons gradually integrate into hippocampal circuits, forming new naive synapses. Viewed from this perspective, these new neurons may represent a significant source of “wiring” noise in hippocampal networks. In machine learning, such noise injection is commonly used as a regularization technique. Regularization techniques help prevent overfitting training data and allow models to generalize learning to new, unseen data. Using a computational modeling approach, here we ask whether a neurogenesis-like process similarly acts as a regularizer, facilitating generalization in a category learning task. In a convolutional neural network (CNN) trained on the CIFAR-10 object recognition dataset, we modeled neurogenesis as a replacement/turnover mechanism, where weights for a randomly chosen small subset of hidden layer neurons were reinitialized to new values as the model learned to categorize 10 different classes of objects. We found that neurogenesis enhanced generalization on unseen test data compared to networks with no neurogenesis. Moreover, neurogenic networks either outperformed or performed similarly to networks with conventional noise injection (i.e., dropout, weight decay, and neural noise). These results suggest that neurogenesis can enhance generalization in hippocampal learning through noise injection, expanding on the roles that neurogenesis may have in cognition. |
format | Online Article Text |
id | pubmed-9659416 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2022 |
publisher | National Academy of Sciences |
record_format | MEDLINE/PubMed |
spelling | pubmed-9659416 2023-05-02 Adult neurogenesis acts as a neural regularizer. Tran, Lina M.; Santoro, Adam; Liu, Lulu; Josselyn, Sheena A.; Richards, Blake A.; Frankland, Paul W. Proc Natl Acad Sci U S A (Biological Sciences). National Academy of Sciences 2022-11-02 2022-11-08 /pmc/articles/PMC9659416/ /pubmed/36322739 http://dx.doi.org/10.1073/pnas.2206704119 Text en Copyright © 2022 the Author(s). Published by PNAS. This article is distributed under the Creative Commons Attribution-NonCommercial-NoDerivatives License 4.0 (CC BY-NC-ND) (https://creativecommons.org/licenses/by-nc-nd/4.0/). |
title | Adult neurogenesis acts as a neural regularizer |
topic | Biological Sciences |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9659416/ https://www.ncbi.nlm.nih.gov/pubmed/36322739 http://dx.doi.org/10.1073/pnas.2206704119 |
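The abstract above describes modeling neurogenesis as a replacement/turnover mechanism in a CNN, in which the weights of a small, randomly chosen subset of hidden-layer neurons are reinitialized while the network learns. The sketch below illustrates that idea only; it is not the authors' published code. The framework (PyTorch), the small architecture, the choice of a single fully connected hidden layer as the "neurogenic" layer, the 5% turnover rate, the once-per-epoch schedule, and the synthetic CIFAR-10-shaped data are all assumptions made for the example.

```python
# Minimal, illustrative sketch of neurogenesis-style regularization
# (an assumption-laden reconstruction, not the published implementation):
# after each training epoch, a small random subset of hidden units has its
# incoming and outgoing weights reinitialized, mimicking the replacement/
# turnover of dentate gyrus granule cells described in the abstract.

import torch
import torch.nn as nn
import torch.nn.functional as F


class SmallCNN(nn.Module):
    """Toy CNN with one fully connected hidden layer treated as 'neurogenic'."""

    def __init__(self, n_classes: int = 10, hidden: int = 256):
        super().__init__()
        self.conv1 = nn.Conv2d(3, 32, 3, padding=1)
        self.conv2 = nn.Conv2d(32, 64, 3, padding=1)
        self.fc_hidden = nn.Linear(64 * 8 * 8, hidden)  # hidden layer subject to turnover
        self.fc_out = nn.Linear(hidden, n_classes)

    def forward(self, x):
        x = F.max_pool2d(F.relu(self.conv1(x)), 2)  # -> 32 x 16 x 16
        x = F.max_pool2d(F.relu(self.conv2(x)), 2)  # -> 64 x 8 x 8
        x = x.flatten(1)
        x = F.relu(self.fc_hidden(x))
        return self.fc_out(x)


@torch.no_grad()
def neurogenesis_turnover(model: SmallCNN, turnover_rate: float = 0.05):
    """Reinitialize a random subset of hidden units, as if replaced by newborn neurons.

    Both the incoming weights/bias (fc_hidden) and the outgoing weights (fc_out)
    of the chosen units are reset to freshly initialized values. Optimizer
    momentum buffers are left untouched in this simplified sketch.
    """
    n_hidden = model.fc_hidden.out_features
    n_new = max(1, int(turnover_rate * n_hidden))
    idx = torch.randperm(n_hidden)[:n_new]

    # Draw fresh values from the layers' default initialization scheme.
    fresh_in = nn.Linear(model.fc_hidden.in_features, n_hidden)
    fresh_out = nn.Linear(n_hidden, model.fc_out.out_features)

    model.fc_hidden.weight[idx] = fresh_in.weight[idx]
    model.fc_hidden.bias[idx] = fresh_in.bias[idx]
    model.fc_out.weight[:, idx] = fresh_out.weight[:, idx]


if __name__ == "__main__":
    # Stand-in tensors with CIFAR-10 shapes (3x32x32 images, 10 classes), so the
    # sketch runs without downloading the real dataset used in the paper.
    images = torch.randn(512, 3, 32, 32)
    labels = torch.randint(0, 10, (512,))

    model = SmallCNN()
    optim = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)

    for epoch in range(5):
        for i in range(0, len(images), 64):
            x, y = images[i:i + 64], labels[i:i + 64]
            optim.zero_grad()
            loss = F.cross_entropy(model(x), y)
            loss.backward()
            optim.step()
        neurogenesis_turnover(model, turnover_rate=0.05)  # replace ~5% of hidden units
        print(f"epoch {epoch}: last-batch loss {loss.item():.3f}")
```

Read this way, turnover differs from dropout in that the replaced units' weights are permanently reset rather than transiently zeroed, which is the sense in which the abstract treats neurogenesis as an extra source of "wiring" noise that can regularize learning.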