
Efficient neural codes naturally emerge through gradient descent learning

Human sensory systems are more sensitive to common features in the environment than uncommon features. For example, small deviations from the more frequently encountered horizontal orientations can be more easily detected than small deviations from the less frequent diagonal ones. Here we find that artificial neural networks trained to recognize objects also have patterns of sensitivity that match the statistics of features in images. To interpret these findings, we show mathematically that learning with gradient descent in neural networks preferentially creates representations that are more sensitive to common features, a hallmark of efficient coding. This effect occurs in systems with otherwise unconstrained coding resources, and additionally when learning towards both supervised and unsupervised objectives. This result demonstrates that efficient codes can naturally emerge from gradient-like learning.
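
The abstract's central claim, that gradient descent itself tends to make learned representations more sensitive to frequent stimuli, can be illustrated with a small toy simulation. The sketch below is not the authors' code; the population encoding, the non-uniform stimulus prior, the network size, and the finite-difference sensitivity measure are all illustrative assumptions. It trains a tiny autoencoder with plain SGD on orientation-like stimuli drawn mostly from two "cardinal" values and then compares how strongly the hidden representation changes around common versus rare stimulus values; in runs of this toy setup one would expect larger sensitivity near the common values, mirroring the paper's qualitative claim.

# Minimal, illustrative sketch (not the authors' code) of efficient-code-like
# sensitivity emerging from gradient descent. All names and parameters here
# are assumptions chosen for illustration.
import math
import torch
import torch.nn as nn

torch.manual_seed(0)

def encode(theta):
    # Population encoding of an orientation-like variable theta in [0, pi):
    # responses of 32 broadly tuned units (von Mises-like tuning on 2*theta).
    centers = torch.linspace(0, math.pi, 32)
    return torch.exp(2.0 * torch.cos(2.0 * (theta.unsqueeze(-1) - centers)))

def sample_theta(n):
    # Non-uniform stimulus prior: most samples near the "cardinal" values
    # 0 and pi/2, a minority uniform over [0, pi).
    common = torch.randint(0, 2, (n,)) * (math.pi / 2) + 0.05 * torch.randn(n)
    rare = torch.rand(n) * math.pi
    use_common = torch.rand(n) < 0.8
    return torch.where(use_common, common, rare) % math.pi

# Small autoencoder trained with SGD on an unsupervised (reconstruction) loss.
net = nn.Sequential(nn.Linear(32, 8), nn.Tanh(), nn.Linear(8, 32))
opt = torch.optim.SGD(net.parameters(), lr=0.05)

for step in range(3000):
    x = encode(sample_theta(256))
    loss = ((net(x) - x) ** 2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()

def sensitivity(theta, eps=1e-3):
    # Norm of the change in the hidden representation per unit change in
    # theta, estimated by finite differences (a crude stand-in for Fisher
    # information about theta).
    hidden = net[1](net[0](encode(theta)))
    hidden_eps = net[1](net[0](encode(theta + eps)))
    return (hidden_eps - hidden).norm(dim=-1) / eps

with torch.no_grad():
    common = torch.tensor([0.0, math.pi / 2])               # frequent orientations
    rare = torch.tensor([math.pi / 4, 3 * math.pi / 4])     # infrequent ones
    print("sensitivity at common values:", sensitivity(common).mean().item())
    print("sensitivity at rare values:  ", sensitivity(rare).mean().item())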


Bibliographic Details
Main Authors: Benjamin, Ari S., Zhang, Ling-Qi, Qiu, Cheng, Stocker, Alan A., Kording, Konrad P.
Format: Online Article Text
Language: English
Published: Nat Commun, Nature Publishing Group UK, 2022-12-29
Subjects: Article
Collection: PubMed (record pubmed-9800366, National Center for Biotechnology Information)
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9800366/
https://www.ncbi.nlm.nih.gov/pubmed/36581618
http://dx.doi.org/10.1038/s41467-022-35659-7
Rights: © The Author(s) 2022. Open Access: this article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. Images or other third-party material in this article are included in the article's Creative Commons license unless indicated otherwise in a credit line to the material; if material is not included in the article's Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit https://creativecommons.org/licenses/by/4.0/.