Canonical neural networks perform active inference
This work considers a class of canonical neural networks comprising rate coding models, wherein neural activity and plasticity minimise a common cost function—and plasticity is modulated with a certain delay. We show that such neural networks implicitly perform active inference and learning to minimise the risk associated with future outcomes. Mathematical analyses demonstrate that this biological optimisation can be cast as maximisation of model evidence, or equivalently minimisation of variational free energy, under the well-known form of a partially observed Markov decision process model. This equivalence indicates that the delayed modulation of Hebbian plasticity—accompanied with adaptation of firing thresholds—is a sufficient neuronal substrate to attain Bayes optimal inference and control. We corroborated this proposition using numerical analyses of maze tasks. This theory offers a universal characterisation of canonical neural networks in terms of Bayesian belief updating and provides insight into the neuronal mechanisms underlying planning and adaptive behavioural control.
Main Authors: | Isomura, Takuya; Shimazaki, Hideaki; Friston, Karl J. |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | Nature Publishing Group UK, 2022 |
Subjects: | Article |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8760273/ https://www.ncbi.nlm.nih.gov/pubmed/35031656 http://dx.doi.org/10.1038/s42003-021-02994-2 |
_version_ | 1784633282528280576 |
---|---|
author | Isomura, Takuya Shimazaki, Hideaki Friston, Karl J. |
author_facet | Isomura, Takuya Shimazaki, Hideaki Friston, Karl J. |
author_sort | Isomura, Takuya |
collection | PubMed |
description | This work considers a class of canonical neural networks comprising rate coding models, wherein neural activity and plasticity minimise a common cost function—and plasticity is modulated with a certain delay. We show that such neural networks implicitly perform active inference and learning to minimise the risk associated with future outcomes. Mathematical analyses demonstrate that this biological optimisation can be cast as maximisation of model evidence, or equivalently minimisation of variational free energy, under the well-known form of a partially observed Markov decision process model. This equivalence indicates that the delayed modulation of Hebbian plasticity—accompanied with adaptation of firing thresholds—is a sufficient neuronal substrate to attain Bayes optimal inference and control. We corroborated this proposition using numerical analyses of maze tasks. This theory offers a universal characterisation of canonical neural networks in terms of Bayesian belief updating and provides insight into the neuronal mechanisms underlying planning and adaptive behavioural control. |
format | Online Article Text |
id | pubmed-8760273 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2022 |
publisher | Nature Publishing Group UK |
record_format | MEDLINE/PubMed |
spelling | pubmed-87602732022-01-26 Canonical neural networks perform active inference Isomura, Takuya Shimazaki, Hideaki Friston, Karl J. Commun Biol Article This work considers a class of canonical neural networks comprising rate coding models, wherein neural activity and plasticity minimise a common cost function—and plasticity is modulated with a certain delay. We show that such neural networks implicitly perform active inference and learning to minimise the risk associated with future outcomes. Mathematical analyses demonstrate that this biological optimisation can be cast as maximisation of model evidence, or equivalently minimisation of variational free energy, under the well-known form of a partially observed Markov decision process model. This equivalence indicates that the delayed modulation of Hebbian plasticity—accompanied with adaptation of firing thresholds—is a sufficient neuronal substrate to attain Bayes optimal inference and control. We corroborated this proposition using numerical analyses of maze tasks. This theory offers a universal characterisation of canonical neural networks in terms of Bayesian belief updating and provides insight into the neuronal mechanisms underlying planning and adaptive behavioural control. Nature Publishing Group UK 2022-01-14 /pmc/articles/PMC8760273/ /pubmed/35031656 http://dx.doi.org/10.1038/s42003-021-02994-2 Text en © The Author(s) 2022 https://creativecommons.org/licenses/by/4.0/Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/ (https://creativecommons.org/licenses/by/4.0/) . |
spellingShingle | Article Isomura, Takuya Shimazaki, Hideaki Friston, Karl J. Canonical neural networks perform active inference |
title | Canonical neural networks perform active inference |
title_full | Canonical neural networks perform active inference |
title_fullStr | Canonical neural networks perform active inference |
title_full_unstemmed | Canonical neural networks perform active inference |
title_short | Canonical neural networks perform active inference |
title_sort | canonical neural networks perform active inference |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8760273/ https://www.ncbi.nlm.nih.gov/pubmed/35031656 http://dx.doi.org/10.1038/s42003-021-02994-2 |
work_keys_str_mv | AT isomuratakuya canonicalneuralnetworksperformactiveinference AT shimazakihideaki canonicalneuralnetworksperformactiveinference AT fristonkarlj canonicalneuralnetworksperformactiveinference |
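The abstract above claims that rate-coded neural activity and delayed, modulated Hebbian plasticity both descend a single cost function that can be read as variational free energy under a partially observed Markov decision process. The short Python sketch below is not taken from the article; it only illustrates the three ingredients the abstract names (rate-coded inference, a Hebbian update applied after a delay under a modulatory signal, and firing-threshold adaptation) on a toy two-cause inference problem. The network size, learning rate, delay length, and the constant modulatory signal are all assumptions made for the example.

```python
# Illustrative sketch only (not the article's implementation): rate-coded inference,
# Hebbian plasticity modulated after a delay, and firing-threshold adaptation on a
# toy problem with two hidden causes and eight binary observations.
import numpy as np

rng = np.random.default_rng(0)
n_obs = 8                                       # assumed number of binary observations
A_true = np.array([[0.9] * 4 + [0.1] * 4,       # P(o_i = 1 | cause 0)
                   [0.1] * 4 + [0.9] * 4]).T    # P(o_i = 1 | cause 1), shape (n_obs, 2)

W = 0.1 * rng.standard_normal((2, n_obs))       # synaptic weights, one row per cause neuron
b = np.zeros(2)                                 # firing thresholds
eta, delay = 0.01, 5                            # assumed learning rate and modulation delay
trace = []                                      # (pre, post) pairs awaiting delayed modulation

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

for t in range(20000):
    s = rng.integers(0, 2)                                  # hidden cause (unknown to the network)
    o = (rng.random(n_obs) < A_true[:, s]).astype(float)    # binary observation vector

    x = softmax(W @ o + b)                                  # firing rates read as a belief over causes

    trace.append((o, x))
    if len(trace) > delay:                                  # plasticity is applied only after a delay
        o_past, x_past = trace.pop(0)
        m = 1.0                                             # stand-in modulatory signal (constant here)
        W += eta * m * np.outer(x_past, o_past - 0.5)       # delay-modulated Hebbian update
        b += eta * m * (x_past - softmax(b))                # threshold adaptation: softmax(b) tracks the average belief
```

With random initial weights, the soft competition induced by the softmax typically makes the two neurons selective for the two observation patterns, while the threshold vector comes to encode the average (prior) belief over causes; this is only meant to make the abstract's wording concrete, not to reproduce the paper's derivation or simulations.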