
Modeling of a Neural System Based on Statistical Mechanics


Bibliographic Details
Main Authors: Cho, Myoung Won; Choi, MooYoung
Format: Online Article Text
Language: English
Published: MDPI 2018
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7512410/
https://www.ncbi.nlm.nih.gov/pubmed/33266572
http://dx.doi.org/10.3390/e20110848
Description: The minimization of a free energy is often regarded as the key principle in understanding how the brain works and how the brain structure forms. In particular, a statistical-mechanics-based neural network model is expected to allow one to interpret many aspects of the neural firing and learning processes in terms of general concepts and mechanisms in statistical physics. Nevertheless, the definition of the free energy in a neural system is usually an intricate problem without an evident solution. After the pioneering work by Hopfield, several statistical-mechanics-based models have suggested a variety of definitions of the free energy or the entropy in a neural system. Among those, the recently proposed Feynman machine defines the free energy of a neural system via the Feynman path integral formulation with an explicit time variable. In this study, we first give a brief review of the previous relevant models, paying attention to their troublesome problems, and examine how the Feynman machine overcomes several vulnerable points in previous models and derives the firing or learning rule of a (biological) neural system as the extremum state of the free energy. Specifically, the model reveals that the biological learning mechanism known as spike-timing-dependent plasticity relates to the free-energy minimization principle. Computing and learning mechanisms in the Feynman machine are based on the exact spike timings of neurons, as in a biological neural system. We discuss the consequences of adopting an explicit time variable in modeling a neural system and of applying the free-energy minimization principle to understanding the phenomena in the brain.
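The energy-minimization picture invoked in the abstract can be made concrete with a standard Hopfield-network sketch (an illustration of the general principle, not code from the paper itself): asynchronous sign updates of a network with symmetric Hebbian weights and zero self-connections never increase the energy E = -1/2 Σ_ij w_ij s_i s_j, so the dynamics settles into an energy minimum, here the stored pattern.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 16
pattern = rng.choice([-1, 1], size=n)          # one stored pattern
W = np.outer(pattern, pattern).astype(float)   # Hebbian weights
np.fill_diagonal(W, 0.0)                       # no self-connections

def energy(s):
    # Hopfield energy E = -1/2 * s^T W s
    return -0.5 * s @ W @ s

state = pattern.copy()
flip = rng.choice(n, size=4, replace=False)
state[flip] *= -1                              # corrupt 4 of the 16 bits

energies = [energy(state)]
for _ in range(5):                             # asynchronous update sweeps
    for i in rng.permutation(n):
        state[i] = 1 if W[i] @ state >= 0 else -1
    energies.append(energy(state))

# The energy is non-increasing and the dynamics recovers the stored pattern.
assert all(e2 <= e1 for e1, e2 in zip(energies, energies[1:]))
assert np.array_equal(state, pattern)
```

The monotone energy decrease under asynchronous updates is the discrete analogue of the free-energy descent the abstract attributes to neural firing and learning dynamics; the Feynman machine generalizes this idea to trajectories with an explicit time variable rather than static configurations.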
Journal: Entropy (Basel)
Published online: 2018-11-05
© 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).