
A Measure of Information Available for Inference

The mutual information between the state of a neural network and the state of the external world represents the amount of information stored in the neural network that is associated with the external world. In contrast, the surprise of the sensory input indicates the unpredictability of the current input; in other words, surprise is a measure of inference ability, and an upper bound on the surprise is known as the variational free energy. According to the free-energy principle (FEP), a neural network continuously minimizes the free energy to perceive the external world. For the survival of animals, inference ability is considered more important than simply memorized information. In this study, the free energy is shown to represent the gap between the amount of information stored in the neural network and the amount available for inference. This concept involves both the FEP and the infomax principle, and provides a useful measure for quantifying the amount of information available for inference.

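As a minimal sketch of the variational bound the abstract invokes, write s for the sensory input, x for the hidden states of the world, and q(x) for the network's approximate posterior (notation assumed here, not taken from the record). The free energy is then

\[
F = \mathbb{E}_{q(x)}\big[\ln q(x) - \ln p(s, x)\big]
  = -\ln p(s) + D_{\mathrm{KL}}\big[q(x)\,\|\,p(x \mid s)\big]
  \;\ge\; -\ln p(s),
\]

so F upper-bounds the surprise \(-\ln p(s)\), with equality when q(x) equals the true posterior p(x | s). Minimizing F thus tightens the bound while improving the inference, consistent with the abstract's reading of F as a gap between stored information and information available for inference.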

Bibliographic Details
Main Author: Isomura, Takuya
Format: Online Article Text
Language: English
Published: Entropy (Basel), MDPI, 7 July 2018
Subjects: Article
Collection: PubMed
License: © 2018 by the author. Licensee MDPI, Basel, Switzerland. Open access under the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7513032/
https://www.ncbi.nlm.nih.gov/pubmed/33265602
http://dx.doi.org/10.3390/e20070512