Generalizing Information to the Evolution of Rational Belief
Information theory provides a mathematical foundation to measure uncertainty in belief. Belief is represented by a probability distribution that captures our understanding of an outcome’s plausibility. Information measures based on Shannon’s concept of entropy include realization information, Kullback–Leibler divergence, Lindley’s information in experiment, cross entropy, and mutual information. …
Main Authors: Duersch, Jed A.; Catanach, Thomas A.
Format: Online Article Text
Language: English
Published: MDPI, 2020
Subjects: Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7516412/ https://www.ncbi.nlm.nih.gov/pubmed/33285882 http://dx.doi.org/10.3390/e22010108
_version_ | 1783586995086295040
author | Duersch, Jed A.; Catanach, Thomas A.
author_facet | Duersch, Jed A.; Catanach, Thomas A.
author_sort | Duersch, Jed A. |
collection | PubMed |
description | Information theory provides a mathematical foundation to measure uncertainty in belief. Belief is represented by a probability distribution that captures our understanding of an outcome’s plausibility. Information measures based on Shannon’s concept of entropy include realization information, Kullback–Leibler divergence, Lindley’s information in experiment, cross entropy, and mutual information. We derive a general theory of information from first principles that accounts for evolving belief and recovers all of these measures. Rather than simply gauging uncertainty, information is understood in this theory to measure change in belief. We may then regard entropy as the information we expect to gain upon realization of a discrete latent random variable. This theory of information is compatible with the Bayesian paradigm in which rational belief is updated as evidence becomes available. Furthermore, this theory admits novel measures of information with well-defined properties, which we explored in both analysis and experiment. This view of information illuminates the study of machine learning by allowing us to quantify information captured by a predictive model and distinguish it from residual information contained in training data. We gain related insights regarding feature selection, anomaly detection, and novel Bayesian approaches. |
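The abstract frames entropy as the information we expect to gain upon realization of a discrete latent random variable, and Kullback–Leibler divergence as a measure of change in belief from a prior to a posterior. A minimal sketch of these two standard measures follows; the function names and example distributions are illustrative, not taken from the paper itself.

```python
import math

def self_information(p):
    """Information (in bits) realized when an outcome of probability p occurs."""
    return -math.log2(p)

def entropy(dist):
    """Expected information gain upon realization of a discrete random variable."""
    return sum(p * self_information(p) for p in dist if p > 0)

def kl_divergence(posterior, prior):
    """Information gained when belief evolves from `prior` to `posterior` (bits)."""
    return sum(q * math.log2(q / p) for q, p in zip(posterior, prior) if q > 0)

prior = [0.25, 0.25, 0.25, 0.25]   # uniform belief over four outcomes
posterior = [0.7, 0.1, 0.1, 0.1]   # belief after observing evidence

print(entropy(prior))               # 2.0 bits: maximal uncertainty over 4 outcomes
print(kl_divergence(posterior, prior))  # positive: belief has changed
```

Under this reading, entropy of the uniform prior is the information one expects a realization to deliver, while the KL term quantifies how much the observed evidence moved belief away from that prior.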
format | Online Article Text |
id | pubmed-7516412 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2020 |
publisher | MDPI |
record_format | MEDLINE/PubMed |
spelling | pubmed-75164122020-11-09 Generalizing Information to the Evolution of Rational Belief Duersch, Jed A. Catanach, Thomas A. Entropy (Basel) Article Information theory provides a mathematical foundation to measure uncertainty in belief. Belief is represented by a probability distribution that captures our understanding of an outcome’s plausibility. Information measures based on Shannon’s concept of entropy include realization information, Kullback–Leibler divergence, Lindley’s information in experiment, cross entropy, and mutual information. We derive a general theory of information from first principles that accounts for evolving belief and recovers all of these measures. Rather than simply gauging uncertainty, information is understood in this theory to measure change in belief. We may then regard entropy as the information we expect to gain upon realization of a discrete latent random variable. This theory of information is compatible with the Bayesian paradigm in which rational belief is updated as evidence becomes available. Furthermore, this theory admits novel measures of information with well-defined properties, which we explored in both analysis and experiment. This view of information illuminates the study of machine learning by allowing us to quantify information captured by a predictive model and distinguish it from residual information contained in training data. We gain related insights regarding feature selection, anomaly detection, and novel Bayesian approaches. MDPI 2020-01-16 /pmc/articles/PMC7516412/ /pubmed/33285882 http://dx.doi.org/10.3390/e22010108 Text en © 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/). |
spellingShingle | Article Duersch, Jed A. Catanach, Thomas A. Generalizing Information to the Evolution of Rational Belief |
title | Generalizing Information to the Evolution of Rational Belief |
title_full | Generalizing Information to the Evolution of Rational Belief |
title_fullStr | Generalizing Information to the Evolution of Rational Belief |
title_full_unstemmed | Generalizing Information to the Evolution of Rational Belief |
title_short | Generalizing Information to the Evolution of Rational Belief |
title_sort | generalizing information to the evolution of rational belief |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7516412/ https://www.ncbi.nlm.nih.gov/pubmed/33285882 http://dx.doi.org/10.3390/e22010108 |
work_keys_str_mv | AT duerschjeda generalizinginformationtotheevolutionofrationalbelief AT catanachthomasa generalizinginformationtotheevolutionofrationalbelief |