Entropy, Information, and the Updating of Probabilities
This paper is a review of a particular approach to the method of maximum entropy as a general framework for inference. The discussion emphasizes pragmatic elements in the derivation. An epistemic notion of information is defined in terms of its relation to the Bayesian beliefs of ideally rational agents. The method of updating from a prior to posterior probability distribution is designed through an eliminative induction process. The logarithmic relative entropy is singled out as a unique tool for updating (a) that is of universal applicability, (b) that recognizes the value of prior information, and (c) that recognizes the privileged role played by the notion of independence in science. The resulting framework—the ME method—can handle arbitrary priors and arbitrary constraints. It includes the MaxEnt and Bayes’ rules as special cases and, therefore, unifies entropic and Bayesian methods into a single general inference scheme. The ME method goes beyond the mere selection of a single posterior, and also addresses the question of how much less probable other distributions might be, which provides a direct bridge to the theories of fluctuations and large deviations.
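The abstract refers to the logarithmic relative entropy as the updating tool and to MaxEnt and Bayes' rule as special cases. As a point of reference only (the notation below, with prior q, posterior p, constraint function f with expected value F, multiplier λ, and normalization Z, is assumed here and not drawn from the record), the standard form of the functional and of the ME update under a single expectation constraint is:

```latex
% Logarithmic relative entropy of a candidate posterior p relative to the prior q
% (standard textbook form; the notation is assumed, not quoted from the paper).
S[p \mid q] = -\int \mathrm{d}x \, p(x) \ln\frac{p(x)}{q(x)}

% The ME update maximizes S[p|q] subject to normalization and to whatever
% constraints encode the new information, e.g. a single expectation value:
\int \mathrm{d}x \, p(x) = 1, \qquad \int \mathrm{d}x \, p(x)\, f(x) = F

% The maximizer has the exponential (Gibbs) form, with \lambda fixed by the
% constraint; it reduces to the classic MaxEnt result when the prior q is uniform:
p(x) = \frac{q(x)\, e^{-\lambda f(x)}}{Z(\lambda)}, \qquad
Z(\lambda) = \int \mathrm{d}x \, q(x)\, e^{-\lambda f(x)}
```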
| | |
|---|---|
| Main Author: | Caticha, Ariel |
| Format: | Online Article Text |
| Language: | English |
| Published: | MDPI 2021 |
| Subjects: | Article |
| Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8307993/ https://www.ncbi.nlm.nih.gov/pubmed/34356436 http://dx.doi.org/10.3390/e23070895 |
| | |
|---|---|
| _version_ | 1783728175514124288 |
author | Caticha, Ariel |
author_facet | Caticha, Ariel |
author_sort | Caticha, Ariel |
collection | PubMed |
description | This paper is a review of a particular approach to the method of maximum entropy as a general framework for inference. The discussion emphasizes pragmatic elements in the derivation. An epistemic notion of information is defined in terms of its relation to the Bayesian beliefs of ideally rational agents. The method of updating from a prior to posterior probability distribution is designed through an eliminative induction process. The logarithmic relative entropy is singled out as a unique tool for updating (a) that is of universal applicability, (b) that recognizes the value of prior information, and (c) that recognizes the privileged role played by the notion of independence in science. The resulting framework—the ME method—can handle arbitrary priors and arbitrary constraints. It includes the MaxEnt and Bayes’ rules as special cases and, therefore, unifies entropic and Bayesian methods into a single general inference scheme. The ME method goes beyond the mere selection of a single posterior, and also addresses the question of how much less probable other distributions might be, which provides a direct bridge to the theories of fluctuations and large deviations. |
format | Online Article Text |
id | pubmed-8307993 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2021 |
publisher | MDPI |
record_format | MEDLINE/PubMed |
spelling | pubmed-8307993 2021-07-25 Entropy, Information, and the Updating of Probabilities Caticha, Ariel Entropy (Basel) Article This paper is a review of a particular approach to the method of maximum entropy as a general framework for inference. The discussion emphasizes pragmatic elements in the derivation. An epistemic notion of information is defined in terms of its relation to the Bayesian beliefs of ideally rational agents. The method of updating from a prior to posterior probability distribution is designed through an eliminative induction process. The logarithmic relative entropy is singled out as a unique tool for updating (a) that is of universal applicability, (b) that recognizes the value of prior information, and (c) that recognizes the privileged role played by the notion of independence in science. The resulting framework—the ME method—can handle arbitrary priors and arbitrary constraints. It includes the MaxEnt and Bayes’ rules as special cases and, therefore, unifies entropic and Bayesian methods into a single general inference scheme. The ME method goes beyond the mere selection of a single posterior, and also addresses the question of how much less probable other distributions might be, which provides a direct bridge to the theories of fluctuations and large deviations. MDPI 2021-07-14 /pmc/articles/PMC8307993/ /pubmed/34356436 http://dx.doi.org/10.3390/e23070895 Text en © 2021 by the author. https://creativecommons.org/licenses/by/4.0/ Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). |
spellingShingle | Article Caticha, Ariel Entropy, Information, and the Updating of Probabilities |
title | Entropy, Information, and the Updating of Probabilities |
title_full | Entropy, Information, and the Updating of Probabilities |
title_fullStr | Entropy, Information, and the Updating of Probabilities |
title_full_unstemmed | Entropy, Information, and the Updating of Probabilities |
title_short | Entropy, Information, and the Updating of Probabilities |
title_sort | entropy, information, and the updating of probabilities |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8307993/ https://www.ncbi.nlm.nih.gov/pubmed/34356436 http://dx.doi.org/10.3390/e23070895 |
work_keys_str_mv | AT catichaariel entropyinformationandtheupdatingofprobabilities |