The Connection between Bayesian Inference and Information Theory for Model Selection, Information Gain and Experimental Design
We show a link between Bayesian inference and information theory that is useful for model selection, assessment of information entropy and experimental design. We align Bayesian model evidence (BME) with relative entropy and cross entropy in order to simplify computations using prior-based (Monte Carlo) or posterior-based (Markov chain Monte Carlo) BME estimates.
Main Authors: | Oladyshkin, Sergey; Nowak, Wolfgang |
Format: | Online Article Text |
Language: | English |
Published: | MDPI 2019 |
Subjects: | Article |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7514425/ http://dx.doi.org/10.3390/e21111081 |
_version_ | 1783586585182208000 |
author | Oladyshkin, Sergey Nowak, Wolfgang |
author_facet | Oladyshkin, Sergey Nowak, Wolfgang |
author_sort | Oladyshkin, Sergey |
collection | PubMed |
description | We show a link between Bayesian inference and information theory that is useful for model selection, assessment of information entropy and experimental design. We align Bayesian model evidence (BME) with relative entropy and cross entropy in order to simplify computations using prior-based (Monte Carlo) or posterior-based (Markov chain Monte Carlo) BME estimates. On the one hand, we demonstrate how Bayesian model selection can profit from information theory to estimate BME values via posterior-based techniques; to this end, we use various assumptions, including relations to several information criteria. On the other hand, we demonstrate how relative entropy can profit from BME to assess information entropy during Bayesian updating and to assess utility in Bayesian experimental design. Specifically, we emphasize that relative entropy can be computed from both prior-based and posterior-based sampling techniques while avoiding unnecessary multidimensional integration. Prior-based computation does not require any assumptions, whereas posterior-based estimates require at least one assumption. We illustrate the performance of the discussed estimates of BME, information entropy and experiment utility using a transparent, non-linear example. The multivariate Gaussian posterior estimate involves the fewest assumptions and shows the best performance for BME estimation, information entropy and experiment utility from posterior-based sampling. |
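The alignment of BME with cross entropy and relative entropy described in the abstract can be made concrete with a standard identity (the notation below is assumed here, not quoted from the paper): for a prior p(θ), likelihood p(y|θ) and posterior p(θ|y),

```latex
% Exact decomposition of the log Bayesian model evidence:
\ln \mathrm{BME} = \ln p(y)
  = \underbrace{\mathbb{E}_{p(\theta \mid y)}\!\big[\ln p(y \mid \theta)\big]}_{\text{posterior-averaged log-likelihood}}
  \;-\;
  \underbrace{D_{\mathrm{KL}}\!\big(p(\theta \mid y)\,\big\|\,p(\theta)\big)}_{\text{relative entropy (information gain)}} .
```

Hence relative entropy, i.e., the information gained during Bayesian updating, follows from BME and a posterior-averaged log-likelihood without any extra multidimensional integration, which is the simplification the abstract emphasizes.

As a minimal sketch of the prior-based (Monte Carlo) BME estimate mentioned in the abstract, the following uses a made-up conjugate Gaussian toy problem (all names and numbers are hypothetical) so the estimator can be checked against the analytical evidence:

```python
import numpy as np
from scipy.stats import norm

# Hypothetical toy problem: standard-normal prior on theta,
# Gaussian likelihood around a single observation y_obs.
rng = np.random.default_rng(0)
y_obs = 1.3    # made-up observation
sigma = 0.5    # assumed measurement-error standard deviation

def likelihood(theta):
    """p(y_obs | theta) under the Gaussian error model."""
    return norm.pdf(y_obs, loc=theta, scale=sigma)

# Prior-based Monte Carlo estimate: BME = E_prior[ p(y_obs | theta) ].
# As the abstract notes, this route needs no additional assumptions.
theta_prior = rng.standard_normal(100_000)
bme_mc = likelihood(theta_prior).mean()

# Analytical check: for this conjugate setup the evidence is
# the density of N(0, 1 + sigma^2) evaluated at y_obs.
bme_exact = norm.pdf(y_obs, loc=0.0, scale=np.sqrt(1.0 + sigma**2))
print(f"Monte Carlo BME: {bme_mc:.5f}, analytical BME: {bme_exact:.5f}")
```

A posterior-based estimate would instead sample θ from p(θ|y) (e.g., via Markov chain Monte Carlo) and, per the abstract, requires at least one assumption, such as the multivariate Gaussian posterior approximation the authors find to perform best.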
format | Online Article Text |
id | pubmed-7514425 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2019 |
publisher | MDPI |
record_format | MEDLINE/PubMed |
spelling | pubmed-7514425 2020-11-09 The Connection between Bayesian Inference and Information Theory for Model Selection, Information Gain and Experimental Design Oladyshkin, Sergey Nowak, Wolfgang Entropy (Basel) Article We show a link between Bayesian inference and information theory that is useful for model selection, assessment of information entropy and experimental design. We align Bayesian model evidence (BME) with relative entropy and cross entropy in order to simplify computations using prior-based (Monte Carlo) or posterior-based (Markov chain Monte Carlo) BME estimates. On the one hand, we demonstrate how Bayesian model selection can profit from information theory to estimate BME values via posterior-based techniques; to this end, we use various assumptions, including relations to several information criteria. On the other hand, we demonstrate how relative entropy can profit from BME to assess information entropy during Bayesian updating and to assess utility in Bayesian experimental design. Specifically, we emphasize that relative entropy can be computed from both prior-based and posterior-based sampling techniques while avoiding unnecessary multidimensional integration. Prior-based computation does not require any assumptions, whereas posterior-based estimates require at least one assumption. We illustrate the performance of the discussed estimates of BME, information entropy and experiment utility using a transparent, non-linear example. The multivariate Gaussian posterior estimate involves the fewest assumptions and shows the best performance for BME estimation, information entropy and experiment utility from posterior-based sampling. MDPI 2019-11-04 /pmc/articles/PMC7514425/ http://dx.doi.org/10.3390/e21111081 Text en © 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/). |
spellingShingle | Article Oladyshkin, Sergey Nowak, Wolfgang The Connection between Bayesian Inference and Information Theory for Model Selection, Information Gain and Experimental Design |
title | The Connection between Bayesian Inference and Information Theory for Model Selection, Information Gain and Experimental Design |
title_full | The Connection between Bayesian Inference and Information Theory for Model Selection, Information Gain and Experimental Design |
title_fullStr | The Connection between Bayesian Inference and Information Theory for Model Selection, Information Gain and Experimental Design |
title_full_unstemmed | The Connection between Bayesian Inference and Information Theory for Model Selection, Information Gain and Experimental Design |
title_short | The Connection between Bayesian Inference and Information Theory for Model Selection, Information Gain and Experimental Design |
title_sort | connection between bayesian inference and information theory for model selection, information gain and experimental design |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7514425/ http://dx.doi.org/10.3390/e21111081 |
work_keys_str_mv | AT oladyshkinsergey theconnectionbetweenbayesianinferenceandinformationtheoryformodelselectioninformationgainandexperimentaldesign AT nowakwolfgang theconnectionbetweenbayesianinferenceandinformationtheoryformodelselectioninformationgainandexperimentaldesign AT oladyshkinsergey connectionbetweenbayesianinferenceandinformationtheoryformodelselectioninformationgainandexperimentaldesign AT nowakwolfgang connectionbetweenbayesianinferenceandinformationtheoryformodelselectioninformationgainandexperimentaldesign |