Principles of Bayesian Inference Using General Divergence Criteria

When it is acknowledged that all candidate parameterised statistical models are misspecified relative to the data generating process, the decision maker (DM) must currently concern themselves with inference for the parameter value minimising the Kullback–Leibler (KL) divergence between the model and this process (Walker, 2013). However, it has long been known that minimising the KL-divergence places a large weight on correctly capturing the tails of the sample distribution. As a result, the DM is required to worry about the robustness of their model to tail misspecifications if they want to conduct principled inference. In this paper we alleviate these concerns for the DM. We advance recent methodological developments in general Bayesian updating (Bissiri, Holmes & Walker, 2016) to propose a statistically well-principled Bayesian updating of beliefs targeting the minimisation of more general divergence criteria. We improve both the motivation and the statistical foundations of existing Bayesian minimum divergence estimation (Hooker & Vidyashankar, 2014; Ghosh & Basu, 2016), allowing the well-principled Bayesian to target predictions from the model that are close to the genuine model in terms of some divergence measure other than the KL-divergence. Our principled formulation allows us to consider a broader range of divergences than has previously been considered. In fact, we argue that defining the divergence measure forms an important, subjective part of any statistical analysis, and we aim to provide a decision-theoretic rationale for this selection. We illustrate how targeting alternative divergence measures can affect the conclusions of simple inference tasks, and then discuss how our methods might apply to more complicated, high-dimensional models.
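As a minimal sketch of the machinery the abstract points to, the general Bayesian update of Bissiri, Holmes & Walker (2016) replaces the log-likelihood with a loss tied to the chosen divergence; the notation below is illustrative and not quoted from the paper itself:

    \pi(\theta \mid x_{1:n}) \propto \pi(\theta) \exp\Big\{ -\sum_{i=1}^{n} \ell(\theta, x_i) \Big\}

Choosing \ell(\theta, x) = -\log f(x; \theta) recovers standard Bayesian updating, which targets the KL-divergence. As one example of an alternative target, the density power (beta-) divergence loss used by Ghosh & Basu (2016),

    \ell^{(\beta)}(\theta, x) = -\tfrac{1}{\beta} f(x; \theta)^{\beta} + \tfrac{1}{\beta + 1} \int f(z; \theta)^{\beta + 1} \, dz, \qquad \beta > 0,

places less weight on observations assigned low density by the model, so the resulting posterior is less sensitive to tail misspecification than the KL-targeting posterior.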

Bibliographic Details
Main Authors: Jewson, Jack; Smith, Jim Q.; Holmes, Chris
Format: Online Article (Text)
Language: English
Published in: Entropy (Basel)
Published: MDPI, 6 June 2018
Subjects: Article
Collection: PubMed (National Center for Biotechnology Information), record pubmed-7512964
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7512964/
https://www.ncbi.nlm.nih.gov/pubmed/33265532
http://dx.doi.org/10.3390/e20060442
License: © 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).