Revisiting Chernoff Information with Likelihood Ratio Exponential Families
| Main author: | Nielsen, Frank |
|---|---|
| Format: | Online Article Text |
| Language: | English |
| Published: | MDPI, 2022 |
| Subjects: | Article |
| Online access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9601539/ https://www.ncbi.nlm.nih.gov/pubmed/37420420 http://dx.doi.org/10.3390/e24101400 |
Abstract: The Chernoff information between two probability measures is a statistical divergence that measures their deviation, defined as their maximally skewed Bhattacharyya distance. Although the Chernoff information was originally introduced to bound the Bayes error in statistical hypothesis testing, the divergence has since found many other applications, ranging from information fusion to quantum information, owing to its empirical robustness. From the viewpoint of information theory, the Chernoff information can also be interpreted as a minmax symmetrization of the Kullback–Leibler divergence. In this paper, we first revisit the Chernoff information between two densities of a measurable Lebesgue space by considering the exponential families induced by their geometric mixtures: the so-called likelihood ratio exponential families. Second, we show how to (i) solve exactly the Chernoff information between any two univariate Gaussian distributions, or obtain a closed-form formula using symbolic computing, (ii) derive a closed-form formula for the Chernoff information of centered Gaussians with scaled covariance matrices, and (iii) use a fast numerical scheme to approximate the Chernoff information between any two multivariate Gaussian distributions.
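
For reference, the central quantity named in the abstract is the maximally α-skewed Bhattacharyya distance; this is the standard definition and is consistent with the abstract's wording:

```latex
% Chernoff information as the maximally alpha-skewed Bhattacharyya distance:
C(P, Q) = \max_{\alpha \in (0,1)} D_{B,\alpha}(P, Q),
\qquad
D_{B,\alpha}(P, Q) = -\log \int p^{\alpha}(x)\, q^{1-\alpha}(x)\, \mathrm{d}\mu(x).
```

As a rough illustration of item (iii), the following is a minimal numerical sketch, not the paper's own scheme (which the abstract only names): for Gaussians, D_{B,α} admits the classical closed form from the Chernoff-bound literature, and since the Bhattacharyya coefficient is log-convex in α (by Hölder's inequality), α ↦ D_{B,α} is concave and a plain golden-section search finds the optimal skewing α*. Function names and the search method are illustrative choices, not taken from the paper.

```python
import numpy as np

def skewed_bhattacharyya_gaussian(mu1, S1, mu2, S2, alpha):
    """alpha-skewed Bhattacharyya distance -log int p1^alpha p2^(1-alpha) dx
    between N(mu1, S1) and N(mu2, S2), via the classical closed form.
    With exponent alpha on p1, the covariances mix with swapped weights."""
    Sbar = (1.0 - alpha) * S1 + alpha * S2
    dmu = mu2 - mu1
    quad = 0.5 * alpha * (1.0 - alpha) * dmu @ np.linalg.solve(Sbar, dmu)
    logdet = lambda M: np.linalg.slogdet(M)[1]
    return quad + 0.5 * (logdet(Sbar)
                         - (1.0 - alpha) * logdet(S1) - alpha * logdet(S2))

def chernoff_information_gaussian(mu1, S1, mu2, S2, tol=1e-9):
    """Chernoff information C = max over alpha in (0,1) of D_{B,alpha}.
    The objective is concave in alpha, so golden-section search suffices."""
    f = lambda a: skewed_bhattacharyya_gaussian(mu1, S1, mu2, S2, a)
    invphi = (np.sqrt(5.0) - 1.0) / 2.0      # 1/golden ratio, about 0.618
    lo, hi = 0.0, 1.0
    a, b = hi - invphi * (hi - lo), lo + invphi * (hi - lo)
    fa, fb = f(a), f(b)
    while hi - lo > tol:
        if fa < fb:                          # maximum lies in [a, hi]
            lo, a, fa = a, b, fb
            b = lo + invphi * (hi - lo)
            fb = f(b)
        else:                                # maximum lies in [lo, b]
            hi, b, fb = b, a, fa
            a = hi - invphi * (hi - lo)
            fa = f(a)
    alpha_star = 0.5 * (lo + hi)
    return alpha_star, f(alpha_star)

# Example: two bivariate Gaussians (hypothetical test values)
mu1, S1 = np.zeros(2), np.eye(2)
mu2, S2 = np.array([1.0, 0.0]), np.diag([2.0, 0.5])
alpha_star, C = chernoff_information_gaussian(mu1, S1, mu2, S2)
print(f"alpha* = {alpha_star:.6f}, Chernoff information = {C:.6f}")
```

A faster variant would root-find on the derivative, or on the equal-sided Kullback–Leibler condition that characterizes the optimal skewing for exponential families; the golden-section version above is merely the simplest correct baseline.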
| collection | PubMed |
|---|---|
| id | pubmed-9601539 |
| institution | National Center for Biotechnology Information |
| record format | MEDLINE/PubMed |
| journal | Entropy (Basel) |
| published online | 2022-10-01 |
| license | © 2022 by the author. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). |