Some Order Preserving Inequalities for Cross Entropy and Kullback–Leibler Divergence
Cross entropy and Kullback–Leibler (K-L) divergence are fundamental quantities of information theory, and they are widely used in many fields. Since cross entropy is the negated logarithm of likelihood, minimizing cross entropy is equivalent to maximizing likelihood, and thus, cross entropy is applied for optimization in machine learning. K-L divergence also stands independently as a commonly used metric for measuring the difference between two distributions. In this paper, we introduce new inequalities regarding cross entropy and K-L divergence by using the fact that cross entropy is the negated logarithm of the weighted geometric mean. We first apply the well-known rearrangement inequality, followed by a recent theorem on weighted Kolmogorov means, and, finally, we introduce a new theorem that directly applies to inequalities between K-L divergences. To illustrate our results, we show numerical examples of distributions.
Main Authors: | Sbert, Mateu; Chen, Min; Poch, Jordi; Bardera, Anton |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | MDPI 2018 |
Subjects: | Article |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7512543/ https://www.ncbi.nlm.nih.gov/pubmed/33266683 http://dx.doi.org/10.3390/e20120959 |
_version_ | 1783586182754467840 |
author | Sbert, Mateu Chen, Min Poch, Jordi Bardera, Anton |
author_facet | Sbert, Mateu Chen, Min Poch, Jordi Bardera, Anton |
author_sort | Sbert, Mateu |
collection | PubMed |
description | Cross entropy and Kullback–Leibler (K-L) divergence are fundamental quantities of information theory, and they are widely used in many fields. Since cross entropy is the negated logarithm of likelihood, minimizing cross entropy is equivalent to maximizing likelihood, and thus, cross entropy is applied for optimization in machine learning. K-L divergence also stands independently as a commonly used metric for measuring the difference between two distributions. In this paper, we introduce new inequalities regarding cross entropy and K-L divergence by using the fact that cross entropy is the negated logarithm of the weighted geometric mean. We first apply the well-known rearrangement inequality, followed by a recent theorem on weighted Kolmogorov means, and, finally, we introduce a new theorem that directly applies to inequalities between K-L divergences. To illustrate our results, we show numerical examples of distributions. |
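The identity the abstract builds on, that cross entropy H(p, q) = -Σᵢ pᵢ log qᵢ equals the negated logarithm of the weighted geometric mean of q with weights p, can be checked numerically. The sketch below is illustrative only; the function names and example distributions are not taken from the paper.

```python
import math

def cross_entropy(p, q):
    # H(p, q) = -sum_i p_i * log(q_i)
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q))

def neg_log_weighted_geometric_mean(p, q):
    # Weighted geometric mean of q with weights p: prod_i q_i ** p_i.
    # Its negated logarithm equals the cross entropy H(p, q).
    gm = math.prod(qi ** pi for pi, qi in zip(p, q))
    return -math.log(gm)

# Two example probability distributions over the same three outcomes.
p = [0.2, 0.3, 0.5]
q = [0.25, 0.25, 0.5]

# The two expressions agree up to floating-point error.
assert abs(cross_entropy(p, q) - neg_log_weighted_geometric_mean(p, q)) < 1e-12
```

Because -log is strictly decreasing, any inequality that orders weighted geometric means is immediately reversed into an inequality between cross entropies, which is the mechanism the paper exploits.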
format | Online Article Text |
id | pubmed-7512543 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2018 |
publisher | MDPI |
record_format | MEDLINE/PubMed |
spelling | pubmed-7512543 2020-11-09 Some Order Preserving Inequalities for Cross Entropy and Kullback–Leibler Divergence Sbert, Mateu Chen, Min Poch, Jordi Bardera, Anton Entropy (Basel) Article Cross entropy and Kullback–Leibler (K-L) divergence are fundamental quantities of information theory, and they are widely used in many fields. Since cross entropy is the negated logarithm of likelihood, minimizing cross entropy is equivalent to maximizing likelihood, and thus, cross entropy is applied for optimization in machine learning. K-L divergence also stands independently as a commonly used metric for measuring the difference between two distributions. In this paper, we introduce new inequalities regarding cross entropy and K-L divergence by using the fact that cross entropy is the negated logarithm of the weighted geometric mean. We first apply the well-known rearrangement inequality, followed by a recent theorem on weighted Kolmogorov means, and, finally, we introduce a new theorem that directly applies to inequalities between K-L divergences. To illustrate our results, we show numerical examples of distributions. MDPI 2018-12-12 /pmc/articles/PMC7512543/ /pubmed/33266683 http://dx.doi.org/10.3390/e20120959 Text en © 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/). |
spellingShingle | Article Sbert, Mateu Chen, Min Poch, Jordi Bardera, Anton Some Order Preserving Inequalities for Cross Entropy and Kullback–Leibler Divergence |
title | Some Order Preserving Inequalities for Cross Entropy and Kullback–Leibler Divergence |
title_full | Some Order Preserving Inequalities for Cross Entropy and Kullback–Leibler Divergence |
title_fullStr | Some Order Preserving Inequalities for Cross Entropy and Kullback–Leibler Divergence |
title_full_unstemmed | Some Order Preserving Inequalities for Cross Entropy and Kullback–Leibler Divergence |
title_short | Some Order Preserving Inequalities for Cross Entropy and Kullback–Leibler Divergence |
title_sort | some order preserving inequalities for cross entropy and kullback–leibler divergence |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7512543/ https://www.ncbi.nlm.nih.gov/pubmed/33266683 http://dx.doi.org/10.3390/e20120959 |
work_keys_str_mv | AT sbertmateu someorderpreservinginequalitiesforcrossentropyandkullbackleiblerdivergence AT chenmin someorderpreservinginequalitiesforcrossentropyandkullbackleiblerdivergence AT pochjordi someorderpreservinginequalitiesforcrossentropyandkullbackleiblerdivergence AT barderaanton someorderpreservinginequalitiesforcrossentropyandkullbackleiblerdivergence |