Some Order Preserving Inequalities for Cross Entropy and Kullback–Leibler Divergence
Cross entropy and Kullback–Leibler (K-L) divergence are fundamental quantities of information theory, and they are widely used in many fields. Since cross entropy is the negated logarithm of likelihood, minimizing cross entropy is equivalent to maximizing likelihood, and thus, cross entropy is appli...
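The relation stated in the abstract — that minimizing cross entropy is equivalent to maximizing likelihood — rests on the standard identity H(p, q) = H(p) + D_KL(p ‖ q). A minimal sketch of both quantities for discrete distributions (the example distributions are illustrative, not taken from the paper):

```python
import math

def entropy(p):
    """Shannon entropy H(p) = -sum p_i * log p_i (in nats)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def cross_entropy(p, q):
    """Cross entropy H(p, q) = -sum p_i * log q_i."""
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q) if pi > 0)

def kl_divergence(p, q):
    """Kullback-Leibler divergence D_KL(p || q) = sum p_i * log(p_i / q_i)."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Illustrative distributions (assumed for this sketch)
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]

# Identity: H(p, q) = H(p) + D_KL(p || q), so for fixed p,
# minimizing cross entropy over q also minimizes the divergence.
assert abs(cross_entropy(p, q) - (entropy(p) + kl_divergence(p, q))) < 1e-12
```

Since H(p) does not depend on q, the model q that minimizes H(p, q) is the one that minimizes D_KL(p ‖ q), which is why cross-entropy loss and maximum likelihood coincide.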
Main Authors: Sbert, Mateu; Chen, Min; Poch, Jordi; Bardera, Anton
Format: Online Article Text
Language: English
Published: MDPI, 2018
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7512543/ https://www.ncbi.nlm.nih.gov/pubmed/33266683 http://dx.doi.org/10.3390/e20120959
Similar Items
- Kullback–Leibler divergence and the Pareto–Exponential approximation
  by: Weinberg, G. V.
  Published: (2016)
- Computation of Kullback–Leibler Divergence in Bayesian Networks
  by: Moral, Serafín, et al.
  Published: (2021)
- Kullback Leibler divergence in complete bacterial and phage genomes
  by: Akhter, Sajia, et al.
  Published: (2017)
- Kullback–Leibler Divergence of a Freely Cooling Granular Gas
  by: Megías, Alberto, et al.
  Published: (2020)
- A data assimilation framework that uses the Kullback-Leibler divergence
  by: Pimentel, Sam, et al.
  Published: (2021)