Some Order Preserving Inequalities for Cross Entropy and Kullback–Leibler Divergence
Cross entropy and Kullback–Leibler (K-L) divergence are fundamental quantities in information theory and are widely used in many fields. Since cross entropy is the negative logarithm of the likelihood, minimizing cross entropy is equivalent to maximizing likelihood; thus, cross entropy is appli...
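The equivalence stated in the abstract can be sketched with a short derivation. The notation below (model distribution q_θ, samples x_1, …, x_n, empirical distribution p̂) is illustrative and assumed here, not taken from the paper.

```latex
% Sketch under assumed notation: cross entropy of the empirical data
% distribution \hat{p} under a model q_\theta equals the average
% negative log-likelihood of the data.
\[
  H(\hat{p}, q_\theta)
    = -\sum_{x} \hat{p}(x)\,\log q_\theta(x)
    = -\frac{1}{n}\sum_{i=1}^{n} \log q_\theta(x_i)
    = -\frac{1}{n}\,\log \mathcal{L}(\theta),
  \qquad
  \mathcal{L}(\theta) = \prod_{i=1}^{n} q_\theta(x_i),
\]
% so minimizing H(\hat{p}, q_\theta) over \theta maximizes the likelihood.
% The connection to K-L divergence:
\[
  H(\hat{p}, q_\theta) = H(\hat{p}) + D_{\mathrm{KL}}(\hat{p}\,\|\,q_\theta),
\]
% where H(\hat{p}) does not depend on \theta, so the same minimizer also
% minimizes the K-L divergence from \hat{p} to q_\theta.
```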
| Main Authors: | |
|---|---|
| Format: | Online Article Text |
| Language: | English |
| Published: | MDPI, 2018 |
| Subjects: | |
| Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7512543/ https://www.ncbi.nlm.nih.gov/pubmed/33266683 http://dx.doi.org/10.3390/e20120959 |