On a Generalization of the Jensen–Shannon Divergence and the Jensen–Shannon Centroid
| Main Author: | |
|---|---|
| Format: | Online, Article, Text |
| Language: | English |
| Published: | MDPI, 2020 |
| Subjects: | |
| Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7516653/ https://www.ncbi.nlm.nih.gov/pubmed/33285995 http://dx.doi.org/10.3390/e22020221 |
Summary: The Jensen–Shannon divergence is a renowned bounded symmetrization of the Kullback–Leibler divergence which does not require probability densities to have matching supports. In this paper, we introduce a vector-skew generalization of the scalar α-Jensen–Bregman divergences and derive thereof the vector-skew α-Jensen–Shannon divergences. We prove that the vector-skew α-Jensen–Shannon divergences are f-divergences and study the properties of these novel divergences. Finally, we report an iterative algorithm to numerically compute the Jensen–Shannon-type centroids for a set of probability densities belonging to a mixture family; this includes the case of the Jensen–Shannon centroid of a set of categorical distributions or normalized histograms.
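As a concrete illustration of the quantity the summary describes (our sketch, not code from the paper; the helper names `kl` and `jsd` are arbitrary), the following minimal Python snippet computes the Jensen–Shannon divergence between two categorical distributions, and shows that it stays finite and bounded by log 2 even when the supports do not match:

```python
import numpy as np

def kl(p, q):
    """Kullback-Leibler divergence KL(p || q) for categorical
    distributions, with the convention 0 * log(0 / q) = 0."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def jsd(p, q):
    """Jensen-Shannon divergence: the average KL divergence to the
    mid-point mixture m = (p + q) / 2. It is finite even for
    mismatched supports, since m > 0 wherever p > 0 or q > 0."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    m = 0.5 * (p + q)
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# Example: mismatched supports still give a finite, bounded value.
p = np.array([0.5, 0.5, 0.0])
q = np.array([0.0, 0.5, 0.5])
print(jsd(p, q))  # 0.34657... = log(2) / 2 <= log(2)
```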
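The summary's closing sentence refers to an iterative algorithm for Jensen–Shannon-type centroids. The sketch below shows one way such an iteration can look for full-support normalized histograms: it repeatedly replaces the centroid by the normalized geometric mean of the midpoint mixtures, the update that falls out of the first-order optimality condition of the centroid objective. The function name `js_centroid`, the initialization, and the stopping rule are our assumptions for illustration, not necessarily the paper's exact procedure.

```python
import numpy as np

def js_centroid(ps, iters=1000, tol=1e-12):
    """Approximate argmin_c sum_i JSD(p_i, c) over the probability
    simplex, for full-support histograms p_i given as the rows of ps.
    The fixed-point update sets c proportional to the geometric mean
    of the midpoints m_i = (p_i + c) / 2, which is the stationarity
    condition of the Lagrangian of the centroid objective."""
    ps = np.asarray(ps, dtype=float)
    c = ps.mean(axis=0)                   # arithmetic mean as the start
    for _ in range(iters):
        mids = 0.5 * (ps + c)             # midpoint mixtures m_i
        new = np.exp(np.log(mids).mean(axis=0))
        new /= new.sum()                  # renormalize onto the simplex
        if np.max(np.abs(new - c)) < tol:
            return new
        c = new
    return c

# Example: centroid of three normalized histograms on three bins.
ps = [[0.7, 0.2, 0.1],
      [0.1, 0.8, 0.1],
      [0.3, 0.3, 0.4]]
print(js_centroid(ps))
```

One can check that this update is the concave-convex procedure step obtained by splitting the objective ∑_i JSD(p_i, c) into its convex part −(n/2)H(c) and its concave part ∑_i H((p_i + c)/2), where H denotes Shannon entropy, so the objective is non-increasing from one iteration to the next.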