Fast Approximations of the Jeffreys Divergence between Univariate Gaussian Mixtures via Mixture Conversions to Exponential-Polynomial Distributions
The Jeffreys divergence is a renowned arithmetic symmetrization of the oriented Kullback–Leibler divergence broadly used in information sciences. Since the Jeffreys divergence between Gaussian mixture models is not available in closed-form, various techniques with advantages and disadvantages have bee...
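Since the abstract notes that the Jeffreys divergence between Gaussian mixtures has no closed form, a standard baseline (not the paper's method, which converts mixtures to exponential-polynomial distributions) is a Monte Carlo estimate of J(p, q) = KL(p‖q) + KL(q‖p). A minimal sketch for univariate mixtures, using only the Python standard library; all function names here are illustrative:

```python
import math
import random

def gmm_pdf(x, weights, mus, sigmas):
    """Density of a univariate Gaussian mixture at point x."""
    return sum(w * math.exp(-0.5 * ((x - m) / s) ** 2) / (s * math.sqrt(2 * math.pi))
               for w, m, s in zip(weights, mus, sigmas))

def gmm_sample(weights, mus, sigmas, rng):
    """Draw one sample: pick a component by weight, then sample its Gaussian."""
    i = rng.choices(range(len(weights)), weights=weights)[0]
    return rng.gauss(mus[i], sigmas[i])

def jeffreys_mc(p, q, n=100_000, seed=0):
    """Monte Carlo estimate of J(p, q) = KL(p||q) + KL(q||p).

    p and q are (weights, mus, sigmas) triples describing univariate
    Gaussian mixtures; each KL term is averaged over n samples.
    """
    rng = random.Random(seed)
    kl_pq = sum(math.log(gmm_pdf(x, *p) / gmm_pdf(x, *q))
                for x in (gmm_sample(*p, rng) for _ in range(n))) / n
    kl_qp = sum(math.log(gmm_pdf(x, *q) / gmm_pdf(x, *p))
                for x in (gmm_sample(*q, rng) for _ in range(n))) / n
    return kl_pq + kl_qp
```

The estimator is symmetric by construction and converges at the usual O(1/√n) Monte Carlo rate; the paper's contribution is precisely to replace such stochastic estimates with fast deterministic approximations.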
Main author: Nielsen, Frank
Format: Online Article Text
Language: English
Published: MDPI (2021)
Online access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8619509/ https://www.ncbi.nlm.nih.gov/pubmed/34828115 http://dx.doi.org/10.3390/e23111417
Similar items
- Limit theorems of polynomial approximation with exponential weights
  by: Ganzburg, Michael I
  Published: (2008)
- Beyond SNP heritability: Polygenicity and discoverability of phenotypes estimated with a univariate Gaussian mixture model
  by: Holland, Dominic, et al.
  Published: (2020)
- A Fast Incremental Gaussian Mixture Model
  by: Pinto, Rafael Coimbra, et al.
  Published: (2015)
- Semi-Parametric Estimation Using Bernstein Polynomial and a Finite Gaussian Mixture Model
  by: Helali, Salima, et al.
  Published: (2022)
- Correction: A Fast Incremental Gaussian Mixture Model
  by: Pinto, Rafael Coimbra, et al.
  Published: (2015)