Revisiting Chernoff Information with Likelihood Ratio Exponential Families
The Chernoff information between two probability measures is a statistical divergence measuring their deviation, defined as their maximally skewed Bhattacharyya distance. Although the Chernoff information was originally introduced for bounding the Bayes error in statistical hypothesis testing, the di...
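For context, the "maximally skewed Bhattacharyya distance" in the abstract refers to the standard definitions from the literature (the notation below is the usual one, not quoted from this record):

$$D_{B,\alpha}(p,q) = -\log \int p(x)^{\alpha}\, q(x)^{1-\alpha}\, \mathrm{d}\mu(x), \qquad C(p,q) = \max_{\alpha \in (0,1)} D_{B,\alpha}(p,q),$$

where $D_{B,\alpha}$ is the $\alpha$-skewed Bhattacharyya distance and $C(p,q)$ is the Chernoff information; the maximizing $\alpha^{*}$ is called the Chernoff exponent.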
Main author: | Nielsen, Frank
---|---
Format: | Online Article Text
Language: | English
Published: | MDPI, 2022
Subjects: |
Online access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9601539/ https://www.ncbi.nlm.nih.gov/pubmed/37420420 http://dx.doi.org/10.3390/e24101400
Similar items

- Recent advances in statistics: papers in honor of Herman Chernoff on his sixtieth birthday
  by: Rizvi, M Haseeb, et al.
  Published: (1983)
- Information and exponential families in statistical theory
  by: Barndorff-Nielsen, O
  Published: (2014)
- Microarray background correction: maximum likelihood estimation for the normal–exponential convolution
  by: Silver, Jeremy D., et al.
  Published: (2009)
- Information Geometric Duality of ϕ-Deformed Exponential Families
  by: Korbel, Jan, et al.
  Published: (2019)
- Inference under unequal probability sampling with the Bayesian exponentially tilted empirical likelihood
  by: Yiu, A., et al.
  Published: (2020)