
Metrical Approach to Measuring Uncertainty


Bibliographic Details
Main Authors: Bronevich, Andrey G., Rozenberg, Igor N.
Format: Online Article Text
Language: English
Published: 2020
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7274748/
http://dx.doi.org/10.1007/978-3-030-50143-3_10
Description
Summary: Many uncertainty measures can be generated by corresponding divergences, just as the Kullback-Leibler divergence generates the Shannon entropy. Divergences can evaluate the information gain obtained by knowing a posterior probability distribution w.r.t. a prior one, or the contradiction between them. Divergences can also be viewed as distances between probability distributions. In this paper, we consider divergences that satisfy a weak system of axioms. This system of axioms does not guarantee additivity of divergences and allows us to consider, for example, the [Formula: see text]-metric on probability measures as a divergence. We show what kind of uncertainty measures can be generated by such divergences, and how these uncertainty measures can be extended to credal sets.
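
As an illustrative sketch only (the notation p, u, n, and H below is not taken from the record and need not match the paper's construction), the sense in which the Kullback-Leibler divergence generates the Shannon entropy can be made concrete by taking the uniform distribution u, with u_i = 1/n, as the prior over n outcomes:

    D_{\mathrm{KL}}(p \,\|\, u) \;=\; \sum_{i=1}^{n} p_i \log \frac{p_i}{u_i} \;=\; \log n - H(p),
    \qquad\text{hence}\qquad
    H(p) \;=\; \log n - D_{\mathrm{KL}}(p \,\|\, u),

where H(p) = -\sum_{i=1}^{n} p_i \log p_i. On this reading, the entropy of p is the maximal possible information gain, log n, minus the divergence of p from the least informative prior.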