
Computation of Kullback–Leibler Divergence in Bayesian Networks

Kullback–Leibler divergence $D_{KL}(p \| q)$ is the standard measure of error when a true probability distribution p is approximated by a probability distribution q. Its efficient computation is essential in many tasks, such as approximate computation, or as a measure of error when learning a probability distribution. For high-dimensional distributions, such as those associated with Bayesian networks, direct computation can be infeasible. This paper considers the problem of efficiently computing the Kullback–Leibler divergence of two probability distributions, each coming from a different Bayesian network, possibly with different structures. The approach is based on an auxiliary deletion algorithm that computes the necessary marginal distributions, using a cache of operations with potentials in order to reuse past computations whenever they are needed. The algorithms are tested with Bayesian networks from the bnlearn repository. Python code is provided, built on pgmpy, a library for working with probabilistic graphical models.
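For reference, the divergence appears in the source record only as a "[Formula: see text]" placeholder; the standard definition it stands for, for discrete distributions over joint configurations x, is:

```latex
D_{KL}(p \,\|\, q) \;=\; \sum_{x} p(x)\, \log \frac{p(x)}{q(x)}
```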


Bibliographic Details
Main Authors: Moral, Serafín; Cano, Andrés; Gómez-Olmedo, Manuel
Format: Online Article Text
Language: English
Published: MDPI 2021
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8466032/
https://www.ncbi.nlm.nih.gov/pubmed/34573747
http://dx.doi.org/10.3390/e23091122
_version_ 1784573028058791936
author Moral, Serafín
Cano, Andrés
Gómez-Olmedo, Manuel
author_facet Moral, Serafín
Cano, Andrés
Gómez-Olmedo, Manuel
author_sort Moral, Serafín
collection PubMed
description Kullback–Leibler divergence $D_{KL}(p \| q)$ is the standard measure of error when a true probability distribution p is approximated by a probability distribution q. Its efficient computation is essential in many tasks, such as approximate computation, or as a measure of error when learning a probability distribution. For high-dimensional distributions, such as those associated with Bayesian networks, direct computation can be infeasible. This paper considers the problem of efficiently computing the Kullback–Leibler divergence of two probability distributions, each coming from a different Bayesian network, possibly with different structures. The approach is based on an auxiliary deletion algorithm that computes the necessary marginal distributions, using a cache of operations with potentials in order to reuse past computations whenever they are needed. The algorithms are tested with Bayesian networks from the bnlearn repository. Python code is provided, built on pgmpy, a library for working with probabilistic graphical models.
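To make the "direct computation can be infeasible" point concrete, here is a minimal, self-contained sketch in plain Python (illustrative only; the paper's actual code is built on pgmpy, and the two toy networks and all names below are hypothetical). It computes KL(p, q) between two small binary Bayesian networks by enumerating every joint configuration, a computation whose cost grows as 2^n in the number of nodes, which is exactly what the paper's deletion-and-cache algorithm avoids.

```python
# Direct (brute-force) KL divergence between two toy Bayesian networks.
# Plain Python, not the paper's pgmpy-based code; CPTs are hypothetical.
from itertools import product
from math import log

VARS = ["A", "B"]  # topological order shared by both networks

# Network p: A -> B. cpts[var] maps a tuple of parent values to a
# distribution over var's values (0 and 1).
parents_p = {"A": (), "B": ("A",)}
cpts_p = {
    "A": {(): [0.6, 0.4]},
    "B": {(0,): [0.7, 0.3], (1,): [0.2, 0.8]},
}

# Network q: same variables but a different structure (B has no parents).
parents_q = {"A": (), "B": ()}
cpts_q = {
    "A": {(): [0.5, 0.5]},
    "B": {(): [0.5, 0.5]},
}

def joint(cpts, parents, x):
    """Probability of the full assignment x: product of local CPT entries."""
    prob = 1.0
    for var in VARS:
        pa_vals = tuple(x[pa] for pa in parents[var])
        prob *= cpts[var][pa_vals][x[var]]
    return prob

def kl_direct(cpts_p, parents_p, cpts_q, parents_q):
    """KL(p || q) by exhaustive enumeration: 2**n terms for n binary nodes."""
    total = 0.0
    for values in product([0, 1], repeat=len(VARS)):
        x = dict(zip(VARS, values))
        px = joint(cpts_p, parents_p, x)
        qx = joint(cpts_q, parents_q, x)
        if px > 0.0:  # 0 * log(0/q) contributes 0 by convention
            total += px * log(px / qx)
    return total

print(kl_direct(cpts_p, parents_p, cpts_q, parents_q))
```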
format Online
Article
Text
id pubmed-8466032
institution National Center for Biotechnology Information
language English
publishDate 2021
publisher MDPI
record_format MEDLINE/PubMed
spelling pubmed-8466032 2021-09-27 Computation of Kullback–Leibler Divergence in Bayesian Networks Moral, Serafín Cano, Andrés Gómez-Olmedo, Manuel Entropy (Basel) Article Kullback–Leibler divergence $D_{KL}(p \| q)$ is the standard measure of error when a true probability distribution p is approximated by a probability distribution q. Its efficient computation is essential in many tasks, such as approximate computation, or as a measure of error when learning a probability distribution. For high-dimensional distributions, such as those associated with Bayesian networks, direct computation can be infeasible. This paper considers the problem of efficiently computing the Kullback–Leibler divergence of two probability distributions, each coming from a different Bayesian network, possibly with different structures. The approach is based on an auxiliary deletion algorithm that computes the necessary marginal distributions, using a cache of operations with potentials in order to reuse past computations whenever they are needed. The algorithms are tested with Bayesian networks from the bnlearn repository. Python code is provided, built on pgmpy, a library for working with probabilistic graphical models. MDPI 2021-08-28 /pmc/articles/PMC8466032/ /pubmed/34573747 http://dx.doi.org/10.3390/e23091122 Text en © 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
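The "cache of operations with potentials" mentioned in the abstract can be illustrated with a short memoization sketch (an assumption-laden illustration, not the authors' implementation): results of combining or marginalizing potentials are stored under a key built from the operation and its operands, so that marginals needed at several points of the divergence computation reuse earlier work.

```python
# Illustrative memoization of operations on potentials. All names and the
# key scheme are assumptions for illustration, not the paper's code.
class PotentialCache:
    def __init__(self, combine_fn, marginalize_fn):
        self._combine = combine_fn          # e.g. pointwise product of factors
        self._marginalize = marginalize_fn  # e.g. sum out one variable
        self._cache = {}

    def combine(self, name_f, f, name_g, g):
        # Potentials are identified by caller-supplied names so the same
        # product requested twice is computed only once.
        key = ("combine", name_f, name_g)
        if key not in self._cache:
            self._cache[key] = self._combine(f, g)
        return self._cache[key]

    def marginalize(self, name_f, f, var):
        key = ("marginalize", name_f, var)
        if key not in self._cache:
            self._cache[key] = self._marginalize(f, var)
        return self._cache[key]
```

In the paper's setting the repeated sub-computations arise from running the deletion (variable elimination) algorithm several times over the two networks; the sketch shows only the reuse mechanism, not that algorithm.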
spellingShingle Article
Moral, Serafín
Cano, Andrés
Gómez-Olmedo, Manuel
Computation of Kullback–Leibler Divergence in Bayesian Networks
title Computation of Kullback–Leibler Divergence in Bayesian Networks
title_full Computation of Kullback–Leibler Divergence in Bayesian Networks
title_fullStr Computation of Kullback–Leibler Divergence in Bayesian Networks
title_full_unstemmed Computation of Kullback–Leibler Divergence in Bayesian Networks
title_short Computation of Kullback–Leibler Divergence in Bayesian Networks
title_sort computation of kullback–leibler divergence in bayesian networks
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8466032/
https://www.ncbi.nlm.nih.gov/pubmed/34573747
http://dx.doi.org/10.3390/e23091122
work_keys_str_mv AT moralserafin computationofkullbackleiblerdivergenceinbayesiannetworks
AT canoandres computationofkullbackleiblerdivergenceinbayesiannetworks
AT gomezolmedomanuel computationofkullbackleiblerdivergenceinbayesiannetworks