Minimising the Kullback–Leibler Divergence for Model Selection in Distributed Nonlinear Systems
The Kullback–Leibler (KL) divergence is a fundamental measure of information geometry that is used in a variety of contexts in artificial intelligence. We show that, when system dynamics are given by distributed nonlinear systems, this measure can be decomposed as a function of two information-theor...
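For context, the standard textbook definition of the divergence named in the abstract (not an equation reproduced from the article itself): for two probability distributions P and Q with densities p and q over a common support,

D_{\mathrm{KL}}(P \,\|\, Q) = \int p(x) \log \frac{p(x)}{q(x)} \, dx,

so minimising the KL divergence from the true distribution to a candidate model is equivalent to maximising the model's expected log-likelihood, which is what makes the divergence a natural criterion for model selection.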
Main Authors: Cliff, Oliver M.; Prokopenko, Mikhail; Fitch, Robert
Format: Online Article Text
Language: English
Published: MDPI, 2018
Online Access:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7512642/
https://www.ncbi.nlm.nih.gov/pubmed/33265171
http://dx.doi.org/10.3390/e20020051
Similar Items
- Kullback–Leibler divergence and the Pareto–Exponential approximation
  by: Weinberg, G. V.
  Published: (2016)
- Computation of Kullback–Leibler Divergence in Bayesian Networks
  by: Moral, Serafín, et al.
  Published: (2021)
- Kullback Leibler divergence in complete bacterial and phage genomes
  by: Akhter, Sajia, et al.
  Published: (2017)
- Kullback–Leibler Divergence of a Freely Cooling Granular Gas
  by: Megías, Alberto, et al.
  Published: (2020)
- A data assimilation framework that uses the Kullback-Leibler divergence
  by: Pimentel, Sam, et al.
  Published: (2021)