Minimising the Kullback–Leibler Divergence for Model Selection in Distributed Nonlinear Systems

The Kullback–Leibler (KL) divergence is a fundamental measure of information geometry that is used in a variety of contexts in artificial intelligence. We show that, when system dynamics are given by distributed nonlinear systems, this measure can be decomposed as a function of two information-theor...
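
As background, the KL divergence referenced in the abstract is the standard information-theoretic quantity D(p || q) = sum_x p(x) log(p(x)/q(x)). Below is a minimal Python sketch of this definition for discrete distributions; it is illustrative only, and the function name and example values are assumptions, not taken from the paper:

import numpy as np

def kl_divergence(p, q):
    # D(p || q) = sum over x of p(x) * log(p(x) / q(x)), in nats.
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0  # terms with p(x) = 0 contribute nothing to the sum
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Example: a biased coin versus a fair coin.
print(kl_divergence([0.7, 0.3], [0.5, 0.5]))  # ~0.0823 nats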

Bibliographic Details
Main Authors: Cliff, Oliver M., Prokopenko, Mikhail, Fitch, Robert
Format: Online Article Text
Language: English
Published: MDPI 2018
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7512642/
https://www.ncbi.nlm.nih.gov/pubmed/33265171
http://dx.doi.org/10.3390/e20020051

Similar Items