
Exact Partial Information Decompositions for Gaussian Systems Based on Dependency Constraints

Bibliographic Details
Main Authors: Kay, Jim W., Ince, Robin A. A.
Format: Online Article Text
Language: English
Published: MDPI 2018
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7512755/
https://www.ncbi.nlm.nih.gov/pubmed/33265331
http://dx.doi.org/10.3390/e20040240
author Kay, Jim W.
Ince, Robin A. A.
collection PubMed
description The Partial Information Decomposition, introduced by Williams P. L. et al. (2010), provides a theoretical framework to characterize and quantify the structure of multivariate information sharing. A new method (I_dep) has recently been proposed by James R. G. et al. (2017) for computing a two-predictor partial information decomposition over discrete spaces. A lattice of maximum entropy probability models is constructed based on marginal dependency constraints, and the unique information that a particular predictor has about the target is defined as the minimum increase in joint predictor-target mutual information when that particular predictor-target marginal dependency is constrained. Here, we apply the I_dep approach to Gaussian systems, for which the marginally constrained maximum entropy models are Gaussian graphical models. Closed form solutions for the I_dep PID are derived for both univariate and multivariate Gaussian systems. Numerical and graphical illustrations are provided, together with practical and theoretical comparisons of the I_dep PID with the minimum mutual information partial information decomposition (I_mmi), which was discussed by Barrett A. B. (2015). The results obtained using I_dep appear to be more intuitive than those given with other methods, such as I_mmi, in which the redundant and unique information components are constrained to depend only on the predictor-target marginal distributions. In particular, it is proved that the I_mmi method generally produces larger estimates of redundancy and synergy than does the I_dep method. In discussion of the practical examples, the PIDs are complemented by the use of tests of deviance for the comparison of Gaussian graphical models.
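As background to the abstract above, the following is a minimal sketch (not taken from the article) of the I_mmi baseline it refers to: for jointly Gaussian variables, mutual information has a closed form in terms of covariance determinants, and Barrett's I_mmi PID sets redundancy to the smaller of the two predictor-target mutual informations, so it depends only on the predictor-target marginals. The covariance values in the example are illustrative assumptions; the I_dep closed-form solutions derived in the article are not reproduced here.

import numpy as np

def gaussian_mi(cov, x_idx, y_idx):
    # I(X;Y) in nats for jointly Gaussian variables, from the joint covariance matrix:
    # I(X;Y) = 0.5 * log( det(Sigma_XX) * det(Sigma_YY) / det(Sigma_joint) )
    sxx = cov[np.ix_(x_idx, x_idx)]
    syy = cov[np.ix_(y_idx, y_idx)]
    sall = cov[np.ix_(x_idx + y_idx, x_idx + y_idx)]
    return 0.5 * np.log(np.linalg.det(sxx) * np.linalg.det(syy) / np.linalg.det(sall))

def mmi_pid(cov):
    # Barrett's minimum mutual information PID for predictors X1, X2 and target Y,
    # with variables ordered (X1, X2, Y) in the covariance matrix.
    i1 = gaussian_mi(cov, [0], [2])       # I(X1;Y)
    i2 = gaussian_mi(cov, [1], [2])       # I(X2;Y)
    i12 = gaussian_mi(cov, [0, 1], [2])   # I(X1,X2;Y)
    red = min(i1, i2)                     # redundancy = min of the predictor-target MIs
    unq1 = i1 - red                       # unique information in X1
    unq2 = i2 - red                       # unique information in X2
    syn = i12 - i1 - i2 + red             # synergy: remainder of the joint mutual information
    return {"redundancy": red, "unique_1": unq1, "unique_2": unq2, "synergy": syn}

# Illustrative (assumed) correlations for standardised X1, X2, Y.
cov = np.array([[1.0, 0.3, 0.5],
                [0.3, 1.0, 0.4],
                [0.5, 0.4, 1.0]])
print(mmi_pid(cov))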
format Online
Article
Text
id pubmed-7512755
institution National Center for Biotechnology Information
language English
publishDate 2018
publisher MDPI
record_format MEDLINE/PubMed
spelling pubmed-7512755 2020-11-09 Exact Partial Information Decompositions for Gaussian Systems Based on Dependency Constraints Kay, Jim W. Ince, Robin A. A. Entropy (Basel) Article MDPI 2018-03-30 /pmc/articles/PMC7512755/ /pubmed/33265331 http://dx.doi.org/10.3390/e20040240 Text en © 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
title Exact Partial Information Decompositions for Gaussian Systems Based on Dependency Constraints
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7512755/
https://www.ncbi.nlm.nih.gov/pubmed/33265331
http://dx.doi.org/10.3390/e20040240