
Two Measures of Dependence

Two families of dependence measures between random variables are introduced. They are based on the Rényi divergence of order α and the relative α-entropy, respectively, and both dependence measures reduce to Shannon's mutual information when their order α is one. The first measure shares many properties with the mutual information, including the data-processing inequality, and can be related to the optimal error exponents in composite hypothesis testing. The second measure does not satisfy the data-processing inequality, but appears naturally in the context of distributed task encoding.
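The abstract's limiting claim (both measures reduce to Shannon's mutual information when the order α is one) can be illustrated numerically. The following is a minimal sketch, not code from the paper: it evaluates the Rényi divergence of order α between a joint distribution and the product of its marginals (a simpler quantity than the paper's measures, which involve an optimization over product distributions), and checks that as α approaches 1 it converges to the mutual information I(X; Y).

```python
import math

def renyi_divergence(p, q, alpha):
    # Rényi divergence of order alpha between discrete distributions p and q:
    #   D_alpha(p || q) = (1 / (alpha - 1)) * log( sum_i p_i^alpha * q_i^(1 - alpha) )
    # At alpha == 1 it is defined by its limit, the Kullback-Leibler divergence.
    if alpha == 1.0:
        return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    s = sum(pi**alpha * qi**(1 - alpha) for pi, qi in zip(p, q) if pi > 0)
    return math.log(s) / (alpha - 1)

# Joint distribution of (X, Y) on a 2x2 alphabet, flattened row-major
# (an illustrative example, not data from the paper).
joint = [0.4, 0.1, 0.1, 0.4]

# Product of the marginals, P_X(x) * P_Y(y).
px = [joint[0] + joint[1], joint[2] + joint[3]]
py = [joint[0] + joint[2], joint[1] + joint[3]]
product = [px[i] * py[j] for i in range(2) for j in range(2)]

# Shannon mutual information I(X; Y) = D_1(P_XY || P_X P_Y) ...
mi = renyi_divergence(joint, product, 1.0)
# ... and the Rényi divergence at an order close to one.
near_one = renyi_divergence(joint, product, 0.999)
```

Here `mi` is about 0.193 nats, and `near_one` agrees with it to three decimal places, illustrating the α → 1 reduction the abstract describes.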


Bibliographic Details
Main Authors: Lapidoth, Amos; Pfister, Christoph
Format: Online Article (Text)
Language: English
Published: MDPI, 2019
Online Access:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7515307/
https://www.ncbi.nlm.nih.gov/pubmed/33267491
http://dx.doi.org/10.3390/e21080778
Journal: Entropy (Basel)
Collection: PubMed (record pubmed-7515307, MEDLINE/PubMed format)
Published online: 2019-08-08
License: © 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).