An Axiomatic Characterization of Mutual Information
We characterize mutual information as the unique map on ordered pairs of discrete random variables satisfying a set of axioms similar to those of Faddeev’s characterization of the Shannon entropy. There is a new axiom in our characterization, however, which has no analog for Shannon entropy, based on the notion of a Markov triangle, which may be thought of as a composition of communication channels for which conditional entropy acts functorially. Our proofs are coordinate-free in the sense that no logarithms appear in our calculations.
Main Author: Fullwood, James
Format: Online Article Text
Language: English
Published: MDPI, 2023
Subjects: Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10137661/ https://www.ncbi.nlm.nih.gov/pubmed/37190451 http://dx.doi.org/10.3390/e25040663
_version_ | 1785032520109129728
author | Fullwood, James |
author_facet | Fullwood, James |
author_sort | Fullwood, James |
collection | PubMed |
description | We characterize mutual information as the unique map on ordered pairs of discrete random variables satisfying a set of axioms similar to those of Faddeev’s characterization of the Shannon entropy. There is a new axiom in our characterization, however, which has no analog for Shannon entropy, based on the notion of a Markov triangle, which may be thought of as a composition of communication channels for which conditional entropy acts functorially. Our proofs are coordinate-free in the sense that no logarithms appear in our calculations. |
format | Online Article Text |
id | pubmed-10137661 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2023 |
publisher | MDPI |
record_format | MEDLINE/PubMed |
spelling | pubmed-10137661 2023-04-28 An Axiomatic Characterization of Mutual Information Fullwood, James Entropy (Basel) Article We characterize mutual information as the unique map on ordered pairs of discrete random variables satisfying a set of axioms similar to those of Faddeev’s characterization of the Shannon entropy. There is a new axiom in our characterization, however, which has no analog for Shannon entropy, based on the notion of a Markov triangle, which may be thought of as a composition of communication channels for which conditional entropy acts functorially. Our proofs are coordinate-free in the sense that no logarithms appear in our calculations. MDPI 2023-04-15 /pmc/articles/PMC10137661/ /pubmed/37190451 http://dx.doi.org/10.3390/e25040663 Text en © 2023 by the author. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
spellingShingle | Article Fullwood, James An Axiomatic Characterization of Mutual Information |
title | An Axiomatic Characterization of Mutual Information |
title_full | An Axiomatic Characterization of Mutual Information |
title_fullStr | An Axiomatic Characterization of Mutual Information |
title_full_unstemmed | An Axiomatic Characterization of Mutual Information |
title_short | An Axiomatic Characterization of Mutual Information |
title_sort | axiomatic characterization of mutual information |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10137661/ https://www.ncbi.nlm.nih.gov/pubmed/37190451 http://dx.doi.org/10.3390/e25040663 |
work_keys_str_mv | AT fullwoodjames anaxiomaticcharacterizationofmutualinformation AT fullwoodjames axiomaticcharacterizationofmutualinformation |
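The abstract above concerns an axiomatic characterization of mutual information. As a minimal numeric sketch only (not the paper's coordinate-free, logarithm-free construction), mutual information of two discrete random variables can be computed from their joint pmf via the standard identity I(X;Y) = H(X) + H(Y) − H(X,Y); the function names below are illustrative, not taken from the article:

```python
import math

def entropy(p):
    """Shannon entropy (in bits) of a probability distribution given as a list."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def mutual_information(joint):
    """Mutual information I(X;Y) in bits from a joint pmf given as a 2D list.

    Uses the identity I(X;Y) = H(X) + H(Y) - H(X,Y).
    """
    px = [sum(row) for row in joint]         # marginal distribution of X
    py = [sum(col) for col in zip(*joint)]   # marginal distribution of Y
    pxy = [p for row in joint for p in row]  # flattened joint distribution
    return entropy(px) + entropy(py) - entropy(pxy)

# Correlated pair: mutual information is strictly positive.
joint = [[0.4, 0.1],
         [0.1, 0.4]]

# Independent pair p(x, y) = p(x) p(y): mutual information is zero.
indep = [[0.25, 0.25],
         [0.25, 0.25]]
```

The nonnegativity of the result, and its vanishing exactly when the joint factors as a product of marginals, are among the basic properties any axiomatic characterization must recover.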