
The Duality of Entropy/Extropy, and Completion of the Kullback Information Complex

The refinement axiom for entropy has been provocative in providing foundations of information theory, recognised as thoughtworthy in the writings of both Shannon and Jaynes. A resolution to their concerns has been provided recently by the discovery that the entropy measure of a probability distribution has a dual measure, a complementary companion designated as “extropy”. We report here the main results that identify this fact, specifying the dual equations and exhibiting some of their structure. The duality extends beyond a simple assessment of entropy, to the formulation of relative entropy and the Kullback symmetric distance between two forecasting distributions. This is defined by the sum of a pair of directed divergences. Examining the defining equation, we notice that this symmetric measure can be generated by two other explicable pairs of functions as well, neither of which is a Bregman divergence. The Kullback information complex is constituted by the symmetric measure of entropy/extropy along with one of each of these three function pairs. It is intimately related to the total logarithmic score of two distinct forecasting distributions for a quantity under consideration, this being a complete proper score. The information complex is isomorphic to the expectations that the two forecasting distributions assess for their achieved scores, each for its own score and for the score achieved by the other. Analysis of the scoring problem exposes a Pareto optimal exchange of the forecasters’ scores that both are willing to engage. Both would support its evaluation for assessing the relative quality of the information they provide regarding the observation of an unknown quantity of interest. We present our results without proofs, as these appear in source articles that are referenced. The focus here is on their content, unhindered. The mathematical syntax of probability we employ relies upon the operational subjective constructions of Bruno de Finetti.
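
The following is a minimal numerical sketch, in Python, of the quantities the abstract names, restricted to the finite discrete case; it is not code from the article itself. The entropy is H(p) = -sum_i p_i log p_i; its complementary dual extropy, as defined in the authors' earlier work, is J(p) = -sum_i (1 - p_i) log(1 - p_i); and Kullback's symmetric distance is the sum of the two directed divergences. Function names and the example distributions are illustrative assumptions.

import math

def entropy(p):
    # Shannon entropy H(p) = -sum_i p_i log p_i, with 0 log 0 taken as 0.
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def extropy(p):
    # Extropy J(p) = -sum_i (1 - p_i) log(1 - p_i), the complementary
    # dual measure discussed in the abstract.
    return -sum((1 - pi) * math.log(1 - pi) for pi in p if pi < 1)

def directed_divergence(p, q):
    # Kullback-Leibler directed divergence D(p||q); assumes q_i > 0
    # wherever p_i > 0.
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def kullback_symmetric(p, q):
    # Kullback's symmetric distance: the sum of the pair of directed
    # divergences, as stated in the abstract.
    return directed_divergence(p, q) + directed_divergence(q, p)

def expected_log_score(assessor, scored):
    # Expectation, under the distribution `assessor`, of the logarithmic
    # score log(scored_i) achieved by the distribution `scored`;
    # assumes scored_i > 0 wherever assessor_i > 0.
    return sum(ai * math.log(si) for ai, si in zip(assessor, scored) if ai > 0)

# Illustrative forecasting distributions for a five-valued quantity
# (hypothetical example data, not from the article).
p = [0.1, 0.2, 0.4, 0.2, 0.1]
q = [0.2] * 5

print(entropy(p), extropy(p))    # an entropy/extropy pair for p
print(kullback_symmetric(p, q))  # symmetric distance between p and q

Under these conventions the identity D(p||q) = expected_log_score(p, p) - expected_log_score(p, q) holds, so kullback_symmetric(p, q) is recoverable from the four score expectations that the two forecasting distributions assess, each for its own achieved score and for the other's; this is the sense of the isomorphism the abstract mentions.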


Bibliographic Details
Main Authors: Lad, Frank; Sanfilippo, Giuseppe; Agrò, Gianna
Format: Online Article Text
Language: English
Published: MDPI 2018
Subjects: Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7513105/
https://www.ncbi.nlm.nih.gov/pubmed/33265682
http://dx.doi.org/10.3390/e20080593
author Lad, Frank
Sanfilippo, Giuseppe
Agrò, Gianna
author_facet Lad, Frank
Sanfilippo, Giuseppe
Agrò, Gianna
author_sort Lad, Frank
collection PubMed
description The refinement axiom for entropy has been provocative in providing foundations of information theory, recognised as thoughtworthy in the writings of both Shannon and Jaynes. A resolution to their concerns has been provided recently by the discovery that the entropy measure of a probability distribution has a dual measure, a complementary companion designated as “extropy”. We report here the main results that identify this fact, specifying the dual equations and exhibiting some of their structure. The duality extends beyond a simple assessment of entropy, to the formulation of relative entropy and the Kullback symmetric distance between two forecasting distributions. This is defined by the sum of a pair of directed divergences. Examining the defining equation, we notice that this symmetric measure can be generated by two other explicable pairs of functions as well, neither of which is a Bregman divergence. The Kullback information complex is constituted by the symmetric measure of entropy/extropy along with one of each of these three function pairs. It is intimately related to the total logarithmic score of two distinct forecasting distributions for a quantity under consideration, this being a complete proper score. The information complex is isomorphic to the expectations that the two forecasting distributions assess for their achieved scores, each for its own score and for the score achieved by the other. Analysis of the scoring problem exposes a Pareto optimal exchange of the forecasters’ scores that both are willing to engage. Both would support its evaluation for assessing the relative quality of the information they provide regarding the observation of an unknown quantity of interest. We present our results without proofs, as these appear in source articles that are referenced. The focus here is on their content, unhindered. The mathematical syntax of probability we employ relies upon the operational subjective constructions of Bruno de Finetti.
format Online
Article
Text
id pubmed-7513105
institution National Center for Biotechnology Information
language English
publishDate 2018
publisher MDPI
record_format MEDLINE/PubMed
spelling pubmed-7513105 2020-11-09 The Duality of Entropy/Extropy, and Completion of the Kullback Information Complex Lad, Frank Sanfilippo, Giuseppe Agrò, Gianna Entropy (Basel) Article The refinement axiom for entropy has been provocative in providing foundations of information theory, recognised as thoughtworthy in the writings of both Shannon and Jaynes. A resolution to their concerns has been provided recently by the discovery that the entropy measure of a probability distribution has a dual measure, a complementary companion designated as “extropy”. We report here the main results that identify this fact, specifying the dual equations and exhibiting some of their structure. The duality extends beyond a simple assessment of entropy, to the formulation of relative entropy and the Kullback symmetric distance between two forecasting distributions. This is defined by the sum of a pair of directed divergences. Examining the defining equation, we notice that this symmetric measure can be generated by two other explicable pairs of functions as well, neither of which is a Bregman divergence. The Kullback information complex is constituted by the symmetric measure of entropy/extropy along with one of each of these three function pairs. It is intimately related to the total logarithmic score of two distinct forecasting distributions for a quantity under consideration, this being a complete proper score. The information complex is isomorphic to the expectations that the two forecasting distributions assess for their achieved scores, each for its own score and for the score achieved by the other. Analysis of the scoring problem exposes a Pareto optimal exchange of the forecasters’ scores that both are willing to engage. Both would support its evaluation for assessing the relative quality of the information they provide regarding the observation of an unknown quantity of interest. We present our results without proofs, as these appear in source articles that are referenced. The focus here is on their content, unhindered. The mathematical syntax of probability we employ relies upon the operational subjective constructions of Bruno de Finetti. MDPI 2018-08-09 /pmc/articles/PMC7513105/ /pubmed/33265682 http://dx.doi.org/10.3390/e20080593 Text en © 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
spellingShingle Article
Lad, Frank
Sanfilippo, Giuseppe
Agrò, Gianna
The Duality of Entropy/Extropy, and Completion of the Kullback Information Complex
title The Duality of Entropy/Extropy, and Completion of the Kullback Information Complex
title_full The Duality of Entropy/Extropy, and Completion of the Kullback Information Complex
title_fullStr The Duality of Entropy/Extropy, and Completion of the Kullback Information Complex
title_full_unstemmed The Duality of Entropy/Extropy, and Completion of the Kullback Information Complex
title_short The Duality of Entropy/Extropy, and Completion of the Kullback Information Complex
title_sort duality of entropy/extropy, and completion of the kullback information complex
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7513105/
https://www.ncbi.nlm.nih.gov/pubmed/33265682
http://dx.doi.org/10.3390/e20080593
work_keys_str_mv AT ladfrank thedualityofentropyextropyandcompletionofthekullbackinformationcomplex
AT sanfilippogiuseppe thedualityofentropyextropyandcompletionofthekullbackinformationcomplex
AT agrogianna thedualityofentropyextropyandcompletionofthekullbackinformationcomplex
AT ladfrank dualityofentropyextropyandcompletionofthekullbackinformationcomplex
AT sanfilippogiuseppe dualityofentropyextropyandcompletionofthekullbackinformationcomplex
AT agrogianna dualityofentropyextropyandcompletionofthekullbackinformationcomplex