
Asymptotic Normality for Plug-In Estimators of Generalized Shannon’s Entropy


Bibliographic Details

Main Authors: Zhang, Jialin; Shi, Jingyi
Format: Online Article Text
Language: English
Published: MDPI 2022
Subjects: Article
Online Access:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9141039/
https://www.ncbi.nlm.nih.gov/pubmed/35626567
http://dx.doi.org/10.3390/e24050683
Collection: PubMed
Description: Shannon’s entropy is one of the building blocks of information theory and an essential aspect of Machine Learning (ML) methods (e.g., Random Forests). Yet, it is only finitely defined for distributions with fast-decaying tails on a countable alphabet. The unboundedness of Shannon’s entropy over the general class of all distributions on an alphabet prevents its potential utility from being fully realized. To fill this void in the foundation of information theory, Zhang (2020) proposed generalized Shannon’s entropy, which is finitely defined everywhere. The plug-in estimator, adopted in almost all entropy-based ML packages, is one of the most popular approaches to estimating Shannon’s entropy. The asymptotic distribution of the plug-in estimator of Shannon’s entropy has been well studied in the existing literature. This paper studies the asymptotic properties of the plug-in estimator of generalized Shannon’s entropy on countable alphabets. The developed asymptotic properties require no assumptions on the original distribution, and they allow for interval estimation and statistical tests with generalized Shannon’s entropy.
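The abstract refers to the plug-in estimator and the interval estimation that asymptotic normality enables. The paper's generalized entropy is not reproduced in this record, so the sketch below illustrates only the classical case: the plug-in estimator of Shannon's entropy (empirical frequencies substituted for the true probabilities) and the standard normal-approximation confidence interval based on the classical result that sqrt(n)(H_hat − H) converges to N(0, Var[−log p(X)]). Function names are illustrative, not from the paper or any package it mentions.

```python
import numpy as np
from statistics import NormalDist


def plugin_entropy(samples):
    """Plug-in (maximum-likelihood) estimator of Shannon's entropy:
    each true probability p_k is replaced by its empirical frequency."""
    _, counts = np.unique(samples, return_counts=True)
    p_hat = counts / counts.sum()
    return float(-np.sum(p_hat * np.log(p_hat)))


def entropy_ci(samples, alpha=0.05):
    """Normal-approximation confidence interval for Shannon's entropy,
    using the classical asymptotic variance Var(-log p(X)), itself
    estimated by plug-in. Valid for the classical (not generalized)
    entropy, under the usual finite-variance conditions."""
    _, counts = np.unique(samples, return_counts=True)
    n = counts.sum()
    p_hat = counts / n
    h_hat = float(-np.sum(p_hat * np.log(p_hat)))
    # Plug-in estimate of Var(-log p(X)) = E[(log p)^2] - (E[log p])^2.
    var_hat = float(np.sum(p_hat * np.log(p_hat) ** 2) - h_hat ** 2)
    z = NormalDist().inv_cdf(1 - alpha / 2)  # standard normal quantile
    half = z * np.sqrt(var_hat / n)
    return h_hat - half, h_hat + half
```

For a uniform two-symbol sample the estimate equals log 2, and for skewed samples the interval widens with the estimated variance of −log p(X). Note that for small samples the plug-in estimator is biased downward, which is part of the motivation for the refined estimators discussed in this literature.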
Record ID: pubmed-9141039
Institution: National Center for Biotechnology Information
Record Format: MEDLINE/PubMed
Journal: Entropy (Basel)
Publication Date: 2022-05-12
License: © 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).