Shannon Entropy Estimation in ∞-Alphabets from Convergence Results: Studying Plug-In Estimators
This work addresses the problem of Shannon entropy estimation in countably infinite alphabets by studying and adopting recent convergence results for the entropy functional, which is known to be a discontinuous function on the space of probabilities over ∞-alphabets. Sufficient conditions for the convergence of the entropy are used in conjunction with deviation inequalities (covering scenarios with both finitely and infinitely supported assumptions on the target distribution). From this perspective, four plug-in histogram-based estimators are studied, showing that the convergence results are instrumental in deriving new strongly consistent estimators for the entropy. The main application of this methodology is a new data-driven partition (plug-in) estimator. This scheme uses the data to restrict the support on which the distribution is estimated, finding an optimal balance between estimation and approximation errors. The proposed scheme offers a consistent (distribution-free) estimator of the entropy in ∞-alphabets and optimal rates of convergence under certain regularity conditions on the problem (finite-and-unknown-support assumptions and tail-bounded conditions on the target distribution).
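The plug-in scheme described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's exact construction: the empirical-distribution plug-in rule is standard, but the fixed `threshold` parameter below is an assumption standing in for the paper's data-driven balancing of estimation and approximation errors.

```python
import math
from collections import Counter

def plugin_entropy(samples):
    """Classical plug-in estimator: Shannon entropy (in bits) of the
    empirical distribution induced by the samples."""
    n = len(samples)
    counts = Counter(samples)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def restricted_plugin_entropy(samples, threshold):
    """Restricted-support variant (illustrative only): keep the symbols
    whose empirical mass exceeds `threshold`, renormalize on that
    restricted support, and apply the plug-in rule there. The paper's
    data-driven partition estimator chooses the restriction from the
    data to balance estimation and approximation errors; the fixed
    threshold here is a simplification."""
    n = len(samples)
    counts = Counter(samples)
    kept = {s: c for s, c in counts.items() if c / n > threshold}
    total = sum(kept.values())  # renormalization constant; assumes kept is non-empty
    return -sum((c / total) * math.log2(c / total) for c in kept.values())
```

For a sample that is empirically uniform over two symbols, both estimators return 1 bit; restricting the support matters when the distribution has a long tail of rarely observed symbols, whose unreliable empirical masses otherwise inflate the estimate.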
Main author: | Silva, Jorge F.
---|---
Format: | Online Article Text
Language: | English
Published: | MDPI, 2018
Online access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7512916/ https://www.ncbi.nlm.nih.gov/pubmed/33265487 http://dx.doi.org/10.3390/e20060397
Field | Value
---|---
_version_ | 1783586267683880960
author | Silva, Jorge F. |
author_facet | Silva, Jorge F. |
author_sort | Silva, Jorge F. |
collection | PubMed |
description | This work addresses the problem of Shannon entropy estimation in countably infinite alphabets by studying and adopting recent convergence results for the entropy functional, which is known to be a discontinuous function on the space of probabilities over ∞-alphabets. Sufficient conditions for the convergence of the entropy are used in conjunction with deviation inequalities (covering scenarios with both finitely and infinitely supported assumptions on the target distribution). From this perspective, four plug-in histogram-based estimators are studied, showing that the convergence results are instrumental in deriving new strongly consistent estimators for the entropy. The main application of this methodology is a new data-driven partition (plug-in) estimator. This scheme uses the data to restrict the support on which the distribution is estimated, finding an optimal balance between estimation and approximation errors. The proposed scheme offers a consistent (distribution-free) estimator of the entropy in ∞-alphabets and optimal rates of convergence under certain regularity conditions on the problem (finite-and-unknown-support assumptions and tail-bounded conditions on the target distribution).
format | Online Article Text |
id | pubmed-7512916 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2018 |
publisher | MDPI |
record_format | MEDLINE/PubMed |
spelling | pubmed-75129162020-11-09 Shannon Entropy Estimation in ∞-Alphabets from Convergence Results: Studying Plug-In Estimators Silva, Jorge F. Entropy (Basel) Article This work addresses the problem of Shannon entropy estimation in countably infinite alphabets studying and adopting some recent convergence results of the entropy functional, which is known to be a discontinuous function in the space of probabilities in ∞-alphabets. Sufficient conditions for the convergence of the entropy are used in conjunction with some deviation inequalities (including scenarios with both finitely and infinitely supported assumptions on the target distribution). From this perspective, four plug-in histogram-based estimators are studied showing that convergence results are instrumental to derive new strong consistent estimators for the entropy. The main application of this methodology is a new data-driven partition (plug-in) estimator. This scheme uses the data to restrict the support where the distribution is estimated by finding an optimal balance between estimation and approximation errors. The proposed scheme offers a consistent (distribution-free) estimator of the entropy in ∞-alphabets and optimal rates of convergence under certain regularity conditions on the problem (finite and unknown supported assumptions and tail bounded conditions on the target distribution). MDPI 2018-05-23 /pmc/articles/PMC7512916/ /pubmed/33265487 http://dx.doi.org/10.3390/e20060397 Text en © 2018 by the author. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/). |
spellingShingle | Article Silva, Jorge F. Shannon Entropy Estimation in ∞-Alphabets from Convergence Results: Studying Plug-In Estimators |
title | Shannon Entropy Estimation in ∞-Alphabets from Convergence Results: Studying Plug-In Estimators |
title_full | Shannon Entropy Estimation in ∞-Alphabets from Convergence Results: Studying Plug-In Estimators |
title_fullStr | Shannon Entropy Estimation in ∞-Alphabets from Convergence Results: Studying Plug-In Estimators |
title_full_unstemmed | Shannon Entropy Estimation in ∞-Alphabets from Convergence Results: Studying Plug-In Estimators |
title_short | Shannon Entropy Estimation in ∞-Alphabets from Convergence Results: Studying Plug-In Estimators |
title_sort | shannon entropy estimation in ∞-alphabets from convergence results: studying plug-in estimators |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7512916/ https://www.ncbi.nlm.nih.gov/pubmed/33265487 http://dx.doi.org/10.3390/e20060397 |
work_keys_str_mv | AT silvajorgef shannonentropyestimationinalphabetsfromconvergenceresultsstudyingpluginestimators |