
On Generalized Schürmann Entropy Estimators

We present a new class of estimators of Shannon entropy for severely undersampled discrete distributions. It is based on a generalization of an estimator proposed by T. Schürmann, which itself is a generalization of an estimator proposed by myself. For a special set of parameters, they are completely free of bias and have a finite variance, something which is widely believed to be impossible. We also present detailed numerical tests, in which we compare them with other recent estimators and with exact results, and point out a clash with Bayesian estimators for mutual information.

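As context for the abstract above, the following minimal Python sketch illustrates the undersampling problem the paper addresses: when the number of samples is much smaller than the number of states, the naive plug-in Shannon entropy estimator is strongly negatively biased, and even the classical Miller–Madow correction recovers only part of the gap. This is not the generalized Schürmann estimator studied in the paper; the function names, the uniform test distribution, and the sample sizes are illustrative assumptions only.

```python
import numpy as np

def plugin_entropy(counts):
    """Naive (maximum-likelihood) plug-in Shannon entropy in nats.

    Systematically underestimates the true entropy when the sample
    size is small compared to the number of states.
    """
    counts = np.asarray(counts, dtype=float)
    n = counts.sum()
    p = counts[counts > 0] / n
    return -np.sum(p * np.log(p))

def miller_madow_entropy(counts):
    """Plug-in estimate plus the classical Miller-Madow correction
    (K_observed - 1) / (2N). A standard baseline only; NOT the
    generalized Schürmann estimator discussed in the paper."""
    counts = np.asarray(counts, dtype=float)
    k_obs = np.count_nonzero(counts)
    return plugin_entropy(counts) + (k_obs - 1) / (2.0 * counts.sum())

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    K, N = 1000, 100                      # severely undersampled: N << K
    samples = rng.integers(0, K, size=N)  # uniform source, true entropy = ln K
    counts = np.bincount(samples, minlength=K)
    print(f"true entropy : {np.log(K):.3f} nats")
    print(f"plug-in      : {plugin_entropy(counts):.3f} nats")
    print(f"Miller-Madow : {miller_madow_entropy(counts):.3f} nats")
```

With K = 1000 states and only N = 100 samples, both baselines fall well short of the true value ln K ≈ 6.9 nats; this is the regime in which the bias-corrected estimators compared in the paper are meant to operate.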

Bibliographic Details
Main Author: Grassberger, Peter
Format: Online Article Text
Language: English
Published: MDPI 2022
Subjects: Brief Report
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9141067/
https://www.ncbi.nlm.nih.gov/pubmed/35626564
http://dx.doi.org/10.3390/e24050680
Record ID: pubmed-9141067
Collection: PubMed
Institution: National Center for Biotechnology Information
Record Format: MEDLINE/PubMed
Journal: Entropy (Basel)
Publication Date: 2022-05-11
License: © 2022 by the author. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).