Computing Accurate Probabilistic Estimates of One-D Entropy from Equiprobable Random Samples

We develop a simple Quantile Spacing (QS) method for accurate probabilistic estimation of one-dimensional entropy from equiprobable random samples, and compare it with the popular Bin-Counting (BC) and Kernel Density (KD) methods. In contrast to BC, which uses equal-width bins with varying probability mass, the QS method uses estimates of the quantiles that divide the support of the data-generating probability density function (pdf) into equal-probability-mass intervals. Whereas BC and KD each require optimal tuning of a hyper-parameter whose value varies with sample size and shape of the pdf, QS requires only specification of the number of quantiles to be used. Results indicate, for the class of distributions tested, that the optimal number of quantiles is a fixed fraction of the sample size (empirically determined to be [Formula: see text]), and that this value is relatively insensitive to distributional form or sample size. This provides a clear advantage over BC and KD, since hyper-parameter tuning is not required. Further, unlike KD, there is no need to select an appropriate kernel type, so QS is applicable to pdfs of arbitrary shape, including those with discontinuous slope and/or magnitude. Bootstrapping is used to approximate the sampling-variability distribution of the resulting entropy estimate, and is shown to accurately reflect the true uncertainty. For the four distributional forms studied (Gaussian, Log-Normal, Exponential, and Bimodal Gaussian Mixture), expected estimation bias is less than 1% and uncertainty is low even for samples of as few as [Formula: see text] data points; in contrast, the small-sample bias can be as large as [Formula: see text] for KD and as large as [Formula: see text] for BC. We speculate that estimating quantile locations, rather than bin probabilities, makes more efficient use of the information in the data to approximate the underlying shape of an unknown data-generating pdf.
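
The abstract carries enough algorithmic detail to sketch the core idea in code. Below is a minimal Python sketch, not the authors' reference implementation: it assumes that sample quantiles at equally spaced probabilities stand in for the true quantiles, that the pdf is treated as uniform within each equal-probability-mass interval, and that the quantile fraction frac = 0.25 is a placeholder, since the paper's empirically determined value is masked as "[Formula: see text]" in this record. The names qs_entropy and qs_entropy_bootstrap are illustrative.

```python
# Hedged sketch of a quantile-spacing (QS) entropy estimate and a
# bootstrap of its sampling variability, following the abstract's
# description. NOT the authors' reference implementation; frac=0.25
# is an assumed placeholder for the paper's optimal quantile fraction.
import numpy as np

def qs_entropy(sample, frac=0.25):
    """Estimate differential entropy (in nats) of a 1-D sample."""
    x = np.sort(np.asarray(sample, dtype=float))
    n_q = max(2, int(frac * x.size))              # number of quantiles
    probs = np.linspace(0.0, 1.0, n_q + 2)[1:-1]  # interior probabilities
    q = np.quantile(x, probs)                     # quantile estimates
    # Bracket with the sample extremes so the spacings span the support.
    edges = np.concatenate(([x[0]], q, [x[-1]]))
    dq = np.diff(edges)                           # quantile spacings
    m = dq.size                                   # ~equal-mass intervals
    # Uniform-within-interval approximation:
    #   H = -sum_i (1/m) * log((1/m) / dq_i) = log(m) + mean(log(dq))
    return np.log(m) + np.mean(np.log(dq))

def qs_entropy_bootstrap(sample, n_boot=500, frac=0.25, seed=0):
    """Bootstrap the sampling-variability distribution of the estimate."""
    rng = np.random.default_rng(seed)
    x = np.asarray(sample, dtype=float)
    return np.array([qs_entropy(rng.choice(x, size=x.size), frac=frac)
                     for _ in range(n_boot)])

# Example: standard Gaussian; true entropy = 0.5*log(2*pi*e) ~ 1.4189 nats.
x = np.random.default_rng(1).standard_normal(500)
h = qs_entropy(x)
h_boot = qs_entropy_bootstrap(x)
print(f"H_hat = {h:.3f} nats; bootstrap mean = {h_boot.mean():.3f}, "
      f"sd = {h_boot.std():.3f}")
```

Bracketing with the sample extremes and averaging log-spacings is one plausible reading of the method; consult the paper itself for the exact spacing formula and the reported optimal quantile fraction.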

Bibliographic Details

Main Authors: Gupta, Hoshin V.; Ehsani, Mohammad Reza; Roy, Tirthankar; Sans-Fuentes, Maria A.; Ehret, Uwe; Behrangi, Ali
Format: Online Article (Text)
Language: English
Journal: Entropy (Basel)
Published: MDPI, 2021-06-11
Subjects: Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8231182/
https://www.ncbi.nlm.nih.gov/pubmed/34208344
http://dx.doi.org/10.3390/e23060740
Rights: © 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

Collection: PubMed
Record ID: pubmed-8231182
Institution: National Center for Biotechnology Information
Record Format: MEDLINE/PubMed