Asymptotic Normality for Plug-In Estimators of Generalized Shannon’s Entropy
Shannon’s entropy is one of the building blocks of information theory and an essential aspect of Machine Learning (ML) methods (e.g., Random Forests). Yet, it is only finitely defined for distributions with fast decaying tails on a countable alphabet. The unboundedness of Shannon’s entropy over the...
Main Authors: Zhang, Jialin; Shi, Jingyi
Format: Online Article Text
Language: English
Published: MDPI, 2022
Online Access:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9141039/
https://www.ncbi.nlm.nih.gov/pubmed/35626567
http://dx.doi.org/10.3390/e24050683
Similar Items
- Shannon Entropy Estimation in ∞-Alphabets from Convergence Results: Studying Plug-In Estimators
  by: Silva, Jorge F.
  Published: (2018)
- New Estimations for Shannon and Zipf–Mandelbrot Entropies
  by: Adil Khan, Muhammad, et al.
  Published: (2018)
- Alternative Entropy Measures and Generalized Khinchin–Shannon Inequalities
  by: Mondaini, Rubem P., et al.
  Published: (2021)
- A Review of Shannon and Differential Entropy Rate Estimation
  by: Feutrill, Andrew, et al.
  Published: (2021)
- Application of Positional Entropy to Fast Shannon Entropy Estimation for Samples of Digital Signals
  by: Cholewa, Marcin, et al.
  Published: (2020)