First Digits’ Shannon Entropy
Related to the letters of an alphabet, entropy means the average number of binary digits required for the transmission of one character. Checking tables of statistical data, one finds that, in the first position of the numbers, the digits 1 to 9 occur with different frequencies. Correspondingly, fro...
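As an illustrative sketch only (not taken from the article), the Shannon entropy of a first-digit distribution can be computed as H = -Σ p(d) log2 p(d) over the digits d = 1..9. The Benford frequencies p(d) = log10(1 + 1/d) used below are a common model for first-digit data and are assumed here purely for demonstration.

```python
import math

def shannon_entropy(probabilities):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over nonzero p."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Benford's law for the first digit d = 1..9 (an assumed model,
# not the article's data): p(d) = log10(1 + 1/d)
benford = [math.log10(1 + 1 / d) for d in range(1, 10)]

# Uniform first-digit distribution for comparison
uniform = [1 / 9] * 9

print(f"Benford first-digit entropy: {shannon_entropy(benford):.4f} bits")
print(f"Uniform first-digit entropy: {shannon_entropy(uniform):.4f} bits")  # log2(9) ≈ 3.17
```

Because the nine first digits are not equally likely, the Benford entropy comes out below the uniform maximum of log2(9) bits.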
Main author: Kreiner, Welf Alfred
Format: Online Article Text
Language: English
Published: MDPI, 2022
Online access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9601575/ | https://www.ncbi.nlm.nih.gov/pubmed/37420433 | http://dx.doi.org/10.3390/e24101413
Similar items
- Application of Positional Entropy to Fast Shannon Entropy Estimation for Samples of Digital Signals
  by: Cholewa, Marcin, et al.
  Published: (2020)
- Controlling the Shannon Entropy of Quantum Systems
  by: Xing, Yifan, et al.
  Published: (2013)
- Universality and Shannon entropy of codon usage
  by: Frappat, L, et al.
  Published: (2003)
- Shannon Entropy: An Econophysical Approach to Cryptocurrency Portfolios
  by: Rodriguez-Rodriguez, Noé, et al.
  Published: (2022)
- New Estimations for Shannon and Zipf–Mandelbrot Entropies
  by: Adil Khan, Muhammad, et al.
  Published: (2018)