Measuring Independence between Statistical Randomness Tests by Mutual Information
The analysis of independence between statistical randomness tests has received considerable attention in the literature recently. Detecting dependencies between statistical randomness tests makes it possible to discriminate tests that measure similar characteristics, and thus minimize the amount o...
Main authors: Karell-Albo, Jorge Augusto; Legón-Pérez, Carlos Miguel; Madarro-Capó, Evaristo José; Rojas, Omar; Sosa-Gómez, Guillermo
Format: Online Article Text
Language: English
Published: MDPI, 2020
Online access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7517289/ https://www.ncbi.nlm.nih.gov/pubmed/33286513 http://dx.doi.org/10.3390/e22070741
Similar Items
- Complexity Reduction in Analyzing Independence between Statistical Randomness Tests Using Mutual Information
  by: Karell-Albo, Jorge Augusto, et al.
  Published: (2023)
- Information Theory Based Evaluation of the RC4 Stream Cipher Outputs
  by: Madarro-Capó, Evaristo José, et al.
  Published: (2021)
- Selecting an Effective Entropy Estimator for Short Sequences of Bits and Bytes with Maximum Entropy
  by: Contreras Rodríguez, Lianet, et al.
  Published: (2021)
- Detection of DIAG and LINE Patterns in PassPoints Graphical Passwords Based on the Maximum Angles of Their Delaunay Triangles
  by: Suárez-Plasencia, Lisset, et al.
  Published: (2022)
- A Conditional Mutual Information Estimator for Mixed Data and an Associated Conditional Independence Test
  by: Zan, Lei, et al.
  Published: (2022)