
Learned pseudo-random number generator: WGAN-GP for generating statistically robust random numbers

Pseudo-random number generators (PRNGs) are software algorithms generating a sequence of numbers approximating the properties of random numbers. They are critical components in many information systems that require unpredictable and nonarbitrary behaviors, such as parameter configuration in machine...

Bibliographic Details
Main Authors: Okada, Kiyoshiro, Endo, Katsuhiro, Yasuoka, Kenji, Kurabayashi, Shuichi
Format: Online Article Text
Language: English
Published: Public Library of Science 2023
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10266608/
https://www.ncbi.nlm.nih.gov/pubmed/37315028
http://dx.doi.org/10.1371/journal.pone.0287025
_version_ 1785058773950267392
author Okada, Kiyoshiro
Endo, Katsuhiro
Yasuoka, Kenji
Kurabayashi, Shuichi
author_facet Okada, Kiyoshiro
Endo, Katsuhiro
Yasuoka, Kenji
Kurabayashi, Shuichi
author_sort Okada, Kiyoshiro
collection PubMed
description Pseudo-random number generators (PRNGs) are software algorithms generating a sequence of numbers approximating the properties of random numbers. They are critical components in many information systems that require unpredictable and nonarbitrary behaviors, such as parameter configuration in machine learning, gaming, cryptography, and simulation. A PRNG is commonly validated through a statistical test suite, such as NIST SP 800-22rev1a (NIST test suite), to evaluate its robustness and the randomness of the numbers. In this paper, we propose a Wasserstein distance-based generative adversarial network (WGAN) approach to generating PRNGs that fully satisfy the NIST test suite. In this approach, the existing Mersenne Twister (MT) PRNG is learned without implementing any mathematical programming code. We remove the dropout layers from the conventional WGAN network to learn random numbers distributed in the entire feature space because the nearly infinite amount of data can suppress the overfitting problems that occur without dropout layers. We conduct experimental studies to evaluate our learned pseudo-random number generator (LPRNG) by adopting cosine-function-based numbers with poor random number properties according to the NIST test suite as seed numbers. The experimental results show that our LPRNG successfully converted the sequence of seed numbers to random numbers that fully satisfy the NIST test suite. This study opens the way for the “democratization” of PRNGs through the end-to-end learning of conventional PRNGs, which means that PRNGs can be generated without deep mathematical know-how. Such tailor-made PRNGs will effectively enhance the unpredictability and nonarbitrariness of a wide range of information systems, even if the seed numbers can be revealed by reverse engineering. The experimental results also show that overfitting was observed after about 450,000 trials of learning, suggesting that there is an upper limit to the number of learning counts for a fixed-size neural network, even when learning with unlimited data.
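The description above names three concrete ingredients that can be illustrated without the paper's model: the Mersenne Twister PRNG whose output the WGAN learns, the cosine-function-based seed sequences with poor randomness, and the NIST SP 800-22 tests used as the pass/fail criterion. The sketch below is not the authors' code; it is a minimal Python illustration that uses the standard library's random module (an MT19937 implementation) as the MT source, a hypothetical thresholded-cosine stream as a stand-in for the paper's cosine-based seeds, and two of the suite's fifteen tests (frequency/monobit and runs).

```python
# Minimal sketch (not the authors' implementation): contrast a Mersenne Twister
# bit stream with a hypothetical cosine-derived bit stream using two tests from
# NIST SP 800-22, the frequency (monobit) test and the runs test.
import math
import random


def monobit_p_value(bits):
    """Frequency (monobit) test: p = erfc(|S_n| / sqrt(2 * n)),
    where S_n sums the bits mapped to +1/-1."""
    n = len(bits)
    s_n = sum(1 if b else -1 for b in bits)
    return math.erfc(abs(s_n) / math.sqrt(2 * n))


def runs_p_value(bits):
    """Runs test: compares the observed number of runs with the expected
    2*n*pi*(1 - pi). Requires the frequency prerequisite |pi - 1/2| < 2/sqrt(n);
    otherwise the test is skipped and 0.0 is returned, as in the NIST reference code."""
    n = len(bits)
    pi = sum(bits) / n
    if abs(pi - 0.5) >= 2.0 / math.sqrt(n):
        return 0.0
    v_obs = 1 + sum(1 for i in range(n - 1) if bits[i] != bits[i + 1])
    return math.erfc(abs(v_obs - 2.0 * n * pi * (1.0 - pi))
                     / (2.0 * math.sqrt(2.0 * n) * pi * (1.0 - pi)))


def mt_bits(n, seed=0):
    """Bits from the Mersenne Twister (Python's random module is MT19937-based)."""
    rng = random.Random(seed)
    return [rng.getrandbits(1) for _ in range(n)]


def cosine_bits(n, freq=0.01):
    """Hypothetical cosine-derived bits (cos(freq * i) thresholded at zero):
    a deterministic, poorly random source used here only for illustration,
    not the paper's exact seed construction."""
    return [1 if math.cos(freq * i) >= 0.0 else 0 for i in range(n)]


if __name__ == "__main__":
    n = 100_000  # the suite flags a sequence when a test's p-value falls below 0.01
    for name, bits in (("MT19937", mt_bits(n)), ("cosine", cosine_bits(n))):
        print(name, "monobit p =", round(monobit_p_value(bits), 4),
              "runs p =", round(runs_p_value(bits), 4))
```

The MT19937 stream typically clears both tests, while the periodic cosine stream is flagged by the runs test; in the paper's setting, the learned generator (LPRNG) is the component expected to turn such cosine-like input into output whose p-values clear the threshold on every test in the suite.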
format Online
Article
Text
id pubmed-10266608
institution National Center for Biotechnology Information
language English
publishDate 2023
publisher Public Library of Science
record_format MEDLINE/PubMed
spelling pubmed-10266608 2023-06-15 Learned pseudo-random number generator: WGAN-GP for generating statistically robust random numbers Okada, Kiyoshiro; Endo, Katsuhiro; Yasuoka, Kenji; Kurabayashi, Shuichi. PLoS One, Research Article. Public Library of Science 2023-06-14 /pmc/articles/PMC10266608/ /pubmed/37315028 http://dx.doi.org/10.1371/journal.pone.0287025 Text en © 2023 Okada et al. This is an open access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
spellingShingle Research Article
Okada, Kiyoshiro
Endo, Katsuhiro
Yasuoka, Kenji
Kurabayashi, Shuichi
Learned pseudo-random number generator: WGAN-GP for generating statistically robust random numbers
title Learned pseudo-random number generator: WGAN-GP for generating statistically robust random numbers
title_full Learned pseudo-random number generator: WGAN-GP for generating statistically robust random numbers
title_fullStr Learned pseudo-random number generator: WGAN-GP for generating statistically robust random numbers
title_full_unstemmed Learned pseudo-random number generator: WGAN-GP for generating statistically robust random numbers
title_short Learned pseudo-random number generator: WGAN-GP for generating statistically robust random numbers
title_sort learned pseudo-random number generator: wgan-gp for generating statistically robust random numbers
topic Research Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10266608/
https://www.ncbi.nlm.nih.gov/pubmed/37315028
http://dx.doi.org/10.1371/journal.pone.0287025
work_keys_str_mv AT okadakiyoshiro learnedpseudorandomnumbergeneratorwgangpforgeneratingstatisticallyrobustrandomnumbers
AT endokatsuhiro learnedpseudorandomnumbergeneratorwgangpforgeneratingstatisticallyrobustrandomnumbers
AT yasuokakenji learnedpseudorandomnumbergeneratorwgangpforgeneratingstatisticallyrobustrandomnumbers
AT kurabayashishuichi learnedpseudorandomnumbergeneratorwgangpforgeneratingstatisticallyrobustrandomnumbers