Stochastic gradient descent for optimization for nuclear systems
The use of gradient descent methods for optimizing k-eigenvalue nuclear systems has been shown to be useful in the past, but the use of k-eigenvalue gradients has proved computationally challenging due to their stochastic nature. ADAM is a gradient descent method that accounts for gradients with a stochastic nature. This analysis uses challenge problems constructed to verify whether ADAM is a suitable tool to optimize k-eigenvalue nuclear systems. ADAM is able to successfully optimize nuclear systems using the gradients of k-eigenvalue problems despite their stochastic nature and uncertainty. Furthermore, it is clearly demonstrated that low-compute-time, high-variance estimates of the gradient lead to better performance in the optimization challenge problems tested here.
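For context on the method named in the abstract: ADAM (adaptive moment estimation) maintains running first- and second-moment estimates of the gradient, which lets it tolerate noisy gradient samples. The sketch below is not taken from the article; the quadratic objective, noise level, learning rate, and step count are illustrative assumptions standing in for a cheap, high-variance Monte Carlo estimate of the k-eigenvalue gradient.

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One ADAM update using a (possibly noisy) gradient estimate."""
    m = beta1 * m + (1 - beta1) * grad       # first-moment (mean) estimate
    v = beta2 * v + (1 - beta2) * grad**2    # second-moment (magnitude) estimate
    m_hat = m / (1 - beta1**t)               # bias correction for early steps
    v_hat = v / (1 - beta2**t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

# Toy usage: minimize theta**2 with a deliberately high-variance gradient estimate.
rng = np.random.default_rng(0)
theta = np.array([5.0])
m = np.zeros_like(theta)
v = np.zeros_like(theta)
for t in range(1, 2001):
    true_grad = 2.0 * theta                               # exact gradient of theta**2
    noisy_grad = true_grad + rng.normal(0.0, 5.0, theta.shape)  # stochastic estimate
    theta, m, v = adam_step(theta, noisy_grad, m, v, t, lr=0.05)
print(theta)  # settles near 0 despite the noisy gradients
```

Because the second-moment estimate scales each step by the gradient's recent magnitude, individual high-variance samples move the parameters only slightly, which is consistent with the abstract's observation that cheap, noisy gradient estimates can still drive the optimization.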
Main Authors: Williams, Austin; Walton, Noah; Maryanski, Austin; Bogetic, Sandra; Hines, Wes; Sobes, Vladimir
Format: Online Article Text
Language: English
Published: Nature Publishing Group UK, 2023
Subjects: Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10213052/ https://www.ncbi.nlm.nih.gov/pubmed/37230990 http://dx.doi.org/10.1038/s41598-023-32112-7
_version_ | 1785047545351766016 |
author | Williams, Austin Walton, Noah Maryanski, Austin Bogetic, Sandra Hines, Wes Sobes, Vladimir |
author_facet | Williams, Austin Walton, Noah Maryanski, Austin Bogetic, Sandra Hines, Wes Sobes, Vladimir |
author_sort | Williams, Austin |
collection | PubMed |
description | The use of gradient descent methods for optimizing k-eigenvalue nuclear systems has been shown to be useful in the past, but the use of k-eigenvalue gradients has proved computationally challenging due to their stochastic nature. ADAM is a gradient descent method that accounts for gradients with a stochastic nature. This analysis uses challenge problems constructed to verify whether ADAM is a suitable tool to optimize k-eigenvalue nuclear systems. ADAM is able to successfully optimize nuclear systems using the gradients of k-eigenvalue problems despite their stochastic nature and uncertainty. Furthermore, it is clearly demonstrated that low-compute-time, high-variance estimates of the gradient lead to better performance in the optimization challenge problems tested here. |
format | Online Article Text |
id | pubmed-10213052 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2023 |
publisher | Nature Publishing Group UK |
record_format | MEDLINE/PubMed |
spelling | pubmed-10213052 2023-05-27 Stochastic gradient descent for optimization for nuclear systems Williams, Austin Walton, Noah Maryanski, Austin Bogetic, Sandra Hines, Wes Sobes, Vladimir Sci Rep Article The use of gradient descent methods for optimizing k-eigenvalue nuclear systems has been shown to be useful in the past, but the use of k-eigenvalue gradients has proved computationally challenging due to their stochastic nature. ADAM is a gradient descent method that accounts for gradients with a stochastic nature. This analysis uses challenge problems constructed to verify whether ADAM is a suitable tool to optimize k-eigenvalue nuclear systems. ADAM is able to successfully optimize nuclear systems using the gradients of k-eigenvalue problems despite their stochastic nature and uncertainty. Furthermore, it is clearly demonstrated that low-compute-time, high-variance estimates of the gradient lead to better performance in the optimization challenge problems tested here. Nature Publishing Group UK 2023-05-25 /pmc/articles/PMC10213052/ /pubmed/37230990 http://dx.doi.org/10.1038/s41598-023-32112-7 Text en © The Author(s) 2023 https://creativecommons.org/licenses/by/4.0/ Open Access: This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ (https://creativecommons.org/licenses/by/4.0/). |
spellingShingle | Article Williams, Austin Walton, Noah Maryanski, Austin Bogetic, Sandra Hines, Wes Sobes, Vladimir Stochastic gradient descent for optimization for nuclear systems |
title | Stochastic gradient descent for optimization for nuclear systems |
title_full | Stochastic gradient descent for optimization for nuclear systems |
title_fullStr | Stochastic gradient descent for optimization for nuclear systems |
title_full_unstemmed | Stochastic gradient descent for optimization for nuclear systems |
title_short | Stochastic gradient descent for optimization for nuclear systems |
title_sort | stochastic gradient descent for optimization for nuclear systems |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10213052/ https://www.ncbi.nlm.nih.gov/pubmed/37230990 http://dx.doi.org/10.1038/s41598-023-32112-7 |
work_keys_str_mv | AT williamsaustin stochasticgradientdescentforoptimizationfornuclearsystems AT waltonnoah stochasticgradientdescentforoptimizationfornuclearsystems AT maryanskiaustin stochasticgradientdescentforoptimizationfornuclearsystems AT bogeticsandra stochasticgradientdescentforoptimizationfornuclearsystems AT hineswes stochasticgradientdescentforoptimizationfornuclearsystems AT sobesvladimir stochasticgradientdescentforoptimizationfornuclearsystems |