Simple lessons from complex learning: what a neural network model learns about cosmic structure formation
Main Authors: | Jamieson, Drew; Li, Yin; He, Siyu; Villaescusa-Navarro, Francisco; Ho, Shirley; de Oliveira, Renan Alves; Spergel, David N |
Format: | Online Article Text |
Language: | English |
Published: | Oxford University Press, 2022 |
Subjects: | Physical Sciences and Engineering |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10121177/ https://www.ncbi.nlm.nih.gov/pubmed/37091548 http://dx.doi.org/10.1093/pnasnexus/pgac250 |
_version_ | 1785029327889367040 |
author | Jamieson, Drew Li, Yin He, Siyu Villaescusa-Navarro, Francisco Ho, Shirley de Oliveira, Renan Alves Spergel, David N |
author_facet | Jamieson, Drew Li, Yin He, Siyu Villaescusa-Navarro, Francisco Ho, Shirley de Oliveira, Renan Alves Spergel, David N |
author_sort | Jamieson, Drew |
collection | PubMed |
description | We train a neural network model to predict the full phase space evolution of cosmological N-body simulations. Its success implies that the neural network model accurately approximates the Green’s function expansion that relates the initial conditions of the simulations to their outcomes at later times in the deeply nonlinear regime. We test the accuracy of this approximation by assessing its performance on well-understood simple cases that have either known exact solutions or well-understood expansions. These scenarios include spherical configurations, isolated plane waves, and two interacting plane waves: initial conditions that are very different from the Gaussian random fields used for training. We find our model generalizes well to these well-understood scenarios, demonstrating that the networks have inferred general physical principles and learned the nonlinear mode couplings from the complex, random Gaussian training data. These tests also provide a useful diagnostic for finding the model’s strengths and weaknesses, and for identifying strategies for model improvement. We also test the model on initial conditions that contain only transverse modes, a family of modes that differ not only in their phases but also in their evolution from the longitudinal growing modes used in the training set. When the network encounters these initial conditions, which are orthogonal to the training set, the model fails completely. In addition to these simple configurations, we evaluate the model’s predictions for the density, displacement, and momentum power spectra with standard initial conditions for N-body simulations. We compare these summary statistics against N-body results and an approximate, fast simulation method called COLA (COmoving Lagrangian Acceleration). Our model achieves percent-level accuracy at nonlinear scales of [Formula: see text], representing a significant improvement over COLA. |
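The power-spectrum comparison described in the abstract rests on a standard summary statistic: the spherically averaged power spectrum P(k) of a 3D field. The sketch below is a minimal illustration of that statistic computed with NumPy FFTs, not the authors' pipeline; the binning scheme, the shell-averaging, and the volume-normalization convention are assumptions for illustration only.

```python
import numpy as np

def power_spectrum(delta, box_size, n_bins=16):
    """Spherically averaged power spectrum P(k) of a 3D overdensity field.

    delta    : real 3D array of the overdensity field (dimensionless)
    box_size : comoving side length of the periodic box (e.g. Mpc/h)
    """
    n = delta.shape[0]
    # Fourier transform with a volume-weighted normalization, so that
    # P(k) = <|delta_k|^2> / V has units of volume.
    delta_k = np.fft.rfftn(delta) * (box_size / n) ** 3
    power = np.abs(delta_k) ** 2 / box_size ** 3

    # Wavenumber magnitude on the (real-input) FFT grid.
    kf = 2 * np.pi / box_size  # fundamental mode of the box
    kx = np.fft.fftfreq(n, d=1.0 / n) * kf
    kz = np.fft.rfftfreq(n, d=1.0 / n) * kf
    kmag = np.sqrt(kx[:, None, None] ** 2
                   + kx[None, :, None] ** 2
                   + kz[None, None, :] ** 2)

    # Average |delta_k|^2 in spherical shells of k, excluding the DC mode.
    k_edges = np.linspace(kf, kmag.max(), n_bins + 1)
    k_centers = 0.5 * (k_edges[1:] + k_edges[:-1])
    pk = np.empty(n_bins)
    for i in range(n_bins):
        shell = (kmag >= k_edges[i]) & (kmag < k_edges[i + 1])
        pk[i] = power[shell].mean() if shell.any() else 0.0
    return k_centers, pk
```

With this convention, a unit-variance white-noise field gives a flat spectrum P(k) ≈ (box_size / n)³, which makes a convenient sanity check before applying the estimator to simulation outputs.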
format | Online Article Text |
id | pubmed-10121177 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2022 |
publisher | Oxford University Press |
record_format | MEDLINE/PubMed |
spelling | pubmed-10121177 2023-04-22 Simple lessons from complex learning: what a neural network model learns about cosmic structure formation. Jamieson, Drew; Li, Yin; He, Siyu; Villaescusa-Navarro, Francisco; Ho, Shirley; de Oliveira, Renan Alves; Spergel, David N. PNAS Nexus, Physical Sciences and Engineering. Oxford University Press 2022-11-09 /pmc/articles/PMC10121177/ /pubmed/37091548 http://dx.doi.org/10.1093/pnasnexus/pgac250 Text en © The Author(s) 2022. Published by Oxford University Press on behalf of National Academy of Sciences. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted reuse, distribution, and reproduction in any medium, provided the original work is properly cited. |
spellingShingle | Physical Sciences and Engineering Jamieson, Drew Li, Yin He, Siyu Villaescusa-Navarro, Francisco Ho, Shirley de Oliveira, Renan Alves Spergel, David N Simple lessons from complex learning: what a neural network model learns about cosmic structure formation |
title | Simple lessons from complex learning: what a neural network model learns about cosmic structure formation |
title_full | Simple lessons from complex learning: what a neural network model learns about cosmic structure formation |
title_fullStr | Simple lessons from complex learning: what a neural network model learns about cosmic structure formation |
title_full_unstemmed | Simple lessons from complex learning: what a neural network model learns about cosmic structure formation |
title_short | Simple lessons from complex learning: what a neural network model learns about cosmic structure formation |
title_sort | simple lessons from complex learning: what a neural network model learns about cosmic structure formation |
topic | Physical Sciences and Engineering |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10121177/ https://www.ncbi.nlm.nih.gov/pubmed/37091548 http://dx.doi.org/10.1093/pnasnexus/pgac250 |
work_keys_str_mv | AT jamiesondrew simplelessonsfromcomplexlearningwhataneuralnetworkmodellearnsaboutcosmicstructureformation AT liyin simplelessonsfromcomplexlearningwhataneuralnetworkmodellearnsaboutcosmicstructureformation AT hesiyu simplelessonsfromcomplexlearningwhataneuralnetworkmodellearnsaboutcosmicstructureformation AT villaescusanavarrofrancisco simplelessonsfromcomplexlearningwhataneuralnetworkmodellearnsaboutcosmicstructureformation AT hoshirley simplelessonsfromcomplexlearningwhataneuralnetworkmodellearnsaboutcosmicstructureformation AT deoliveirarenanalves simplelessonsfromcomplexlearningwhataneuralnetworkmodellearnsaboutcosmicstructureformation AT spergeldavidn simplelessonsfromcomplexlearningwhataneuralnetworkmodellearnsaboutcosmicstructureformation |