
Extensive deep neural networks for transferring small scale learning to large scale systems

We present a physically-motivated topology of a deep neural network that can efficiently infer extensive parameters (such as energy, entropy, or number of particles) of arbitrarily large systems, doing so with O(N) scaling. We use a form of domain decomposition for training and inference, where each sub-domain (tile) comprises a non-overlapping focus region surrounded by an overlapping context region. The size of these regions is motivated by the physical interaction length scales of the problem. We demonstrate the application of extensive deep neural networks (EDNNs) to three physical systems: the Ising model and two hexagonal/graphene-like datasets. In the latter, an EDNN was able to make total energy predictions of a 60-atom system, with accuracy comparable to density functional theory (DFT), in 57 milliseconds. Additionally, EDNNs are well suited to massively parallel evaluation, as no communication is necessary during neural network evaluation. We demonstrate that EDNNs can be used to make an energy prediction of a two-dimensional 35.2 million atom system, over 1.0 μm² of material, at an accuracy comparable to DFT, in under 25 minutes. Such a system exists on a length scale visible with optical microscopy and larger than some living organisms.
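As an illustration of the tiling scheme described in the abstract, the following is a minimal sketch (not taken from the article; the lattice size, focus/context widths, and the stand-in per-tile model are assumptions for demonstration only) of decomposing a periodic 2D lattice into non-overlapping focus regions with overlapping context borders and summing per-tile predictions to obtain an extensive quantity:

    import numpy as np

    def tile_lattice(config, focus=8, context=4):
        """Split a periodic 2D lattice into tiles: a non-overlapping
        focus region plus an overlapping context border (illustrative sizes)."""
        L = config.shape[0]
        assert L % focus == 0, "lattice must divide evenly into focus regions"
        padded = np.pad(config, context, mode="wrap")  # periodic boundaries
        tiles = []
        for i in range(0, L, focus):
            for j in range(0, L, focus):
                tiles.append(padded[i:i + focus + 2 * context,
                                    j:j + focus + 2 * context])
        return np.array(tiles)

    def ednn_energy(config, per_tile_model, focus=8, context=4):
        """Extensive prediction: evaluate the same model on every tile
        (embarrassingly parallel) and sum the per-tile contributions."""
        tiles = tile_lattice(config, focus, context)
        return sum(per_tile_model(t) for t in tiles)

    # Toy stand-in for a trained network: nearest-neighbour Ising energy of the
    # focus region (included only to show the tiling/summation bookkeeping).
    def toy_tile_energy(tile, focus=8, context=4):
        f = tile[context:context + focus, context:context + focus]
        right = tile[context:context + focus, context + 1:context + focus + 1]
        down = tile[context + 1:context + focus + 1, context:context + focus]
        return -np.sum(f * right) - np.sum(f * down)

    spins = np.random.choice([-1, 1], size=(32, 32))
    print(ednn_energy(spins, toy_tile_energy))

Because each tile is evaluated independently, the per-tile calls can be distributed across processors with no communication during evaluation, which is what allows the prediction cost to scale linearly with system size.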

Bibliographic Details
Main Authors: Mills, Kyle; Ryczko, Kevin; Luchak, Iryna; Domurad, Adam; Beeler, Chris; Tamblyn, Isaac
Format: Online Article (Text)
Language: English
Published: Chem Sci (Royal Society of Chemistry), 20 March 2019
Subjects: Chemistry
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6460955/
https://www.ncbi.nlm.nih.gov/pubmed/31015950
http://dx.doi.org/10.1039/c8sc04578j
id: pubmed-6460955
collection: PubMed
institution: National Center for Biotechnology Information
record_format: MEDLINE/PubMed
license: This journal is © The Royal Society of Chemistry 2019. The article is freely available under a Creative Commons Attribution 3.0 Unported Licence (CC BY 3.0), http://creativecommons.org/licenses/by/3.0/