
Learning neural network potentials from experimental data via Differentiable Trajectory Reweighting

In molecular dynamics (MD), neural network (NN) potentials trained bottom-up on quantum mechanical data have seen tremendous success recently. Top-down approaches that learn NN potentials directly from experimental data have received less attention, typically facing numerical and computational challenges when backpropagating through MD simulations. We present the Differentiable Trajectory Reweighting (DiffTRe) method, which bypasses differentiation through the MD simulation for time-independent observables. Leveraging thermodynamic perturbation theory, we avoid exploding gradients and achieve around 2 orders of magnitude speed-up in gradient computation for top-down learning. We show effectiveness of DiffTRe in learning NN potentials for an atomistic model of diamond and a coarse-grained model of water based on diverse experimental observables including thermodynamic, structural and mechanical properties. Importantly, DiffTRe also generalizes bottom-up structural coarse-graining methods such as iterative Boltzmann inversion to arbitrary potentials. The presented method constitutes an important milestone towards enriching NN potentials with experimental data, particularly when accurate bottom-up data is unavailable.

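For orientation, the reweighting idea referenced in the abstract can be sketched with standard thermodynamic perturbation theory (the notation below is illustrative and not taken from the paper): a time-independent observable O under the current potential U_theta is estimated from N states r_i sampled with a reference potential U_theta-hat as

\langle O \rangle_{U_\theta} \approx \sum_{i=1}^{N} w_i \, O(\mathbf{r}_i),
\qquad
w_i = \frac{\exp\!\left(-\beta \, [\, U_\theta(\mathbf{r}_i) - U_{\hat{\theta}}(\mathbf{r}_i) \,]\right)}
           {\sum_{j=1}^{N} \exp\!\left(-\beta \, [\, U_\theta(\mathbf{r}_j) - U_{\hat{\theta}}(\mathbf{r}_j) \,]\right)},
\qquad
\beta = \frac{1}{k_B T}.

Gradients of such an estimate with respect to theta only require differentiating the potential energies, not the MD integrator, which is how backpropagation through long trajectories (and the associated exploding gradients) is avoided.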

Bibliographic Details
Main Authors: Thaler, Stephan, Zavadlav, Julija
Format: Online Article Text
Language: English
Published: Nature Publishing Group UK 2021
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8617111/
https://www.ncbi.nlm.nih.gov/pubmed/34824254
http://dx.doi.org/10.1038/s41467-021-27241-4
_version_ 1784604468481884160
author Thaler, Stephan
Zavadlav, Julija
author_facet Thaler, Stephan
Zavadlav, Julija
author_sort Thaler, Stephan
collection PubMed
description In molecular dynamics (MD), neural network (NN) potentials trained bottom-up on quantum mechanical data have seen tremendous success recently. Top-down approaches that learn NN potentials directly from experimental data have received less attention, typically facing numerical and computational challenges when backpropagating through MD simulations. We present the Differentiable Trajectory Reweighting (DiffTRe) method, which bypasses differentiation through the MD simulation for time-independent observables. Leveraging thermodynamic perturbation theory, we avoid exploding gradients and achieve around 2 orders of magnitude speed-up in gradient computation for top-down learning. We show effectiveness of DiffTRe in learning NN potentials for an atomistic model of diamond and a coarse-grained model of water based on diverse experimental observables including thermodynamic, structural and mechanical properties. Importantly, DiffTRe also generalizes bottom-up structural coarse-graining methods such as iterative Boltzmann inversion to arbitrary potentials. The presented method constitutes an important milestone towards enriching NN potentials with experimental data, particularly when accurate bottom-up data is unavailable.
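As a rough, hypothetical illustration of how a differentiable reweighting estimate of this kind could be coded (a minimal JAX sketch under the assumptions in the note above; energy_fn, observable_fn and all other names are placeholders, not the authors' DiffTRe implementation):

# Minimal sketch of differentiable ensemble reweighting (hypothetical names,
# not the authors' code).
import jax
import jax.numpy as jnp

def reweighted_observable(params, ref_params, positions, energy_fn, observable_fn, beta):
    """Estimate <O> under the potential with `params` from states sampled with `ref_params`."""
    u_new = jax.vmap(lambda r: energy_fn(params, r))(positions)      # energies under current potential
    u_ref = jax.vmap(lambda r: energy_fn(ref_params, r))(positions)  # energies under reference potential
    log_w = -beta * (u_new - u_ref)                                  # perturbation exponents
    weights = jax.nn.softmax(log_w)                                  # normalized Boltzmann reweighting factors
    obs = jax.vmap(observable_fn)(positions)                         # per-state observable values
    return jnp.sum(weights * obs)                                    # reweighted ensemble average

# Differentiate only the potential energy evaluations, never the MD integrator:
grad_fn = jax.grad(reweighted_observable)

Here jax.nn.softmax normalizes the reweighting factors in a numerically stable way, and jax.grad of the resulting scalar yields the gradient with respect to the potential parameters that a top-down optimizer could use.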
format Online
Article
Text
id pubmed-8617111
institution National Center for Biotechnology Information
language English
publishDate 2021
publisher Nature Publishing Group UK
record_format MEDLINE/PubMed
spelling pubmed-8617111 2021-12-10 Learning neural network potentials from experimental data via Differentiable Trajectory Reweighting Thaler, Stephan Zavadlav, Julija Nat Commun Article In molecular dynamics (MD), neural network (NN) potentials trained bottom-up on quantum mechanical data have seen tremendous success recently. Top-down approaches that learn NN potentials directly from experimental data have received less attention, typically facing numerical and computational challenges when backpropagating through MD simulations. We present the Differentiable Trajectory Reweighting (DiffTRe) method, which bypasses differentiation through the MD simulation for time-independent observables. Leveraging thermodynamic perturbation theory, we avoid exploding gradients and achieve around 2 orders of magnitude speed-up in gradient computation for top-down learning. We show effectiveness of DiffTRe in learning NN potentials for an atomistic model of diamond and a coarse-grained model of water based on diverse experimental observables including thermodynamic, structural and mechanical properties. Importantly, DiffTRe also generalizes bottom-up structural coarse-graining methods such as iterative Boltzmann inversion to arbitrary potentials. The presented method constitutes an important milestone towards enriching NN potentials with experimental data, particularly when accurate bottom-up data is unavailable. Nature Publishing Group UK 2021-11-25 /pmc/articles/PMC8617111/ /pubmed/34824254 http://dx.doi.org/10.1038/s41467-021-27241-4 Text en © The Author(s) 2021 https://creativecommons.org/licenses/by/4.0/ Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/ (https://creativecommons.org/licenses/by/4.0/) .
spellingShingle Article
Thaler, Stephan
Zavadlav, Julija
Learning neural network potentials from experimental data via Differentiable Trajectory Reweighting
title Learning neural network potentials from experimental data via Differentiable Trajectory Reweighting
title_full Learning neural network potentials from experimental data via Differentiable Trajectory Reweighting
title_fullStr Learning neural network potentials from experimental data via Differentiable Trajectory Reweighting
title_full_unstemmed Learning neural network potentials from experimental data via Differentiable Trajectory Reweighting
title_short Learning neural network potentials from experimental data via Differentiable Trajectory Reweighting
title_sort learning neural network potentials from experimental data via differentiable trajectory reweighting
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8617111/
https://www.ncbi.nlm.nih.gov/pubmed/34824254
http://dx.doi.org/10.1038/s41467-021-27241-4
work_keys_str_mv AT thalerstephan learningneuralnetworkpotentialsfromexperimentaldataviadifferentiabletrajectoryreweighting
AT zavadlavjulija learningneuralnetworkpotentialsfromexperimentaldataviadifferentiabletrajectoryreweighting