Path sampling of recurrent neural networks by incorporating known physics
Recurrent neural networks have seen widespread use in modeling dynamical systems in varied domains such as weather prediction, text prediction and several others. Often one wishes to supplement the experimentally observed dynamics with prior knowledge or intuition about the system. While the recurre...
Main Authors: | Tsai, Sun-Ting; Fields, Eric; Xu, Yijia; Kuo, En-Jui; Tiwary, Pratyush |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | Nature Publishing Group UK 2022 |
Subjects: | |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9700810/ https://www.ncbi.nlm.nih.gov/pubmed/36433982 http://dx.doi.org/10.1038/s41467-022-34780-x |
_version_ | 1784839393884766208 |
---|---|
author | Tsai, Sun-Ting; Fields, Eric; Xu, Yijia; Kuo, En-Jui; Tiwary, Pratyush |
author_facet | Tsai, Sun-Ting; Fields, Eric; Xu, Yijia; Kuo, En-Jui; Tiwary, Pratyush |
author_sort | Tsai, Sun-Ting |
collection | PubMed |
description | Recurrent neural networks have seen widespread use in modeling dynamical systems in varied domains such as weather prediction, text prediction and several others. Often one wishes to supplement the experimentally observed dynamics with prior knowledge or intuition about the system. While the recurrent nature of these networks allows them to model arbitrarily long memories in the time series used in training, it makes it harder to impose prior knowledge or intuition through generic constraints. In this work, we present a path sampling approach based on principle of Maximum Caliber that allows us to include generic thermodynamic or kinetic constraints into recurrent neural networks. We show the method here for a widely used type of recurrent neural network known as long short-term memory network in the context of supplementing time series collected from different application domains. These include classical Molecular Dynamics of a protein and Monte Carlo simulations of an open quantum system continuously losing photons to the environment and displaying Rabi oscillations. Our method can be easily generalized to other generative artificial intelligence models and to generic time series in different areas of physical and social sciences, where one wishes to supplement limited data with intuition or theory based corrections. |
format | Online Article Text |
id | pubmed-9700810 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2022 |
publisher | Nature Publishing Group UK |
record_format | MEDLINE/PubMed |
spelling | pubmed-97008102022-11-27 Path sampling of recurrent neural networks by incorporating known physics Tsai, Sun-Ting Fields, Eric Xu, Yijia Kuo, En-Jui Tiwary, Pratyush Nat Commun Article Recurrent neural networks have seen widespread use in modeling dynamical systems in varied domains such as weather prediction, text prediction and several others. Often one wishes to supplement the experimentally observed dynamics with prior knowledge or intuition about the system. While the recurrent nature of these networks allows them to model arbitrarily long memories in the time series used in training, it makes it harder to impose prior knowledge or intuition through generic constraints. In this work, we present a path sampling approach based on principle of Maximum Caliber that allows us to include generic thermodynamic or kinetic constraints into recurrent neural networks. We show the method here for a widely used type of recurrent neural network known as long short-term memory network in the context of supplementing time series collected from different application domains. These include classical Molecular Dynamics of a protein and Monte Carlo simulations of an open quantum system continuously losing photons to the environment and displaying Rabi oscillations. Our method can be easily generalized to other generative artificial intelligence models and to generic time series in different areas of physical and social sciences, where one wishes to supplement limited data with intuition or theory based corrections. Nature Publishing Group UK 2022-11-24 /pmc/articles/PMC9700810/ /pubmed/36433982 http://dx.doi.org/10.1038/s41467-022-34780-x Text en © The Author(s) 2022 https://creativecommons.org/licenses/by/4.0/Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/ (https://creativecommons.org/licenses/by/4.0/) . |
spellingShingle | Article Tsai, Sun-Ting Fields, Eric Xu, Yijia Kuo, En-Jui Tiwary, Pratyush Path sampling of recurrent neural networks by incorporating known physics |
title | Path sampling of recurrent neural networks by incorporating known physics |
title_full | Path sampling of recurrent neural networks by incorporating known physics |
title_fullStr | Path sampling of recurrent neural networks by incorporating known physics |
title_full_unstemmed | Path sampling of recurrent neural networks by incorporating known physics |
title_short | Path sampling of recurrent neural networks by incorporating known physics |
title_sort | path sampling of recurrent neural networks by incorporating known physics |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9700810/ https://www.ncbi.nlm.nih.gov/pubmed/36433982 http://dx.doi.org/10.1038/s41467-022-34780-x |
work_keys_str_mv | AT tsaisunting pathsamplingofrecurrentneuralnetworksbyincorporatingknownphysics AT fieldseric pathsamplingofrecurrentneuralnetworksbyincorporatingknownphysics AT xuyijia pathsamplingofrecurrentneuralnetworksbyincorporatingknownphysics AT kuoenjui pathsamplingofrecurrentneuralnetworksbyincorporatingknownphysics AT tiwarypratyush pathsamplingofrecurrentneuralnetworksbyincorporatingknownphysics |
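The abstract above mentions constraining recurrent-network path sampling via the principle of Maximum Caliber, i.e. finding the least-biased path distribution consistent with an imposed constraint on a path-averaged observable. The sketch below is only a generic illustration of that reweighting idea, not the authors' implementation (which is not part of this record): it tilts sampled trajectories by exp(λ·s) and solves for the Lagrange multiplier λ so that the weighted average of an illustrative observable hits a target value. The function name, the toy observable, and the bisection solver are all assumptions made for this example.

```python
import numpy as np


def maxcal_reweight(paths, observable, target, lam_bounds=(-50.0, 50.0), tol=1e-8):
    """Maximum Caliber style reweighting of sampled paths (illustrative only).

    Given sample paths and a scalar path observable s(path), find the
    Lagrange multiplier lam such that weights w_i ∝ exp(lam * s_i) satisfy
    the constraint <s>_w = target. This is a generic sketch, not the
    procedure used in the paper described by this record.
    """
    s = np.array([observable(p) for p in paths], dtype=float)

    def weighted_mean(lam):
        # Subtract the max before exponentiating for numerical stability.
        logw = lam * s
        logw -= logw.max()
        w = np.exp(logw)
        return float(np.sum(w * s) / np.sum(w))

    lo, hi = lam_bounds
    if not (min(weighted_mean(lo), weighted_mean(hi))
            <= target
            <= max(weighted_mean(lo), weighted_mean(hi))):
        raise ValueError("target not reachable within lam_bounds for these samples")

    # The tilted mean is monotonically increasing in lam, so bisection suffices.
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if weighted_mean(mid) < target:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)

    logw = lam * s
    logw -= logw.max()
    w = np.exp(logw)
    return lam, w / w.sum()


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy "trajectories": random walks standing in for generated time series.
    paths = rng.normal(size=(500, 100)).cumsum(axis=1)

    def mean_endpoint(p):
        return p[-1]

    lam, weights = maxcal_reweight(paths, mean_endpoint, target=2.0)
    print(lam, float(np.sum(weights * paths[:, -1])))  # weighted mean ≈ 2.0
```

In practice the resulting weights would be used to bias which generated trajectories are kept or how they contribute to training statistics; the details of how such constraints enter the LSTM training itself are given in the full article linked above.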