
Entanglement-Structured LSTM Boosts Chaotic Time Series Forecasting


Bibliographic Details
Main Authors: Meng, Xiangyi; Yang, Tong
Format: Online Article Text
Language: English
Published: MDPI 2021
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8626053/
https://www.ncbi.nlm.nih.gov/pubmed/34828189
http://dx.doi.org/10.3390/e23111491
_version_ 1784606572338479104
author Meng, Xiangyi
Yang, Tong
author_facet Meng, Xiangyi
Yang, Tong
author_sort Meng, Xiangyi
collection PubMed
description Traditional machine-learning methods are inefficient at capturing chaos in nonlinear dynamical systems, especially when the time difference Δt between consecutive steps is so large that the extracted time series appears random. Here, we introduce a new long short-term memory (LSTM)-based recurrent architecture that tensorizes the cell-state-to-state propagation, maintaining the long-term memory feature of LSTM while simultaneously enhancing the learning of short-term nonlinear complexity. We stress that the global minima of training can be reached most efficiently by our tensor structure, in which all nonlinear terms, up to some polynomial order, are treated explicitly and weighted equally. The efficiency and generality of our architecture are systematically investigated and tested through theoretical analysis and experimental examination. In our design, we explicitly use two different many-body entanglement structures, matrix product states (MPS) and the multiscale entanglement renormalization ansatz (MERA), as physics-inspired tensor decomposition techniques. We find that MERA generally performs better than MPS and hence conjecture that the learnability of chaos is determined not only by the number of free parameters but also by the tensor complexity, understood as how the entanglement entropy scales under different matricizations of the tensor. (See the illustrative sketch after the record fields below.)
format Online
Article
Text
id pubmed-8626053
institution National Center for Biotechnology Information
language English
publishDate 2021
publisher MDPI
record_format MEDLINE/PubMed
spelling pubmed-8626053 2021-11-27 Entanglement-Structured LSTM Boosts Chaotic Time Series Forecasting. Meng, Xiangyi; Yang, Tong. Entropy (Basel), Article. MDPI 2021-11-11 /pmc/articles/PMC8626053/ /pubmed/34828189 http://dx.doi.org/10.3390/e23111491 Text en © 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
spellingShingle Article
Meng, Xiangyi
Yang, Tong
Entanglement-Structured LSTM Boosts Chaotic Time Series Forecasting
title Entanglement-Structured LSTM Boosts Chaotic Time Series Forecasting
title_full Entanglement-Structured LSTM Boosts Chaotic Time Series Forecasting
title_fullStr Entanglement-Structured LSTM Boosts Chaotic Time Series Forecasting
title_full_unstemmed Entanglement-Structured LSTM Boosts Chaotic Time Series Forecasting
title_short Entanglement-Structured LSTM Boosts Chaotic Time Series Forecasting
title_sort entanglement-structured lstm boosts chaotic time series forecasting
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8626053/
https://www.ncbi.nlm.nih.gov/pubmed/34828189
http://dx.doi.org/10.3390/e23111491
work_keys_str_mv AT mengxiangyi entanglementstructuredlstmboostschaotictimeseriesforecasting
AT yangtong entanglementstructuredlstmboostschaotictimeseriesforecasting
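
To make the tensorized propagation described in the abstract concrete, below is a minimal NumPy sketch of the general idea: replace the linear cell-state-to-state map of an LSTM with a degree-p polynomial map whose weight tensor is factored as a matrix product state (MPS), so that all nonlinear terms up to order p enter the model explicitly. The function name, shapes, bond rank, and the trick of appending a constant 1 to generate the lower-order terms are illustrative assumptions, not the authors' actual implementation (which also studies a MERA factorization).

# Minimal sketch (NumPy only): an order-p polynomial state update whose weight
# tensor is stored as a matrix product state (MPS). Names and shapes are
# illustrative assumptions, not the authors' implementation.
import numpy as np

def mps_polynomial_map(v, cores):
    """Contract the rank-1 feature tensor v (x) v (x) ... (x) v (p copies)
    with an MPS-factored weight tensor.

    v     : 1-D array of length d, e.g. the concatenation [h_{t-1}, x_t, 1];
            the appended 1 makes every term of order below p appear as well.
    cores : list of p arrays; cores[k] has shape (r_k, d, r_{k+1}),
            with r_0 = 1 and r_p = m (the output dimension).
    """
    msg = np.ones(1)                      # left boundary vector, shape (r_0,)
    for G in cores:
        M = np.einsum('d,adb->ab', v, G)  # contract the physical leg with v
        msg = msg @ M                     # absorb this core: shape (r_{k+1},)
    return msg                            # shape (m,): pre-activation update

# Toy usage: d = input dim, m = output dim, p = polynomial order, r = bond rank.
rng = np.random.default_rng(0)
d, m, p, r = 8, 16, 3, 4
ranks = [1] + [r] * (p - 1) + [m]
cores = [0.1 * rng.standard_normal((ranks[k], d, ranks[k + 1])) for k in range(p)]
v = np.append(rng.standard_normal(d - 1), 1.0)    # [features..., 1]
update = np.tanh(mps_polynomial_map(v, cores))    # candidate cell-state update, shape (m,)

At a fixed bond rank, the parameter count of the chain grows only linearly with the polynomial order p, which is what makes treating all nonlinear terms explicitly tractable; replacing the chain of cores with a layered, MERA-like network changes how the representable entanglement entropy scales, which is the "tensor complexity" the abstract conjectures also governs the learnability of chaos.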