
A Temporal Window Attention-Based Window-Dependent Long Short-Term Memory Network for Multivariate Time Series Prediction

Bibliographic Details
Main Authors: Han, Shuang; Dong, Hongbin
Format: Online Article (Text)
Language: English
Published: MDPI, 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9858386/
https://www.ncbi.nlm.nih.gov/pubmed/36673150
http://dx.doi.org/10.3390/e25010010
Description
Summary: Multivariate time series prediction models operate on a fixed-length window of a given input. However, capturing the complex and nonlinear interdependencies within each temporal window remains challenging. Typical attention mechanisms assign weights to the variables at a single time step, or to the features of each previous time step, to capture spatio-temporal correlations; however, they fail to directly extract, at every time step, the relevant features that affect future values, and so cannot learn the spatio-temporal pattern from a global perspective. To this end, a temporal window attention-based window-dependent long short-term memory network (TWA-WDLSTM), built on the encoder-decoder framework, is proposed to enhance the temporal dependencies. In the encoder, we design a temporal window attention mechanism to select the relevant exogenous series within a temporal window. Furthermore, we introduce a window-dependent long short-term memory network (WDLSTM) to encode the input sequences of a temporal window into a feature representation and to capture very long-term dependencies. In the decoder, we use WDLSTM to generate the prediction values. We applied our model to four real-world datasets and compared it with a variety of state-of-the-art models. The experimental results suggest that TWA-WDLSTM outperforms the comparison models. In addition, the temporal window attention mechanism has good interpretability: we can observe which variables contribute to the future value.
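
For orientation only, the minimal PyTorch sketch below shows one plausible reading of the architecture the abstract describes: an attention module that weights every (time step, variable) entry of an input window, conditioned on the whole window at once, feeding an encoder-decoder pair. All module names, dimensions, and the attention scoring form are assumptions made for illustration, not the authors' code; in particular, a standard LSTM stands in for the paper's WDLSTM, whose exact gating is not reproduced here.

import torch
import torch.nn as nn


class TemporalWindowAttention(nn.Module):
    """Assigns a weight to every (time step, variable) entry of an input
    window, scored from the whole window (a global view), so relevant
    exogenous series can be selected rather than weighted step by step."""

    def __init__(self, window_len: int, n_vars: int, hidden: int = 64):
        super().__init__()
        # Score each of the window_len * n_vars entries from the flattened window.
        self.score = nn.Sequential(
            nn.Linear(window_len * n_vars, hidden),
            nn.Tanh(),
            nn.Linear(hidden, window_len * n_vars),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, window_len, n_vars)
        b = x.size(0)
        scores = self.score(x.reshape(b, -1))             # (batch, T * n_vars)
        alpha = torch.softmax(scores, dim=-1).reshape_as(x)
        return alpha * x                                  # re-weighted window


class EncoderDecoder(nn.Module):
    """Skeleton pipeline: attention-weighted window -> LSTM encoder
    (stand-in for WDLSTM) -> LSTM decoder -> one-step-ahead prediction."""

    def __init__(self, window_len: int, n_vars: int, hidden: int = 64):
        super().__init__()
        self.attn = TemporalWindowAttention(window_len, n_vars, hidden)
        self.encoder = nn.LSTM(n_vars, hidden, batch_first=True)
        self.decoder = nn.LSTM(hidden, hidden, batch_first=True)
        self.out = nn.Linear(hidden, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        weighted = self.attn(x)                           # select relevant series
        enc_out, state = self.encoder(weighted)           # encode the window
        dec_out, _ = self.decoder(enc_out[:, -1:, :], state)
        return self.out(dec_out[:, -1])                   # predicted future value


# Usage: a batch of 8 windows, each 10 time steps of 5 exogenous variables.
model = EncoderDecoder(window_len=10, n_vars=5)
y_hat = model(torch.randn(8, 10, 5))                      # shape (8, 1)

Because the attention weights alpha live on the same (time step, variable) grid as the input window, inspecting them is what gives the interpretability claimed above: large weights mark which variable, at which step, contributed to the predicted value.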