
DRCNN: decomposing residual convolutional neural networks for time series forecasting

Recent studies have shown strong performance by Transformer-based models in long-term time series forecasting, owing to their ability to capture long-term dependencies. However, Transformers are limited when trained on small datasets because they lack the inductive bias needed for time series forecasting, and they show no significant benefit in short-horizon forecasting, as they do in long-horizon forecasting, because the continuity of the sequence is not exploited. In this paper, efficient designs in Transformers are reviewed and a decomposing residual convolutional neural network (DRCNN) is proposed. DRCNN exploits the continuity of the data by decomposing it into residual and trend terms, which are processed by a purpose-built convolution block (DR-Block). DR-Block extracts features effectively by following the structural design of Transformers. In addition, by imitating the multi-head mechanism of Transformers, a Multi-head Sequence method is proposed that enables the network to receive longer inputs and produce more accurate forecasts. State-of-the-art performance of the presented model is demonstrated on several datasets.
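The record does not include the paper's implementation, but the trend/residual decomposition it names can be sketched minimally as a moving-average split: the trend term is a smoothed copy of the series and the residual is what remains. The window size and edge-padding scheme below are illustrative assumptions, not details taken from the paper.

```python
def decompose(series, window=5):
    """Split a series into a trend term (moving average) and a residual term.

    Illustrative sketch only: window size and edge padding are assumptions,
    not the paper's actual design.
    """
    n = len(series)
    half = window // 2
    # Pad the edges with the boundary values so the trend has length n.
    padded = [series[0]] * half + list(series) + [series[-1]] * half
    # Trend: simple moving average over each window.
    trend = [sum(padded[i:i + window]) / window for i in range(n)]
    # Residual: whatever the trend does not explain.
    residual = [x - t for x, t in zip(series, trend)]
    return trend, residual

trend, residual = decompose([1.0, 2.0, 3.0, 4.0, 5.0], window=3)
# By construction, trend[i] + residual[i] reconstructs series[i] exactly.
```

In the abstract's framing, each term would then be processed by its own convolution path (the DR-Block) rather than recombined directly as here.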


Bibliographic Details
Main Authors: Zhu, Yuzhen, Luo, Shaojie, Huang, Di, Zheng, Weiyan, Su, Fang, Hou, Beiping
Format: Online Article Text
Language: English
Published: Nature Publishing Group UK, 2023
Subjects: Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10517921/
https://www.ncbi.nlm.nih.gov/pubmed/37741848
http://dx.doi.org/10.1038/s41598-023-42815-6
collection PubMed
id pubmed-10517921
institution National Center for Biotechnology Information
record_format MEDLINE/PubMed
Journal: Sci Rep
Published online: 2023-09-23
License: © The Author(s) 2023. This article is licensed under a Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0/), which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as appropriate credit is given to the original author(s) and the source, a link to the licence is provided, and any changes are indicated. Images or other third-party material in this article are included in the article's Creative Commons licence unless indicated otherwise in a credit line to the material; if material is not included in the licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder.