
Energy Load Forecasting Using a Dual-Stage Attention-Based Recurrent Neural Network

Providing a stable, low-price, and safe supply of energy to end-users is a challenging task. Energy service providers are affected by several factors, such as weather, market volatility, and special events; predicting these factors, and thereby gaining a time window for preventive measures, is therefore crucial for service providers. Electrical load forecasting can be modeled as a time series prediction problem. One solution is to capture the spatial correlations, spatial-temporal relations, and time dependency of such temporal networks in the time series. Different machine learning methods have previously been used for time series prediction tasks; however, there is still a need for new research to improve the performance of short-term load forecasting models. In this article, we propose a novel deep learning model that predicts electric load consumption using a Dual-Stage Attention-Based Recurrent Neural Network, in which an attention mechanism is used in both the encoder and decoder stages. The encoder attention layer identifies important features in the input vector, whereas the decoder attention layer overcomes the limitations of a fixed context vector and provides much longer memory capacity. The proposed model improves short-term load forecasting (STLF) performance in terms of Mean Absolute Error (MAE) and Root Mean Squared Error (RMSE). To evaluate the predictive performance of the proposed model, the UCI household electric power consumption (HEPC) dataset was used in the experiments. Experimental results demonstrate that the proposed approach outperforms previously adopted techniques.


Bibliographic Details
Main authors: Ozcan, Alper, Catal, Cagatay, Kasif, Ahmet
Format: Online Article Text
Language: English
Published: MDPI 2021
Subjects:
Online access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8587894/
https://www.ncbi.nlm.nih.gov/pubmed/34770422
http://dx.doi.org/10.3390/s21217115
_version_ 1784598286716370944
author Ozcan, Alper
Catal, Cagatay
Kasif, Ahmet
author_facet Ozcan, Alper
Catal, Cagatay
Kasif, Ahmet
author_sort Ozcan, Alper
collection PubMed
description Providing a stable, low-price, and safe supply of energy to end-users is a challenging task. Energy service providers are affected by several factors, such as weather, market volatility, and special events; predicting these factors, and thereby gaining a time window for preventive measures, is therefore crucial for service providers. Electrical load forecasting can be modeled as a time series prediction problem. One solution is to capture the spatial correlations, spatial-temporal relations, and time dependency of such temporal networks in the time series. Different machine learning methods have previously been used for time series prediction tasks; however, there is still a need for new research to improve the performance of short-term load forecasting models. In this article, we propose a novel deep learning model that predicts electric load consumption using a Dual-Stage Attention-Based Recurrent Neural Network, in which an attention mechanism is used in both the encoder and decoder stages. The encoder attention layer identifies important features in the input vector, whereas the decoder attention layer overcomes the limitations of a fixed context vector and provides much longer memory capacity. The proposed model improves short-term load forecasting (STLF) performance in terms of Mean Absolute Error (MAE) and Root Mean Squared Error (RMSE). To evaluate the predictive performance of the proposed model, the UCI household electric power consumption (HEPC) dataset was used in the experiments. Experimental results demonstrate that the proposed approach outperforms previously adopted techniques.
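The two attention stages described in the abstract can be sketched roughly as follows. This is a toy NumPy illustration under stated assumptions, not the authors' implementation: all weights (`W_e`, `W_h`, `W_x`, `v`, `W_o`) are random stand-ins for parameters that would be learned, and the simple tanh recurrence stands in for the LSTM/GRU units typically used in such models. Stage 1 (input attention) re-weights the driving input series at each time step; Stage 2 (temporal attention) lets the decoder combine all encoder hidden states instead of relying on a single fixed context vector.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(0)
T, n_feat, hidden = 24, 4, 8        # 24 time steps, 4 driving series

X = rng.normal(size=(T, n_feat))    # input time series (e.g. load drivers)

# --- Stage 1: input (encoder) attention over the driving series ---
W_e = rng.normal(size=(n_feat,))    # toy relevance scorer (learned in practice)
alpha = softmax(X * W_e, axis=1)    # (T, n_feat): per-step feature weights
x_tilde = alpha * X                 # attention-re-weighted encoder input

# --- Encoder: simple recurrent pass over the re-weighted input ---
W_h = rng.normal(size=(hidden, hidden)) * 0.1
W_x = rng.normal(size=(n_feat, hidden)) * 0.1
h = np.zeros(hidden)
H = []
for t in range(T):
    h = np.tanh(h @ W_h + x_tilde[t] @ W_x)
    H.append(h)
H = np.stack(H)                     # (T, hidden): all encoder hidden states

# --- Stage 2: temporal (decoder) attention over encoder states ---
v = rng.normal(size=(hidden,))      # toy temporal scorer
beta = softmax(H @ v)               # (T,): weight for each time step
context = beta @ H                  # context vector, not just the last state

W_o = rng.normal(size=(hidden,))
y_hat = context @ W_o               # one-step-ahead load forecast
```

Because `beta` spans every encoder state, the decoder can draw on distant time steps directly, which is the sense in which the second attention stage provides a longer effective memory than a single fixed context vector.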
format Online
Article
Text
id pubmed-8587894
institution National Center for Biotechnology Information
language English
publishDate 2021
publisher MDPI
record_format MEDLINE/PubMed
spelling pubmed-85878942021-11-13 Energy Load Forecasting Using a Dual-Stage Attention-Based Recurrent Neural Network Ozcan, Alper Catal, Cagatay Kasif, Ahmet Sensors (Basel) Article Providing a stable, low-price, and safe supply of energy to end-users is a challenging task. Energy service providers are affected by several factors, such as weather, market volatility, and special events; predicting these factors, and thereby gaining a time window for preventive measures, is therefore crucial for service providers. Electrical load forecasting can be modeled as a time series prediction problem. One solution is to capture the spatial correlations, spatial-temporal relations, and time dependency of such temporal networks in the time series. Different machine learning methods have previously been used for time series prediction tasks; however, there is still a need for new research to improve the performance of short-term load forecasting models. In this article, we propose a novel deep learning model that predicts electric load consumption using a Dual-Stage Attention-Based Recurrent Neural Network, in which an attention mechanism is used in both the encoder and decoder stages. The encoder attention layer identifies important features in the input vector, whereas the decoder attention layer overcomes the limitations of a fixed context vector and provides much longer memory capacity. The proposed model improves short-term load forecasting (STLF) performance in terms of Mean Absolute Error (MAE) and Root Mean Squared Error (RMSE). To evaluate the predictive performance of the proposed model, the UCI household electric power consumption (HEPC) dataset was used in the experiments. Experimental results demonstrate that the proposed approach outperforms previously adopted techniques. MDPI 2021-10-27 /pmc/articles/PMC8587894/ /pubmed/34770422 http://dx.doi.org/10.3390/s21217115 Text en © 2021 by the authors. 
https://creativecommons.org/licenses/by/4.0/Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
spellingShingle Article
Ozcan, Alper
Catal, Cagatay
Kasif, Ahmet
Energy Load Forecasting Using a Dual-Stage Attention-Based Recurrent Neural Network
title Energy Load Forecasting Using a Dual-Stage Attention-Based Recurrent Neural Network
title_full Energy Load Forecasting Using a Dual-Stage Attention-Based Recurrent Neural Network
title_fullStr Energy Load Forecasting Using a Dual-Stage Attention-Based Recurrent Neural Network
title_full_unstemmed Energy Load Forecasting Using a Dual-Stage Attention-Based Recurrent Neural Network
title_short Energy Load Forecasting Using a Dual-Stage Attention-Based Recurrent Neural Network
title_sort energy load forecasting using a dual-stage attention-based recurrent neural network
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8587894/
https://www.ncbi.nlm.nih.gov/pubmed/34770422
http://dx.doi.org/10.3390/s21217115
work_keys_str_mv AT ozcanalper energyloadforecastingusingadualstageattentionbasedrecurrentneuralnetwork
AT catalcagatay energyloadforecastingusingadualstageattentionbasedrecurrentneuralnetwork
AT kasifahmet energyloadforecastingusingadualstageattentionbasedrecurrentneuralnetwork