
Energy Management of Smart Home with Home Appliances, Energy Storage System and Electric Vehicle: A Hierarchical Deep Reinforcement Learning Approach

This paper presents a hierarchical deep reinforcement learning (DRL) method for scheduling the energy consumption of smart home appliances and distributed energy resources (DERs), including an energy storage system (ESS) and an electric vehicle (EV). In contrast to Q-learning algorithms based on a discrete action space, the novelty of the proposed approach is that the energy consumption of home appliances and DERs is scheduled in a continuous action space using an actor-critic-based DRL method. To this end, a two-level DRL framework is proposed in which home appliances are scheduled at the first level according to the consumer's preferred appliance schedule and comfort level, while the charging and discharging schedules of the ESS and EV are calculated at the second level using the optimal solution from the first level along with the consumer's environmental characteristics. A simulation study is performed on a single home with an air conditioner, a washing machine, a rooftop solar photovoltaic system, an ESS, and an EV under time-of-use pricing. Numerical examples under different weather conditions, weekday/weekend schedules, and EV driving patterns confirm the effectiveness of the proposed approach in terms of total electricity cost, state of energy of the ESS and EV, and consumer preference.

Bibliographic Details
Main Authors: Lee, Sangyoon; Choi, Dae-Hyun
Format: Online Article; Text
Language: English
Published: MDPI, 2020
Subjects: Article
Online Access:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7180495/
https://www.ncbi.nlm.nih.gov/pubmed/32290345
http://dx.doi.org/10.3390/s20072157
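The two-level structure described in the abstract can be illustrated with a toy, non-learning sketch. This is not the authors' actor-critic DRL algorithm; the rule-based policies, function names, and all numeric values below are hypothetical stand-ins that only mirror the hierarchy: level 1 picks continuous appliance power levels under a cost/comfort trade-off, and level 2 takes that schedule as input when choosing continuous ESS charge/discharge setpoints.

```python
# Toy sketch of a two-level scheduling hierarchy (illustrative only; the paper
# trains an actor-critic DRL agent rather than using these fixed rules).

def level1_schedule_appliances(tou_price, comfort_weight=0.5):
    """Level 1: choose a continuous power level per hour, trading off
    electricity cost against a stylized comfort term (hypothetical model)."""
    schedule = []
    for price in tou_price:
        # Higher price -> lower consumption; comfort_weight keeps a floor.
        power = max(0.0, 1.0 - (1.0 - comfort_weight) * price)
        schedule.append(power)
    return schedule

def level2_schedule_storage(appliance_load, tou_price, capacity=5.0):
    """Level 2: given the level-1 schedule, pick continuous charge (+) /
    discharge (-) setpoints for an ESS within its energy capacity."""
    soe = 0.0  # state of energy (kWh)
    actions = []
    mean_price = sum(tou_price) / len(tou_price)
    for load, price in zip(appliance_load, tou_price):
        if price < mean_price:            # cheap hour: charge up to 1 kWh
            action = min(1.0, capacity - soe)
        else:                             # expensive hour: discharge to cover load
            action = -min(load, soe)
        soe += action
        actions.append(action)
    return actions

tou = [0.1, 0.1, 0.3, 0.5, 0.5, 0.2]      # hypothetical TOU prices ($/kWh)
loads = level1_schedule_appliances(tou)    # level 1 output feeds level 2
ess = level2_schedule_storage(loads, tou)
```

The key structural point mirrored here is the information flow: level 2 only sees the level-1 solution and the environment (prices), matching the paper's description of the second level consuming the first level's optimal schedule.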
Journal: Sensors (Basel)
Published online: 2020-04-10
License: © 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).