Abstract

This paper presents a hierarchical deep reinforcement learning (DRL) method for scheduling the energy consumption of smart home appliances and distributed energy resources (DERs), including an energy storage system (ESS) and an electric vehicle (EV). Compared to Q-learning algorithms based on a discrete action space, the novelty of the proposed approach is that the energy consumption of home appliances and DERs is scheduled in a continuous action space using an actor–critic-based DRL method. To this end, a two-level DRL framework is proposed in which home appliances are scheduled at the first level according to the consumer’s preferred appliance scheduling and comfort level, while the charging and discharging schedules of the ESS and EV are calculated at the second level using the optimal solution from the first level along with the consumer’s environmental characteristics. A simulation study is performed on a single home with an air conditioner, a washing machine, a rooftop solar photovoltaic system, an ESS, and an EV under time-of-use pricing. Numerical examples under different weather conditions, weekday/weekend settings, and EV driving patterns confirm the effectiveness of the proposed approach in terms of the total cost of electricity, the state of energy of the ESS and EV, and consumer preference.
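
The two-level structure summarized above lends itself to a short sketch. The code below is our illustration rather than the authors' implementation: the ContinuousAgent class, state variables, tariff, and power ratings are all assumed for the example, and a trained actor–critic network would replace the placeholder action sampling.

    # Minimal sketch (hypothetical names, not the paper's code) of the
    # two-level scheduling loop: level 1 schedules the controllable
    # appliances (AC, WM); level 2 schedules ESS/EV charging/discharging
    # using the level-1 result as part of its state.
    import numpy as np

    class ContinuousAgent:
        """Stand-in for an actor-critic agent with continuous actions."""
        def __init__(self, low, high, rng):
            self.low, self.high = np.asarray(low), np.asarray(high)
            self.rng = rng

        def act(self, state):
            # A trained actor would map state -> action here; we sample
            # uniformly within the action bounds as a placeholder.
            return self.rng.uniform(self.low, self.high)

    rng = np.random.default_rng(0)
    level1 = ContinuousAgent([0.0, 0.0], [3.0, 0.5], rng)    # [AC kW, WM kW]
    level2 = ContinuousAgent([-2.0, -3.3], [2.0, 3.3], rng)  # [ESS kW, EV kW]; +charge/-discharge

    tou_price = np.array([0.10] * 17 + [0.25] * 7)  # assumed $/kWh time-of-use tariff
    total_cost = 0.0
    for hour in range(24):
        s1 = np.array([hour, tou_price[hour]])      # level-1 state (simplified)
        appliance_kw = level1.act(s1)               # first level: AC and WM
        s2 = np.concatenate(([hour, tou_price[hour]], appliance_kw))
        der_kw = level2.act(s2)                     # second level: ESS and EV
        net_kw = appliance_kw.sum() + der_kw.sum()  # net household demand
        total_cost += max(net_kw, 0.0) * tou_price[hour]
    print(f"episode electricity cost: ${total_cost:.2f}")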

Highlights

  • Thirty percent of the United States’ total energy consumption comes from the residential sector, and residential energy consumption is expected to grow owing to the increased use of home appliances (e.g., air conditioners (ACs) and washing machines (WMs)) and modern electronic devices [1]

  • These studies address the scheduling of energy consumption for home appliances and distributed energy resources (DERs) while maintaining the consumer’s comfort level using mixed-integer nonlinear programming (MINLP) [2], load scheduling using mixed-integer linear programming (MILP) for single and multiple households [3,4], robust optimization for scheduling home appliances under uncertain consumer behavior [5], and distributed home energy management system (HEMS) architectures consisting of local and global HEMSs [6]

  • Compared to existing methods using Q-learning in a discrete action space, we propose a hierarchical deep reinforcement learning (DRL) method in a continuous action space with the following two scheduling steps (sketched below): (i) the controllable appliances, including the WM and AC, are scheduled at the first level according to the consumer’s preferred appliance scheduling and comfort level; (ii) the energy storage system (ESS) and electric vehicle (EV) are scheduled at the second level, resulting in an optimal electricity cost for the household
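
To make the contrast with discrete-action Q-learning concrete, the sketch below (our illustration, not the paper's code) shows a deterministic actor whose tanh output layer yields a continuous power set-point, next to a Q-learning agent that must pick from a fixed grid of power levels; the network sizes, state features, and the 3.3 kW rating are assumed.

    # Illustration (not the paper's code): continuous vs. discrete action
    # selection for, e.g., EV charging power.
    import torch
    import torch.nn as nn

    class Actor(nn.Module):
        """Deterministic actor: maps a state to a continuous power set-point."""
        def __init__(self, state_dim, max_power_kw):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(state_dim, 64), nn.ReLU(),
                nn.Linear(64, 1), nn.Tanh(),   # output in (-1, 1)
            )
            self.max_power_kw = max_power_kw

        def forward(self, state):
            # Scale the tanh output to the physical range
            # [-max_power_kw, +max_power_kw] (+charge / -discharge).
            return self.net(state) * self.max_power_kw

    state = torch.tensor([[14.0, 0.25, 0.6]])  # e.g., [hour, price, state of energy]
    actor = Actor(state_dim=3, max_power_kw=3.3)
    print("continuous action (kW):", actor(state).item())

    # A discrete-action Q-learning agent, by contrast, picks from a fixed grid:
    power_grid_kw = torch.linspace(-3.3, 3.3, steps=7)  # 7 allowed power levels
    q_values = torch.randn(7)                           # stand-in for Q(s, a)
    print("discrete action (kW):", power_grid_kw[q_values.argmax()].item())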

Introduction

Thirty percent of the United States’ total energy consumption comes from the residential sector, and residential energy consumption is expected to grow owing to the increased use of home appliances (e.g., air conditioners (ACs) and washing machines (WMs)) and modern electronic devices [1]. A primary goal of a home energy management system (HEMS) is to reduce consumers’ electricity bills while satisfying their comfort and preferences. To achieve this goal, HEMSs perform the following two functions: (1) real-time monitoring of consumers’ energy usage using smart meters; (2) scheduling of the optimal energy consumption of home appliances. To implement this second function, an HEMS algorithm is generally formulated as a model-based optimization problem. A shiftable appliance has two types of load: (i) a non-interruptible load and (ii) an interruptible load.
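
As a concrete illustration of the model-based formulation and load types above, the toy sketch below (our example; the tariff, WM cycle profile, and scheduling window are assumed) finds the cheapest start time for a non-interruptible shiftable load under time-of-use pricing; an interruptible load could instead split its run hours across the window.

    # Toy example (not from the paper): schedule a non-interruptible
    # shiftable load (e.g., a 3 h WM cycle) under an assumed time-of-use
    # tariff by exhaustively searching feasible start times.
    tou_price = [0.10] * 17 + [0.25] * 7    # $/kWh for hours 0..23 (assumed)
    cycle_kw = [0.5, 2.0, 0.3]              # hourly consumption of the WM cycle
    earliest, latest_end = 8, 22            # consumer-preferred window

    best_start, best_cost = None, float("inf")
    for start in range(earliest, latest_end - len(cycle_kw) + 1):
        # Non-interruptible: once started, the cycle runs back-to-back.
        cost = sum(kw * tou_price[start + h] for h, kw in enumerate(cycle_kw))
        if cost < best_cost:
            best_start, best_cost = start, cost

    print(f"cheapest start hour: {best_start}, cost: ${best_cost:.2f}")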
