Abstract

In this paper, we study multi-residential energy scheduling that minimizes total energy cost while satisfying energy demands, where each residence has an energy management system (EMS) that is equipped with energy storage and renewable energy sources and is connected to on-grid energy sources with time-of-use (TOU) and demand charge (DC) tariffs. To this end, we first develop a novel TOU- and DC-aware energy scheduling (TDAS) algorithm based on deep reinforcement learning (RL). It learns a policy for a single EMS that optimally determines the amount of on-grid energy consumption, considering the stored energy, system uncertainties, and both tariffs, without requiring any a priori information about the uncertainties. Building on the TDAS algorithm, we develop a cooperative multi-residential TDAS (Co-TDAS) algorithm using federated RL, in which the EMSs cooperatively learn a central policy that can be used by any EMS in diverse environments and utilize it in a distributed manner. Through simulations with real datasets, we demonstrate that our TDAS algorithm achieves competitive performance against state-of-the-art baselines that are given perfect a priori information, and outperforms them even when that information contains small errors. We also show that the Co-TDAS algorithm learns a policy that can be used by different EMSs, even newly arriving ones, and accelerates learning via cooperative learning.
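To make the cost structure concrete, the following is a minimal sketch of how a combined TOU and DC bill is typically computed for a billing period: the energy charge sums the TOU price times on-grid consumption in each time slot, while the demand charge is a rate applied to the peak on-grid consumption over the period. The function name, argument names, and the example prices are illustrative assumptions, not the paper's notation.

```python
def grid_energy_cost(grid_energy, tou_prices, dc_rate):
    """Illustrative TOU + demand-charge bill for one billing period.

    grid_energy : per-slot on-grid energy consumption (kWh)
    tou_prices  : per-slot time-of-use prices ($/kWh)
    dc_rate     : demand-charge rate applied to the peak slot ($/kW)
    (Slot length is assumed to be 1 hour, so kWh per slot equals kW.)
    """
    # Energy charge: price-weighted sum over all time slots.
    energy_charge = sum(p * e for p, e in zip(tou_prices, grid_energy))
    # Demand charge: proportional to the single highest-consumption slot.
    demand_charge = dc_rate * max(grid_energy)
    return energy_charge + demand_charge

# Example with made-up numbers: three slots, peak in slot 1.
cost = grid_energy_cost(
    grid_energy=[2.0, 3.0, 1.0],
    tou_prices=[0.10, 0.20, 0.10],
    dc_rate=5.0,
)
```

Because the demand charge depends on the peak rather than the total, a scheduler that only tracks TOU prices can still incur a large bill; this coupling is what makes the joint TOU/DC-aware policy in the abstract non-trivial.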
