Abstract

With the growing popularity of electric vehicles (EVs), residential microgrid systems face a new challenge: scheduling EV charging to meet charging demands while maximizing profit. In this work, a safe reinforcement learning (RL)-based charging scheduling strategy is proposed to address this challenge. We construct a complete microgrid system equipped with a large charging station and consider different types of EVs, as well as the vehicle-to-grid (V2G) mode and the nonlinear charging characteristics of EVs. The charging scheduling problem is then formulated as a constrained Markov decision process (CMDP) owing to the various power and demand constraints. To effectively capture the uncertainties on the supply and demand sides of the microgrid, a model-free RL framework is employed. However, as the number of EVs grows, the action space inevitably suffers from the curse of dimensionality. To address this, a charging and discharging strategy based on a general ladder electricity pricing scheme is designed. Under this strategy, EVs are partitioned into different sets according to their states, and the agent issues control signals for each set instead of controlling every EV individually, which effectively reduces the dimension of the action space. A constrained soft actor-critic (CSAC) algorithm is then designed to solve the established CMDP, and a safety filter is introduced to guarantee safety. Finally, a numerical case study is conducted to verify the effectiveness of the proposed method.
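
To make the set-based action reduction concrete, the following minimal Python sketch illustrates the general idea of partitioning EVs by their state under a tiered (ladder) pricing scheme and issuing one control signal per set rather than per EV. This is not the paper's implementation; the partition rule, thresholds, and charger rating below are hypothetical assumptions for illustration only.

```python
# Illustrative sketch (not the authors' code) of set-based action reduction:
# EVs are grouped by state under a ladder pricing tier, and the agent outputs
# one charge/discharge signal per set instead of one per EV.
# All names, thresholds, and ratings are hypothetical.

from dataclasses import dataclass
from typing import Dict, List


@dataclass
class EV:
    soc: float          # current state of charge in [0, 1]
    target_soc: float   # required state of charge at departure
    hours_left: float   # hours until scheduled departure
    v2g_capable: bool   # whether the EV supports vehicle-to-grid discharge


def assign_set(ev: EV, price_tier: int) -> str:
    """Assign an EV to a set based on urgency, V2G capability, and the price tier."""
    # Hypothetical rule: EVs that cannot reach their target without charging now are urgent.
    max_rate = 0.2  # assumed max SoC gain per hour
    needed_hours = max(ev.target_soc - ev.soc, 0.0) / max_rate
    if needed_hours >= ev.hours_left:
        return "urgent_charge"
    # In a high price tier, V2G-capable EVs with surplus energy may discharge.
    if price_tier >= 2 and ev.v2g_capable and ev.soc > ev.target_soc:
        return "discharge_candidate"
    return "flexible"


def apply_set_actions(evs: List[EV], price_tier: int,
                      set_actions: Dict[str, float]) -> List[float]:
    """Map one agent action per set (in [-1, 1]) to a per-EV power command in kW."""
    rated_power_kw = 7.0  # hypothetical charger rating
    commands = []
    for ev in evs:
        set_name = assign_set(ev, price_tier)
        commands.append(set_actions.get(set_name, 0.0) * rated_power_kw)
    return commands


# Example: three sets -> the agent's action vector has three entries,
# regardless of how many EVs are connected.
evs = [EV(0.3, 0.8, 2.0, True), EV(0.9, 0.6, 6.0, True), EV(0.5, 0.7, 8.0, False)]
print(apply_set_actions(evs, price_tier=2,
                        set_actions={"urgent_charge": 1.0,
                                     "flexible": 0.3,
                                     "discharge_candidate": -0.5}))
```

Under this kind of grouping, the action dimension scales with the number of sets rather than the number of EVs, which is the property the abstract relies on to keep the CSAC policy tractable as the fleet grows.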
