Abstract

As the penetration of electric vehicles (EVs) increases, orderly charging–discharging (C–D) strategies can effectively mitigate the impact of large-scale EV grid integration. However, scheduling the C–D of EV clusters suffers from the curse of dimensionality, and the performance of C–D control strategies is challenged by environmental uncertainty in user demand and electricity prices. This paper proposes a deep-reinforcement-learning-based EV cluster scheduling strategy that accounts for real-time electricity prices. First, we establish a distributed real-time optimal scheduling structure driven by the real-time price signals of the distribution system operator (DSO). Second, to alleviate the curse of dimensionality, we build a C–D model of a single EV from its C–D characteristics and formulate the EV C–D control problem as a Markov decision process (MDP). Finally, to adapt to the uncertainty of the learning environment, we propose a model-based deep reinforcement learning method to optimize the C–D behavior of EVs. After day-ahead training, the model's parameters are saved and used to generate a C–D scheduling strategy from the real-time system state at each moment of the day. Simulation results for cost-oriented EV charging show that the proposed strategy reduces user charging costs by 133.7 dollars, narrows the load peak–valley difference, smooths load fluctuations, and achieves a win–win outcome for the power grid and EV users.
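To make the MDP formulation concrete, the Python sketch below shows one plausible way to set up a single-EV C–D environment: the state is (time, state of charge, current price), the action is a continuous charge/discharge power, and the reward is the negative electricity cost plus a penalty for missing the departure state-of-charge target. All parameters (battery capacity, charger limit, efficiency, penalty weight) and the class name `SingleEVChargingEnv` are illustrative assumptions, not values from the paper.

```python
import numpy as np

class SingleEVChargingEnv:
    """One-EV charging-discharging MDP sketch over 24 hourly steps.

    State: (normalized hour, state of charge, current price).
    Action: charging power in kW; negative values discharge to the grid.
    Reward: negative electricity cost, plus a penalty if the departure
    state-of-charge target is missed. All parameters are assumptions.
    """

    def __init__(self, prices, capacity_kwh=60.0, p_max_kw=7.0,
                 soc_init=0.3, soc_target=0.9, eta=0.95):
        self.prices = np.asarray(prices, dtype=float)  # $/kWh per hour
        self.capacity = capacity_kwh
        self.p_max = p_max_kw
        self.soc_init = soc_init
        self.soc_target = soc_target
        self.eta = eta  # charge/discharge efficiency
        self.reset()

    def reset(self):
        self.t = 0
        self.soc = self.soc_init
        return self._state()

    def _state(self):
        return np.array([self.t / len(self.prices), self.soc,
                         self.prices[self.t]])

    def step(self, power_kw):
        # Clip the action to the charger's power limits.
        p = float(np.clip(power_kw, -self.p_max, self.p_max))
        # Over one hour, p kW moves p kWh; efficiency losses shrink what is
        # stored when charging and enlarge the drain when discharging.
        delta = (p * self.eta / self.capacity if p >= 0
                 else p / (self.eta * self.capacity))
        self.soc = float(np.clip(self.soc + delta, 0.0, 1.0))

        reward = -self.prices[self.t] * p  # pay to charge, earn to discharge
        self.t += 1
        done = self.t >= len(self.prices)
        if done and self.soc < self.soc_target:
            reward -= 10.0 * (self.soc_target - self.soc)  # departure penalty
        return (None if done else self._state()), reward, done


# Example: greedy rule that charges in cheap hours, discharges in dear ones.
prices = 0.10 + 0.08 * np.sin(np.linspace(0, 2 * np.pi, 24))
env = SingleEVChargingEnv(prices)
state, done, total = env.reset(), False, 0.0
while not done:
    action = env.p_max if state[2] < prices.mean() else -env.p_max
    state, reward, done = env.step(action)
    total += reward
print(f"episode reward (negative cost): {total:.2f}")
```

An agent, whether the paper's model-based DRL method or any off-the-shelf RL algorithm, would interact with such an environment through `reset()` and `step()`; the greedy price-threshold rule above merely stands in for a learned policy.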
