Abstract

The rapid growth in the number of electric vehicles (EVs) has significantly increased residential electricity demand, which may overload distribution-network transformers in the absence of reasonable power-consumption planning. To coordinate EV charging with photovoltaic (PV) energy connected to the power grid, a collaborative charging control strategy is proposed based on a double deep Q-network with prioritized experience replay (DDQN-PER). Specifically, a Long Short-Term Memory (LSTM) neural network is used to capture the uncertainties in power demand, PV output, and real-time electricity price. A modified DDQN approach with a corresponding reward function is then applied to solve the cooperative charging problem. In addition, a PER mechanism based on TD-error sampling alleviates the reward-sparsity problem in the learning scenario and improves training stability and efficiency. The proposed approach achieves collaborative scheduling of EVs, which helps reduce the charging cost, promote the consumption of PV energy, and avoid transformer overload. Simulation case studies based on real data demonstrate that the proposed algorithm reduces the charging cost by 30% and increases PV power utilization by 10% compared with the DDQN algorithm.
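The abstract's PER mechanism replays transitions in proportion to their TD error rather than uniformly. A minimal sketch of such a buffer is shown below; the class name, hyperparameters (`alpha`, `eps`), and ring-buffer design are illustrative assumptions, not details taken from the paper.

```python
import random

class PrioritizedReplayBuffer:
    """Minimal prioritized experience replay (PER) buffer.

    Transitions are sampled with probability proportional to
    |TD error| ** alpha, so transitions with large TD error are
    replayed more often. Hyperparameter values are illustrative.
    """

    def __init__(self, capacity, alpha=0.6, eps=1e-6):
        self.capacity = capacity
        self.alpha = alpha      # how strongly TD error skews sampling
        self.eps = eps          # keeps every priority strictly positive
        self.buffer = []        # stored (state, action, reward, next_state) tuples
        self.priorities = []    # one priority per stored transition
        self.pos = 0            # ring-buffer write index

    def add(self, transition):
        # New transitions receive the current maximum priority so they
        # are sampled at least once before their TD error is known.
        max_prio = max(self.priorities, default=1.0)
        if len(self.buffer) < self.capacity:
            self.buffer.append(transition)
            self.priorities.append(max_prio)
        else:
            self.buffer[self.pos] = transition
            self.priorities[self.pos] = max_prio
        self.pos = (self.pos + 1) % self.capacity

    def sample(self, batch_size):
        # Sampling probability is proportional to priority ** alpha.
        weights = [p ** self.alpha for p in self.priorities]
        idxs = random.choices(range(len(self.buffer)),
                              weights=weights, k=batch_size)
        return idxs, [self.buffer[i] for i in idxs]

    def update_priorities(self, idxs, td_errors):
        # After a learning step, refresh priorities from the new TD errors.
        for i, err in zip(idxs, td_errors):
            self.priorities[i] = abs(err) + self.eps
```

In a DDQN training loop, the agent would call `sample` to draw a minibatch, compute TD errors with the online and target networks, and feed those errors back through `update_priorities`.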
