Abstract

Reducing carbon emissions is a crucial way to achieve the goal of green and sustainable development. To this end, electric vehicles (EVs) can be treated as system‐schedulable energy storage devices, suppressing the negative impact of the randomness and fluctuation of renewable energy on system operation. In this paper, a coordination control strategy among EVs, energy storage devices, and static var compensators (SVCs) is proposed to minimise the carbon emissions of a distribution network. A model‐free deep reinforcement learning (DRL) approach is developed to learn the optimal control strategy under the constraint of avoiding system overload caused by random EV access. The twin‐delayed deep deterministic policy gradient (TD3) framework is applied to design the learning method. Once training is complete, the neural network can quickly generate a real‐time low‐carbon scheduling strategy according to the current system operating conditions. Finally, simulations on the IEEE 33‐bus system verify the effectiveness and robustness of the method. While meeting the charging demand of EVs, the method optimises system operation by controlling the charge‐discharge process of EVs, effectively absorbing the renewable energy in the system and reducing the carbon emissions of system operation.
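The abstract names the TD3 framework as the learner. As a minimal sketch of TD3's three hallmarks (clipped double‐Q with twin critics, target policy smoothing, and delayed actor updates), the toy loop below uses linear function approximators and a hypothetical negative‐cost reward in place of the paper's neural networks and carbon‐emission objective; all dimensions and hyperparameters are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

STATE_DIM, ACTION_DIM = 4, 1       # hypothetical: e.g. load, renewable output, SOC, price
GAMMA, TAU = 0.99, 0.005           # discount factor and Polyak averaging rate
POLICY_NOISE, NOISE_CLIP = 0.2, 0.5
POLICY_DELAY = 2                   # actor updated every POLICY_DELAY critic updates

# Linear models stand in for the actor and twin critic networks.
actor = rng.normal(scale=0.1, size=(STATE_DIM, ACTION_DIM))
critic1 = rng.normal(scale=0.1, size=(STATE_DIM + ACTION_DIM,))
critic2 = rng.normal(scale=0.1, size=(STATE_DIM + ACTION_DIM,))
actor_t, critic1_t, critic2_t = actor.copy(), critic1.copy(), critic2.copy()

def q(w, s, a):
    """Q-value of a linear critic for state s and action a."""
    return np.concatenate([s, a]) @ w

def td3_update(step, s, a, r, s2, lr=1e-3):
    global actor, critic1, critic2, actor_t, critic1_t, critic2_t
    # Target policy smoothing: clipped noise added to the target action.
    noise = np.clip(rng.normal(scale=POLICY_NOISE, size=ACTION_DIM),
                    -NOISE_CLIP, NOISE_CLIP)
    a2 = np.clip(s2 @ actor_t + noise, -1.0, 1.0)
    # Clipped double-Q: bootstrap from the minimum of the twin target critics.
    y = r + GAMMA * min(q(critic1_t, s2, a2), q(critic2_t, s2, a2))
    x = np.concatenate([s, a])
    critic1 += lr * (y - q(critic1, s, a)) * x
    critic2 += lr * (y - q(critic2, s, a)) * x
    # Delayed policy update: move the actor only every POLICY_DELAY steps.
    if step % POLICY_DELAY == 0:
        grad_a = critic1[STATE_DIM:]           # dQ1/da for a linear critic
        actor += lr * np.outer(s, grad_a)
        # Soft (Polyak) update of all target networks.
        actor_t = TAU * actor + (1 - TAU) * actor_t
        critic1_t = TAU * critic1 + (1 - TAU) * critic1_t
        critic2_t = TAU * critic2 + (1 - TAU) * critic2_t

for step in range(200):
    s = rng.normal(size=STATE_DIM)
    a = np.clip(s @ actor + rng.normal(scale=0.1, size=ACTION_DIM), -1.0, 1.0)
    r = -abs(float(a.sum()))                   # hypothetical stand-in cost signal
    s2 = rng.normal(size=STATE_DIM)
    td3_update(step, s, a, r, s2)
```

After training offline in this fashion, only the actor is needed online, which matches the abstract's claim that the trained network can generate a scheduling action directly from the observed system state.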

