Abstract

With the continuous integration of intermittent renewable energy and large-scale regional interconnection, the power system has evolved into a high-dimensional, complex nonlinear system. In such a system, greater intelligence and flexibility are required for operation, control, and decision-making. This paper therefore proposes a novel deep reinforcement learning (DRL) framework for automatic operation control (AOC) of power systems under extreme weather events. To reduce the destructive impact of extreme weather events, topology switching control must be employed in addition to generator redispatching and load shedding, which poses a major challenge to the optimization of the DRL algorithm. To tackle this problem, a novel action-space reduction method that exploits domain knowledge, historical data, and heuristic constraints is first proposed. Then, imitation learning (IL) is introduced to pre-train the DRL agent and to build a feasible topology database for extreme weather events. Finally, a Soft Actor-Critic (SAC) algorithm with improved exploration and policy-update strategies is developed to train the agent. Numerical studies on a modified 36-bus system indicate that the proposed model and method have good convergence characteristics and solving efficiency.

