Abstract
Distributed energy resources (DERs), such as solar PV and energy storage, can be used to restore critical loads in distribution systems after extreme weather events, thereby increasing grid resilience. However, coordinating multiple DERs together with tie switches in a multi-step restoration process under renewable uncertainty is challenging. This paper proposes a deep reinforcement learning (DRL) approach to control the discrete on/off switching actions of tie switches and DERs for critical load restoration. The restoration problem is first cast as a Markov decision process suitable for DRL. Then, the original soft actor-critic (SAC) method, designed for continuous actions, is extended to handle both discrete and continuous actions. Numerical comparisons with other stochastic optimization-based approaches on the modified IEEE 33-bus system show that the proposed method achieves fast critical load restoration during a substation power outage while maintaining system voltage limits throughout the restoration process.
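The paper's full discrete-action SAC formulation is developed in the body text; as a minimal sketch of the key idea, the standard discrete SAC replaces the Gaussian policy with a categorical one, so the soft state value becomes an exact sum over actions rather than a sampled estimate. The function below illustrates this with plain NumPy (the action set, Q-values, and `alpha` are illustrative assumptions, not values from the paper):

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over action logits
    z = x - x.max()
    e = np.exp(z)
    return e / e.sum()

def discrete_soft_value(q_values, logits, alpha=0.2):
    """Soft state value V(s) for a categorical (discrete-action) SAC policy.

    With pi(a|s) = softmax(logits), the soft value is computed exactly:
        V(s) = sum_a pi(a|s) * (Q(s, a) - alpha * log pi(a|s)),
    i.e. the expected Q-value plus alpha times the policy entropy.
    """
    pi = softmax(logits)
    log_pi = np.log(pi)
    return float(np.sum(pi * (q_values - alpha * log_pi)))

# Hypothetical example: 3 discrete actions (e.g. tie-switch open/close choices)
q = np.array([1.0, 0.5, -0.2])      # critic's Q-estimates per action
logits = np.array([2.0, 1.0, 0.1])  # actor's unnormalized preferences
v = discrete_soft_value(q, logits, alpha=0.2)
```

With `alpha = 0` the expression reduces to the plain expected Q-value; a positive `alpha` adds an entropy bonus that encourages exploration over the discrete switching actions.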