The growing frequency of extreme weather events has increased the need for resilient emergency response in Power Distribution Systems (PDSs). This paper proposes a novel restoration scheme that forms Micro-Grids (MGs) and dispatches Mobile Emergency Generators (MEGs) among them. Owing to PDS variability and the extensive decision space, conventional model-based approaches are not suitable for implementing the proposed scheme; therefore, a model-free framework is designed to control the restoration process. The problem is formulated as a Markov Decision Process (MDP) that accounts for uncertainties in load profiles and renewable resources. Deep Q-Network (DQN), a fundamental algorithm in Deep Reinforcement Learning (DRL), is employed to solve the proposed MDP model. The DQN agent learns an optimal control strategy by interacting with a simulated environment during the training phase, and the learned policy can then be deployed in real-world applications. The main objective of the designed DQN model is to maximize the restored loads and ensure a high level of post-restoration reliability for Critical Loads (CLs). MATLAB serves as the simulated environment for performing power flow calculations and MEG routing. The efficiency and applicability of the proposed method are evaluated through experiments on the modified IEEE 13-bus, 37-bus, and 123-bus networks.
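The abstract's core learning step, the DQN value update, can be sketched as a minimal Bellman-target computation. This is an illustrative sketch only: the function name, reward scale, and discrete action set (e.g., MEG routing choices) are assumptions, since the paper's actual state and action encodings are not given in the abstract.

```python
def dqn_target(reward, next_q_values, done, gamma=0.99):
    """Standard DQN temporal-difference target:
    y = r + gamma * max_a' Q(s', a'), with no bootstrap at episode end.

    reward        -- scalar reward (e.g., restored load served this step)
    next_q_values -- Q-value estimates for each discrete action in s'
    done          -- True if s' is terminal (restoration episode finished)
    """
    if done:
        return reward
    return reward + gamma * max(next_q_values)

# Hypothetical example: a restored-load reward of 5.0 and Q-values for
# three candidate actions in the next state.
y = dqn_target(5.0, [1.0, 3.0, 2.0], done=False, gamma=0.9)
# y = 5.0 + 0.9 * 3.0 = 7.7
```

In a full DQN implementation, this target would be produced by a separate target network and the online network would be regressed toward it over mini-batches sampled from a replay buffer; the sketch isolates only the update rule itself.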