Reliability and cost-effectiveness in the operation of a multiple microgrid (MMG) system depend on the skillful management of its energy resources. Traditional energy management approaches are physics-model-based and rely on precise system parameters (e.g., line parameters) of the electricity and heat networks. Such parameters are difficult to establish in practice because they depend on a variety of factors. In this context, this paper proposes a physics-model-free control framework for the energy management of MMGs with coupled heat and electricity, consisting of a proposed surrogate model and a multi-agent deep reinforcement learning (MADRL) approach. An important step is to use historical data to train a surrogate model in a supervised manner so that it imitates realistic power and thermal flow calculations. Meanwhile, the energy management problem is reformulated as a Markov game and solved by the proposed MADRL-based approach, in which each MG controller is modeled as an agent with its own objective. A historical-trajectory representation, a parameter-space technique, and a deep dense architecture are introduced into the MADRL approach to mitigate the negative impact of time-series input states on the decision-making process and to construct an efficient exploration mechanism that overcomes the otherwise inefficient optimization of the MMG system in a multi-agent setting. During MADRL training, the trained surrogate models are integrated into the environment, so that an optimal energy management strategy is developed through continuous interaction with the surrogate models. The proposed surrogate-model-enabled MADRL approach reduces reliance on precise physical models and prevents the trial-and-error training process from affecting the real system. Simulation results demonstrate the effectiveness of the proposed control framework.
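As a rough illustration of the surrogate-in-the-loop idea described above (a minimal sketch, not the authors' implementation: the class and function names, network sizes, dimensions, and the reward proxy are all illustrative assumptions), the following PyTorch snippet fits a surrogate flow model on historical data and then queries the frozen surrogate in place of the physical network during multi-agent training:

```python
# Hypothetical sketch: a surrogate network is fit on historical
# (state + dispatch -> power/thermal flow) pairs, then embedded as the
# environment model so MADRL agents can train without touching the real grid.
import torch
import torch.nn as nn

class SurrogateFlowModel(nn.Module):
    """Imitates power and thermal flow calculations from state and dispatch inputs."""
    def __init__(self, n_in, n_out, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_in, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, n_out),
        )

    def forward(self, x):
        return self.net(x)

def fit_surrogate(model, X, Y, epochs=200, lr=1e-3):
    """Supervised training on historical trajectories (X: inputs, Y: measured flows)."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(X), Y)
        loss.backward()
        opt.step()
    return model

# Toy setup: 3 microgrid agents, 8 shared state features, 6 flow outputs.
n_state, n_act, n_flow, n_mg = 8, 2, 6, 3
X = torch.randn(1024, n_state + n_mg * n_act)  # historical state + dispatch records
Y = torch.randn(1024, n_flow)                  # corresponding measured flow outcomes
surrogate = fit_surrogate(SurrogateFlowModel(X.shape[1], n_flow), X, Y)

# During MADRL training, each agent's action is evaluated through the frozen
# surrogate rather than the real electricity/heat network.
with torch.no_grad():
    state = torch.randn(1, n_state)
    actions = [torch.tanh(torch.randn(1, n_act)) for _ in range(n_mg)]  # one per MG agent
    flows = surrogate(torch.cat([state, *actions], dim=1))
    reward = -flows.abs().sum()  # placeholder reward: penalize a flow-magnitude proxy
```

In this sketch the surrogate plays the role of the MADRL environment's transition and evaluation model, so exploratory (trial-and-error) actions never reach the physical system.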