The Integrated Energy Cluster (IEC), a regional aggregation of integrated energy systems (IES), has accumulated abundant dispatchable resources with the development of the energy market. While this significantly increases the system's Demand Response (DR) potential, it also complicates the interaction between the IEC and the main grid and increases the difficulty of system scheduling. To address this issue, this paper proposes a reinforcement-learning-driven multi-agent hierarchical regulation framework that makes full use of DR to maximize the benefits of both the IEC and the main grid. First, in the context of the DR market, a mechanism for IECs to bid in the real-time DR market is proposed. An "IEC-main grid" hierarchical regulation model accounting for DR is then established to minimize IEC operation cost and maximize societal benefit. Furthermore, an optimization algorithm combining Deep Deterministic Policy Gradient (DDPG) with a Multi-Process (MP) and Prioritized Experience Replay (PER) mechanism is proposed to adapt to high-dimensional, large-scale applications. In the case study, the proposed model and algorithm are tested on an 8-node system and a 24-node system. The results indicate that the hierarchical regulation model considering DR improves system economy by 2.59% compared with the model without DR, and the improved DDPG algorithm enhances training effectiveness compared with standard DDPG and PER-DDPG.
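The abstract names a DDPG variant equipped with Prioritized Experience Replay. As a purely illustrative sketch (not the paper's implementation, whose details are not given here), a minimal proportional PER buffer of the kind commonly paired with DDPG might look as follows; all class and parameter names are assumptions:

```python
import random

class PrioritizedReplayBuffer:
    """Minimal proportional prioritized experience replay (illustrative only)."""

    def __init__(self, capacity, alpha=0.6):
        self.capacity = capacity
        self.alpha = alpha          # how strongly TD-error priorities bias sampling
        self.buffer = []            # stored transitions (s, a, r, s')
        self.priorities = []        # one priority per stored transition
        self.pos = 0                # ring-buffer write position

    def add(self, transition, td_error=1.0):
        # New transitions get priority |delta|^alpha so surprising samples recur more often.
        p = (abs(td_error) + 1e-6) ** self.alpha
        if len(self.buffer) < self.capacity:
            self.buffer.append(transition)
            self.priorities.append(p)
        else:
            self.buffer[self.pos] = transition
            self.priorities[self.pos] = p
        self.pos = (self.pos + 1) % self.capacity

    def sample(self, batch_size, beta=0.4):
        # Sample indices proportionally to priority.
        total = sum(self.priorities)
        probs = [p / total for p in self.priorities]
        idxs = random.choices(range(len(self.buffer)), weights=probs, k=batch_size)
        # Importance-sampling weights correct the bias of non-uniform sampling.
        n = len(self.buffer)
        weights = [(n * probs[i]) ** (-beta) for i in idxs]
        max_w = max(weights)
        weights = [w / max_w for w in weights]  # normalize for stable updates
        return [self.buffer[i] for i in idxs], idxs, weights

    def update_priorities(self, idxs, td_errors):
        # After a learning step, refresh priorities with the new TD errors.
        for i, e in zip(idxs, td_errors):
            self.priorities[i] = (abs(e) + 1e-6) ** self.alpha
```

In a PER-DDPG loop, the critic's TD errors for each sampled batch would be fed back via `update_priorities`, and the returned importance weights would scale the critic loss.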