Abstract

The Integrated Energy Cluster (IEC), a regional aggregation of integrated energy systems (IES), has accumulated plentiful dispatchable resources with the development of the energy market. While this significantly increases the system's Demand Response (DR) potential, it also complicates the interaction between the IEC and the main grid and raises the difficulty of system scheduling. To address this issue, this paper proposes a reinforcement-learning-driven multi-agent hierarchical regulation framework that makes full use of DR to maximize the benefits of both the IEC and the main grid. First, in the context of the DR market, a mechanism for IECs to bid in the real-time DR market is proposed. An "IEC-main grid" hierarchical regulation model accounting for DR is then established to minimize IEC operation cost and maximize societal benefit. Moreover, an optimization algorithm combining Deep Deterministic Policy Gradient (DDPG) with a Multi-Process (MP) and Prioritized Experience Replay (PER) mechanism is proposed to handle high-dimensional, large-scale applications. In the case study, the proposed model and algorithm are tested on an 8-node system and a 24-node system. The results indicate that the hierarchical regulation model considering DR improves system economy by 2.59% compared with the case without DR, and that the improved DDPG algorithm enhances training effectiveness compared with standard DDPG and PER-DDPG.
