Abstract

Hybrid electricity-heat-hydrogen energy systems with demand response (DR) are promising for enhancing flexibility and energy efficiency. However, multi-energy coupling and source-load uncertainties make it challenging to efficiently schedule the energy flows of electricity generation, storage, and DR. To this end, this paper proposes a continuous deep reinforcement learning algorithm, specifically the deep deterministic policy gradient (DDPG), for energy management optimization. Distinct Markov decision processes are first employed to analyze and compare two kinds of incentive-based electro-thermal DR contracts, i.e., load curtailment and load shifting. Simulation results exemplify the superiority of the proposed DDPG-based scheduling incorporating electro-thermal DR in terms of economy and sustainability, yielding a 16.02% reduction in scheduling costs under the load-curtailment contract and an 8.52% reduction under the load-shifting contract, compared with scheduling without DR. Furthermore, the robustness of DDPG-based scheduling is verified under 60 random source-load scenarios against different algorithms. Compared with the results obtained by DDPG, DDPG-LC and DDPG-LS reduce the mean cost by 22.15% and 12.84%, respectively, while their errors from the theoretical optimum are only around 5%. The approximate optimality and rapid decision-making demonstrate DDPG's efficient real-time scheduling capability, thereby enhancing the system's adaptability to uncertain environments.
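
For readers unfamiliar with how such a scheduling problem maps onto DDPG, the sketch below shows the generic actor-critic training loop on a toy hourly scheduling MDP. It is a minimal illustration, not the authors' implementation: the state/action dimensions, the placeholder transition function `step`, the quadratic stand-in cost, and all hyperparameters are assumptions made for demonstration only.

```python
# Minimal DDPG sketch (illustrative, not the paper's implementation).
# State/action dimensions, dynamics, and cost below are hypothetical placeholders.
import random
from collections import deque

import numpy as np
import torch
import torch.nn as nn

STATE_DIM, ACTION_DIM = 6, 3       # e.g. [loads, renewables, storage levels], [gen, storage, DR] (assumed)
GAMMA, TAU, LR = 0.99, 0.005, 1e-3

def mlp(in_dim, out_dim, out_act=None):
    layers = [nn.Linear(in_dim, 64), nn.ReLU(), nn.Linear(64, 64), nn.ReLU(), nn.Linear(64, out_dim)]
    if out_act is not None:
        layers.append(out_act)
    return nn.Sequential(*layers)

actor = mlp(STATE_DIM, ACTION_DIM, nn.Tanh())      # deterministic policy mu(s) in [-1, 1]^A
critic = mlp(STATE_DIM + ACTION_DIM, 1)            # action-value Q(s, a)
actor_t = mlp(STATE_DIM, ACTION_DIM, nn.Tanh())    # target networks for stable bootstrapping
critic_t = mlp(STATE_DIM + ACTION_DIM, 1)
actor_t.load_state_dict(actor.state_dict())
critic_t.load_state_dict(critic.state_dict())
opt_a = torch.optim.Adam(actor.parameters(), lr=LR)
opt_c = torch.optim.Adam(critic.parameters(), lr=LR)
buffer = deque(maxlen=100_000)                     # experience replay

def step(state, action):
    """Placeholder transition: random next state, reward = negative quadratic cost."""
    next_state = np.random.uniform(-1, 1, STATE_DIM)
    reward = -float(np.sum(action ** 2))           # stands in for operating + DR compensation cost
    return next_state, reward

def soft_update(target, source):
    for t, s in zip(target.parameters(), source.parameters()):
        t.data.mul_(1 - TAU).add_(TAU * s.data)

for episode in range(5):                           # tiny run, just to show the loop structure
    s = np.random.uniform(-1, 1, STATE_DIM)
    for hour in range(24):                         # 24 hourly scheduling intervals per episode
        with torch.no_grad():
            a = actor(torch.as_tensor(s, dtype=torch.float32)).numpy()
        a = np.clip(a + 0.1 * np.random.randn(ACTION_DIM), -1, 1)   # exploration noise
        s2, r = step(s, a)
        buffer.append((s, a, r, s2))
        s = s2
        if len(buffer) < 64:
            continue
        batch = random.sample(buffer, 64)
        bs, ba, br, bs2 = (torch.as_tensor(np.array(x), dtype=torch.float32) for x in zip(*batch))
        with torch.no_grad():                      # TD target from target networks
            q_target = br.unsqueeze(1) + GAMMA * critic_t(torch.cat([bs2, actor_t(bs2)], dim=1))
        critic_loss = nn.functional.mse_loss(critic(torch.cat([bs, ba], dim=1)), q_target)
        opt_c.zero_grad(); critic_loss.backward(); opt_c.step()
        actor_loss = -critic(torch.cat([bs, actor(bs)], dim=1)).mean()   # deterministic policy gradient
        opt_a.zero_grad(); actor_loss.backward(); opt_a.step()
        soft_update(actor_t, actor); soft_update(critic_t, critic)
```

The replay buffer, target networks, and soft updates shown here are the standard DDPG ingredients that enable stable off-policy learning with continuous scheduling actions.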
