Abstract

As an essential development direction of the energy internet, the integrated energy system, which draws on interdisciplinary techniques, is of great significance for promoting multi-energy cooperation, realizing low-carbon economic operation, and improving flexible scheduling potential. This paper studies an integrated power, heat, and natural-gas system consisting of energy coupling units and wind power generation interconnected via a power grid. A deep reinforcement learning-based energy scheduling strategy is proposed to optimize multiple targets, including minimizing operational costs and ensuring power supply reliability. By scheduling the output of the energy units, the economy and reliability of the considered system are improved. Taking diversified uncertainties into account, such as the intermittency of wind power and the flexibility of load demand, the stochastic dynamic optimization problem is modeled as a Markov decision process, and a soft actor-critic (SAC) algorithm is introduced to solve the complex scheduling problem. The optimized decision-making action can be identified by the SAC algorithm through empirical learning, without prediction information or prior knowledge. In the simulation, the proposed SAC-based agent shows robust performance in solving optimization problems across different scenarios. In addition, a comparison study is carried out against benchmark reinforcement learning and heuristic algorithms, whose parameters are explicitly specified. The results demonstrate that, when optimizing comprehensive profits, the developed strategy reduces costs by up to 21.66% compared with the other algorithms.
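The abstract does not give implementation details, but the workflow it describes (formulate the scheduling task as a Markov decision process, then train a soft actor-critic agent on sampled uncertainty realizations) can be illustrated with a minimal sketch. The environment below is a toy stand-in, not the paper's model: the component set (a CHP unit, a gas boiler, grid purchase), the balance relations, the cost and penalty coefficients, and the use of gymnasium together with stable-baselines3's SAC implementation are all illustrative assumptions.

```python
import numpy as np
import gymnasium as gym
from gymnasium import spaces
from stable_baselines3 import SAC


class IESSchedulingEnv(gym.Env):
    """Toy integrated power/heat/gas scheduling environment with hourly steps."""

    def __init__(self, horizon=24):
        super().__init__()
        self.horizon = horizon
        # State: [normalized hour, available wind power, electric load, heat load]
        self.observation_space = spaces.Box(low=0.0, high=1.0, shape=(4,), dtype=np.float32)
        # Action: normalized set-points for a CHP unit, a gas boiler, and grid purchase
        self.action_space = spaces.Box(low=-1.0, high=1.0, shape=(3,), dtype=np.float32)

    def _sample_state(self):
        # Random wind/load realizations stand in for real uncertainty profiles.
        return np.array([self.t / self.horizon,
                         self.np_random.uniform(0.0, 1.0),   # wind availability
                         self.np_random.uniform(0.3, 1.0),   # electric load
                         self.np_random.uniform(0.2, 0.8)],  # heat load
                        dtype=np.float32)

    def reset(self, *, seed=None, options=None):
        super().reset(seed=seed)
        self.t = 0
        self.state = self._sample_state()
        return self.state, {}

    def step(self, action):
        chp, boiler, grid = (np.asarray(action) + 1.0) / 2.0   # map [-1, 1] -> [0, 1]
        _, wind, e_load, h_load = self.state
        supplied_e = 0.6 * chp + wind + 0.4 * grid             # illustrative electric balance
        supplied_h = 0.4 * chp + 0.8 * boiler                  # illustrative heat balance
        fuel_cost = 0.5 * chp + 0.3 * boiler + 0.6 * grid      # illustrative operating cost
        unmet = max(e_load - supplied_e, 0.0) + max(h_load - supplied_h, 0.0)
        # Reward: negative operating cost minus a penalty for unserved demand (reliability).
        reward = -(fuel_cost + 10.0 * unmet)
        self.t += 1
        self.state = self._sample_state()
        return self.state, float(reward), self.t >= self.horizon, False, {}


if __name__ == "__main__":
    env = IESSchedulingEnv()
    model = SAC("MlpPolicy", env, verbose=0)   # entropy-regularized off-policy actor-critic
    model.learn(total_timesteps=5_000)         # short demonstration run only
    obs, _ = env.reset(seed=0)
    action, _ = model.predict(obs, deterministic=True)
    print("example dispatch set-points (normalized):", action)
```

In this sketch the agent learns directly from sampled transitions, mirroring the abstract's point that decisions are obtained through empirical learning rather than from forecasts or prior knowledge of the uncertainty distributions.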
