Multi-energy systems are receiving special attention from the smart grid community owing to the high flexibility they offer by integrating multiple energy carriers. In this regard, the energy hub is known as a flexible and efficient platform for supplying energy demands with an acceptable range of affordability and reliability by relying on various energy production, storage, and conversion facilities. Given the increasing penetration of renewable energy sources to promote a low-carbon energy transition, accurate economic and environmental assessment of the energy hub, along with a real-time automatic energy management scheme, has become a challenging task due to the high variability of renewable energy sources. Furthermore, conventional model-based optimization approaches, which require full knowledge of the underlying mathematical operating models and accurate uncertainty distributions, may be impractical for real-world applications. In this context, this paper proposes a model-free safe deep reinforcement learning method for the optimal control of a renewable-based energy hub operating with multiple energy carriers while satisfying the physical constraints of the energy hub operation model. The main objective of this work is to minimize the system energy cost and carbon emissions by considering various energy components. The proposed deep reinforcement learning method is trained and tested on a real-world dataset to validate its superior performance in reducing energy cost, carbon emissions, and computational time with respect to state-of-the-art deep reinforcement learning and optimization-based approaches. Moreover, the effectiveness of the proposed method in handling operational constraints is evaluated in both the training and test environments. Finally, the generalization performance of the learned energy management scheme, as well as sensitivity analyses on storage flexibility and carbon price, are examined in the case studies.