Abstract
With the advancement of renewable energy and energy storage technologies, energy hubs (EHs) have been emerging in recent years. Scheduling an EH is a challenging task because uncertainties on both the energy supply and load sides must be incorporated. The existing model-based optimization approaches proposed for this purpose are limited in solution accuracy and computational efficiency, which hinders their application. This paper proposes a model-free, safe deep reinforcement learning (DRL) approach, based on primal-dual optimization and imitation learning, for the optimal scheduling of an EH that includes a tri-generative advanced adiabatic compressed air energy storage (AA-CAES) unit. First, the operation of the AA-CAES under off-design conditions is modeled and linearized as a mixed-integer linear program (MILP). Then, the safe DRL approach is trained and tested on a case study. Its performance in reducing operational cost and satisfying constraints is compared with state-of-the-art DRL algorithms as well as a deterministic MILP approach. In addition, the generalization of the proposed approach is examined on a test set. Finally, the effect of off-design conditions of the tri-generative AA-CAES on the optimal dispatch strategy is investigated. The results indicate that the proposed approach can effectively reduce the operational cost while satisfying the operational constraints.
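To illustrate the primal-dual idea behind safe DRL mentioned above, the following is a minimal sketch, not the paper's implementation: a toy one-step dispatch problem solved with a REINFORCE-style policy gradient on the Lagrangian, where the dual variable is raised whenever the expected constraint cost exceeds its limit. All names, the toy rewards/costs, and the step sizes are illustrative assumptions.

```python
# Minimal primal-dual (Lagrangian) safe-RL sketch on a toy one-step
# "dispatch" problem. Illustrative only; not the paper's algorithm.
import numpy as np

rng = np.random.default_rng(0)

theta = 0.0                      # policy parameter (logit of P(action=1))
lam = 0.0                        # Lagrange multiplier (dual variable)
d_limit = 0.2                    # allowed expected constraint cost
lr_theta, lr_lam = 0.05, 0.05    # primal / dual step sizes

def rollout(theta, n=64):
    """Sample rewards, constraint costs, and score functions for a batch."""
    p = 1.0 / (1.0 + np.exp(-theta))              # P(action = 1)
    a = rng.random(n) < p                         # sampled binary actions
    reward = np.where(a, 1.0, 0.3) + 0.1 * rng.standard_normal(n)
    cost = np.where(a, 0.5, 0.0)                  # constraint cost of action 1
    score = np.where(a, 1.0 - p, -p)              # d log pi(a) / d theta
    return reward, cost, score

for it in range(1000):
    r, c, score = rollout(theta)
    # Primal step: ascend the Lagrangian L = E[r] - lam * (E[c] - d_limit)
    theta += lr_theta * np.mean(score * (r - lam * c))
    # Dual step: tighten lam when the constraint is violated, relax otherwise
    lam = max(0.0, lam + lr_lam * (np.mean(c) - d_limit))

print(f"final P(action=1) = {1.0 / (1.0 + np.exp(-theta)):.2f}, lambda = {lam:.2f}")
```

In this sketch the unconstrained optimum would always pick the high-reward action, but the dual update drives the multiplier up until the expected constraint cost settles near its limit, which is the mechanism a primal-dual safe DRL agent uses to keep operational constraints satisfied during scheduling.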