Abstract
Seasonal thermal energy storage systems (STESSs) can shift the delivery of renewable energy across time and mitigate its uncertainty. However, to maximize the operational profit of STESSs and ensure their long-term profitability, control strategies that allow them to trade on wholesale electricity markets are required. While control strategies for STESSs have been proposed before, none of them addresses electricity market interaction and trading. In particular, due to the seasonal nature of STESSs, accounting for the long-term uncertainty in electricity prices has been very challenging. In this article, we develop the first control algorithms for STESSs interacting with different wholesale electricity markets. As different control solutions have different merits, we propose solutions based on model predictive control (MPC) and solutions based on reinforcement learning (RL). We show that this is critical since different markets require different control strategies: MPC strategies are better suited to day-ahead markets due to their flexibility, whereas RL strategies are better suited to real-time markets because of their fast computation times and better risk modeling. To study the proposed algorithms in a real-life setup, we consider a real STESS interacting with the day-ahead and imbalance markets in the Netherlands and Belgium. Based on the obtained results, we show that: 1) the developed controllers successfully maximize the profits of STESSs through market trading and 2) the developed control strategies make STESSs important players in the energy transition: by optimally controlling STESSs and reacting to imbalances, they help to reduce grid imbalances.
Highlights
We show that this is critical since different markets require different control strategies: model predictive control (MPC) strategies are better suited to day-ahead markets due to the flexibility of MPC, whereas reinforcement learning (RL) strategies are better suited to real-time markets because of their fast computation times and better risk modeling.
We assess the merits of each control solution for the different markets and show that, while MPC-based methods are most suitable for day-ahead markets, RL-based methods perform better when trading in the imbalance market.
Summary
While the energy transition [1] has the potential to greatly improve our society, e.g., by mitigating climate change, it poses some potential problems that need to be tackled [2]. Due to the weather dependence of renewable sources, a large integration of renewables implies more uncertain energy generation. In the case of electricity, as generation and consumption have to be balanced at all times, the more renewable sources are integrated, the more imbalances between generation and consumption occur, and the more complex controlling and balancing the electrical grid becomes [3]. In this context, energy storage systems offer a promising solution for uncertain generation by providing flexibility and ancillary services, leading to smooth and reliable grid operation [4].
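To make the flexibility argument concrete, the sketch below shows how a storage unit can profit from price differences on a wholesale market. This is an illustrative toy, not the paper's MPC or RL controllers: the prices, capacity, power limit, efficiency, and the greedy median-threshold policy are all assumptions chosen for clarity.

```python
def dispatch(prices, capacity=10.0, power=2.0, soc=0.0, eff=0.9):
    """Greedy threshold policy (illustrative only): charge in cheap hours,
    discharge in expensive hours, respecting energy and power limits."""
    median = sorted(prices)[len(prices) // 2]
    profit = 0.0
    schedule = []
    for p in prices:
        if p < median and soc < capacity:      # buy energy while it is cheap
            e = min(power, capacity - soc)
            soc += e * eff                     # charging losses
            profit -= e * p
            schedule.append(("charge", e))
        elif p > median and soc > 0:           # sell energy while it is expensive
            e = min(power, soc)
            soc -= e
            profit += e * eff * p              # discharging losses
            schedule.append(("discharge", e))
        else:
            schedule.append(("idle", 0.0))
    return profit, schedule

# Hypothetical day-ahead prices (EUR/MWh) over six hours.
profit, schedule = dispatch([20, 25, 90, 80, 15, 100])
```

A real controller would replace the myopic threshold with an optimization over a forecast horizon (MPC) or a learned policy (RL), which is precisely the design space the article studies.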