Abstract
This paper presents a novel data-driven approach for short-term operational planning of a cogeneration plant. The proposed methodology uses sparse identification of nonlinear dynamics (SINDy) to extract a dynamic model of heat generation from operational data. This model is then employed to simulate the plant dynamics during the training of a reinforcement learning (RL) agent, enabling online stochastic optimization of the production plan. The incorporation of SINDy improves the accuracy with which the plant's nonlinear dynamics are captured and significantly speeds up plant simulations, enabling efficient RL agent training within a reasonable timeframe. The performance of operational planning with the RL agent is compared to that of dynamic programming, a widely used method in the literature. The evaluation metrics encompass energy efficiency, unmet demand, and wasted heat. The comparison investigates the effectiveness of RL and dynamic programming across scenarios with energy demand forecasts of varying quality. The RL agent exhibits robustness and notably improves operational planning performance, particularly when faced with uncertain energy demands. Furthermore, the findings show that an RL agent trained on data from a school building could successfully perform planning tasks for a hotel building, indicating the transferability of learned planning knowledge across different cogeneration use cases.
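The model-extraction step above rests on the core SINDy regression: fit a sparse linear combination of candidate nonlinear functions to measured time derivatives. The following is a minimal sketch of that idea using sequentially thresholded least squares; the logistic ODE, the polynomial library, and the threshold value are illustrative assumptions standing in for the paper's heat-generation data and are not taken from the paper.

```python
import numpy as np

# Synthetic trajectory from a known ground-truth ODE, dx/dt = x - x^2
# (a stand-in for operational plant data; exact logistic solution).
t = np.arange(0.0, 8.0, 0.01)
x0 = 0.1
x = x0 * np.exp(t) / (1.0 + x0 * (np.exp(t) - 1.0))

# Numerical time derivative (central differences).
dxdt = np.gradient(x, t)

# Candidate function library Theta(x) = [1, x, x^2, x^3].
theta = np.column_stack([np.ones_like(x), x, x**2, x**3])

def stls(theta, dxdt, threshold=0.1, n_iter=10):
    """Sequentially thresholded least squares: the sparse regression
    at the heart of SINDy. Coefficients below `threshold` are zeroed
    and the remaining terms are refit."""
    xi = np.linalg.lstsq(theta, dxdt, rcond=None)[0]
    for _ in range(n_iter):
        small = np.abs(xi) < threshold
        xi[small] = 0.0
        if (~small).any():
            xi[~small] = np.linalg.lstsq(theta[:, ~small], dxdt, rcond=None)[0]
    return xi

xi = stls(theta, dxdt)
print(xi)  # approximately [0, 1, -1, 0]: recovers dx/dt = x - x^2
```

Because the identified model is just a short list of active library terms, evaluating it during RL training amounts to a few polynomial evaluations per step, which is what makes the simulation fast enough for agent training.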