Abstract

Thermostatically controlled loads (TCLs) in buildings are ideal resources for providing regulation services to power systems. District cooling systems (DCSs), as large-scale, centralized TCLs with high efficiency and large regulation capacity, have attracted considerable research attention for minimizing energy costs, but little for providing regulation services. Controlling a DCS to provide high-quality regulation services is challenging, however, due to its complex thermal dynamics and the uncertainties in regulation signals and cooling demands. To fill this research gap, we propose a novel safe deep reinforcement learning (DRL) control method for a DCS to provide regulation services. The objective is to adjust the DCS's power consumption to follow real-time regulation signals subject to the buildings' temperature comfort constraints. The proposed method is model-free and adaptive to uncertainties in regulation signals and cooling demands. Furthermore, a barrier function is combined with conventional DRL to construct a safe DRL controller, which not only avoids unsafe explorations during training (which may lead to catastrophic control results) but also improves training efficiency. We conducted case studies based on a realistic DCS to evaluate the proposed control method against traditional methods, and the results demonstrate its effectiveness and superiority.
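To make the barrier-function idea concrete, the sketch below shows one common way such a safety layer can sit on top of a DRL agent: the agent's proposed power adjustment is projected onto the set of actions for which a simple discrete-time barrier condition keeps the zone temperature inside the comfort band. This is a minimal illustration only, not the paper's implementation; the first-order thermal model, the comfort band, all coefficients, and the function names are assumptions made for the example.

```python
import numpy as np

# Minimal sketch (NOT the authors' implementation) of a barrier-function safety
# layer around a DRL action. The thermal model, comfort band, and all numbers
# below are illustrative assumptions.

T_MIN, T_MAX = 22.0, 26.0   # assumed comfort band (deg C)
ALPHA = 0.9                 # assumed thermal decay coefficient
BETA = -0.05                # assumed cooling effect per kW of DCS power
GAMMA = 0.5                 # barrier-condition gain (class-K style)


def barrier(temp):
    """Barrier value h(T): nonnegative iff the zone temperature is inside the comfort band."""
    return (temp - T_MIN) * (T_MAX - temp)


def safe_action(temp, disturbance, a_rl, a_bounds=(0.0, 10.0)):
    """Project the RL agent's proposed power adjustment (kW) onto the set of
    actions satisfying the discrete-time barrier condition
    h(T_next) >= (1 - GAMMA) * h(T)."""
    lo, hi = a_bounds
    candidates = np.linspace(lo, hi, 201)
    next_temp = ALPHA * temp + BETA * candidates + disturbance
    feasible = candidates[barrier(next_temp) >= (1.0 - GAMMA) * barrier(temp)]
    if feasible.size == 0:
        return a_rl  # no candidate satisfies the condition; fall back to the raw action
    # keep the feasible action closest to the agent's proposal
    return float(feasible[np.argmin(np.abs(feasible - a_rl))])


if __name__ == "__main__":
    # Track a hypothetical regulation signal while respecting the comfort band.
    temp = 24.0
    rng = np.random.default_rng(0)
    for step in range(5):
        reg_signal = 5.0 + 2.0 * rng.standard_normal()  # hypothetical target power change (kW)
        a = safe_action(temp, disturbance=2.6, a_rl=reg_signal)
        temp = ALPHA * temp + BETA * a + 2.6
        print(f"step {step}: action {a:.2f} kW, zone temp {temp:.2f} degC")
```

In this sketch the safety filter is applied both during training (to avoid unsafe explorations) and at deployment; how the barrier term is actually integrated into the DRL objective in the paper is not specified by the abstract.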
