Abstract

Occupants’ behavior is a major source of uncertainty in the optimal operation of building energy systems. The highly stochastic hot water use behavior of occupants has led to conservative operational strategies for hot water systems, which try to ensure occupants’ comfort through energy-intensive operation. To integrate occupants’ behavior into hot water system control, this study proposes a control framework based on Reinforcement Learning that can learn the stochastic behavior of occupants and strike a balance between the opposing objectives of water hygiene, comfort, and energy use. A model-free approach is implemented to ensure transferability. To achieve fast convergence on the target house while remaining model-free, this study proposes an offline training procedure that integrates a stochastic hot water use model to mimic the hot water use behavior of occupants. The proposed framework is then evaluated using actual hot water use and weather data collected over 29 weeks from a residential house in Switzerland. Its performance is compared to that of a conventional rule-based controller, the common practice in hot water systems. Although the hot water use dataset was collected during the COVID-19 pandemic, when occupants followed abnormal schedules, the results indicate that the proposed control framework successfully learned and adapted to the occupants' behavior and achieved energy savings of 23.8%, while maintaining occupants' comfort and water hygiene. The adaptive nature of the proposed control framework shows significant potential for reducing the discrepancy between supply and demand in hot water systems.
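The idea of offline training against a stochastic occupant model can be illustrated with a minimal sketch. The code below is not the paper's method: it uses simple tabular Q-learning, and every quantity (the toy draw schedule, tank dynamics, temperature thresholds, and penalty weights for comfort and hygiene) is an illustrative assumption. It only shows the general pattern: an agent interacts with a simulated stochastic hot water demand and learns a heating policy that trades off energy use against comfort and hygiene penalties.

```python
import random

# Illustrative sketch (assumed, not from the paper): tabular Q-learning for a
# hot water tank, trained offline against a toy stochastic occupant draw model.

HOURS = 24
TEMP_BINS = 8          # discretized tank temperature, ~40-75 C in 5 C bins
ACTIONS = (0, 1)       # 0 = heater off, 1 = heater on

def temp_bin(temp):
    """Map a tank temperature (C) to a discrete state bin."""
    return max(0, min(TEMP_BINS - 1, int((temp - 40.0) // 5)))

def stochastic_draw(hour, rng):
    """Toy occupant model: morning/evening peaks with random magnitude (litres)."""
    peak = 30.0 if hour in (7, 8, 19, 20) else 3.0
    return max(0.0, rng.gauss(peak, peak * 0.4))

def step(temp, hour, action, rng):
    """One-hour toy tank dynamics: heating, standing loss, draw-induced cooling."""
    draw = stochastic_draw(hour, rng)
    temp += 8.0 * action               # heater raises tank temperature
    temp -= 0.5                        # standing heat loss
    temp -= 0.15 * draw                # cold inlet mixing during a draw
    energy_cost = 1.0 * action
    comfort_pen = 5.0 if (draw > 10 and temp < 45.0) else 0.0  # too cold at tap
    hygiene_pen = 2.0 if temp < 42.0 else 0.0                  # hygiene-risk proxy
    reward = -(energy_cost + comfort_pen + hygiene_pen)
    return temp, reward

def train(episodes=3000, alpha=0.2, gamma=0.95, eps=0.1, seed=0):
    """Offline training loop: one episode = one simulated day."""
    rng = random.Random(seed)
    Q = [[[0.0, 0.0] for _ in range(TEMP_BINS)] for _ in range(HOURS)]
    for _ in range(episodes):
        temp = 55.0
        for hour in range(HOURS):
            s = temp_bin(temp)
            # epsilon-greedy action selection
            if rng.random() < eps:
                a = rng.choice(ACTIONS)
            else:
                a = int(Q[hour][s][1] > Q[hour][s][0])
            temp, r = step(temp, hour, a, rng)
            nh, ns = (hour + 1) % HOURS, temp_bin(temp)
            Q[hour][s][a] += alpha * (r + gamma * max(Q[nh][ns]) - Q[hour][s][a])
    return Q

if __name__ == "__main__":
    Q = train()
    print("learned Q-values at 6h, 50 C state:", Q[6][temp_bin(50.0)])
```

In the full framework described in the abstract, this offline phase would be followed by continued (online) learning on the target house, so the policy adapts to the actual occupants rather than the synthetic draw model.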
