Abstract

Optimization of thermostatically controlled loads, such as heat pumps, using data-driven models can significantly reduce domestic energy consumption while also providing critical grid services. However, these data-driven models often require a prohibitive amount of data before reaching sufficient accuracy for individual devices. Centralized or collaborative learning schemes, which aggregate data from many devices, can lower the data requirements for individual devices, but at the cost of user privacy (i.e., data leakage). In this paper, we explore federated learning as a modelling alternative to address these concerns, and compare its accuracy against collaborative learning approaches using a real-world dataset. Some important insights emerge from this work. Notably, we show that federated learning, on its own, suffers from several drawbacks compared with collaborative approaches, including poor convergence in low data availability regimes and a failure to learn causal links even asymptotically. We explore the reasons for these shortcomings, and demonstrate that these issues can be resolved by incorporating domain-informed data augmentation in the learning process, allowing it to converge to a solution that is very close to the baseline collaborative model in terms of both accuracy and interpretability.
