Abstract

This paper presents a privacy-preserving framework for short-term (multi-horizon) probabilistic forecasting of nodal voltages in local energy communities. This task is becoming increasingly important for cost-effective management of network constraints amid the massive integration of distributed energy resources. Traditionally, however, forecasting is carried out centrally, by gathering end-users' raw data in a single database, which exposes their private information. To avoid this privacy issue, this work relies on a distributed learning scheme known as federated learning, in which individuals' data remain decentralized. The learning procedure is then augmented with differential privacy, which offers formal guarantees that the trained model cannot be reverse-engineered to infer sensitive local information. Moreover, the problem is framed using cross-series learning, which allows any new client joining the community to be integrated smoothly (i.e., cold-start forecasting) without being hampered by data scarcity. Results show that the proposed approach outperforms non-collaborative (locally trained) models and achieves a trade-off between privacy and performance across different deep learning architectures.
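To illustrate the core mechanism described above, the following is a minimal sketch of one round of differentially private federated averaging. It is not the paper's implementation: the toy linear "forecaster", the client data, and all parameter values (`clip`, `noise_std`, learning rate) are illustrative assumptions; the essential pattern is that clients train locally, their model updates are norm-clipped and perturbed with Gaussian noise, and only the noisy updates are aggregated, so raw data never leaves the client.

```python
import numpy as np

rng = np.random.default_rng(0)

def local_update(weights, X, y, lr=0.1):
    """One local gradient step on a toy linear model (stand-in for a forecaster)."""
    grad = 2 * X.T @ (X @ weights - y) / len(y)
    return weights - lr * grad

def dp_federated_round(global_w, clients, clip=1.0, noise_std=0.01):
    """One FedAvg round: clients send clipped, noised updates; server averages them."""
    noisy_updates = []
    for X, y in clients:
        delta = local_update(global_w.copy(), X, y) - global_w
        # Clip the update norm to bound each client's sensitivity...
        norm = np.linalg.norm(delta)
        delta = delta * min(1.0, clip / max(norm, 1e-12))
        # ...then add Gaussian noise calibrated to the clipping bound.
        delta = delta + rng.normal(0.0, noise_std * clip, size=delta.shape)
        noisy_updates.append(delta)
    return global_w + np.mean(noisy_updates, axis=0)

# Toy data: 3 clients sharing an underlying relationship (cross-series setting).
true_w = np.array([0.5, -0.3])
clients = []
for _ in range(3):
    X = rng.normal(size=(50, 2))
    y = X @ true_w + 0.05 * rng.normal(size=50)
    clients.append((X, y))

w = np.zeros(2)
for _ in range(200):
    w = dp_federated_round(w, clients)
```

Increasing `noise_std` strengthens the privacy guarantee but degrades accuracy, which is exactly the privacy/performance trade-off the abstract refers to.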
