Federated Learning (FL) promises to address the data privacy problem by training a local model on each node and sharing the model parameters instead of the data itself. The FL server then applies model aggregation techniques to combine the received models and broadcasts the resulting model to the connected clients. This study proposes a Digital Twin-based Federated Learning (DT-FL) framework to virtually monitor and control remotely deployed physical clients and their training process. A connection-oriented protocol, Open Connectivity Foundation (OCF) IoTivity, connects the FL clients with the FL server to ensure packet delivery. OCF IoTivity sends/receives the model weights to/from the server, and the Hypertext Transfer Protocol (HTTP) is used to monitor clients’ local training. After receiving partially trained models from the clients, the server performs optimal model selection using a normal distribution method that considers each model’s performance. Finally, the best-selected models are aggregated, and the final model is broadcast to the clients. The framework uses Raspberry Pi 4 devices as clients; given their limited computational capabilities, the experiments are conducted with structured energy consumption data. The dataset comprises 8 multistory residential buildings located in different geographical regions of the Republic of Korea. Each residential building is treated as an FL client and registered on the DT using its IP address and port number. The DT-FL framework can be used with classification and regression datasets, and the model architecture for the data can be designed on the DT platform. Experiments are conducted with both partial and full client participation. The results show minimal delay in physical-virtual object synchronization and better performance and generalization of the global model for each client. The source code of the proposed DT-FL framework is available on GitHub.
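The abstract's server-side pipeline (normal-distribution-based model selection followed by aggregation) can be sketched as follows. The exact selection rule is not specified in the abstract, so the criterion below (keep client models whose validation loss falls within one standard deviation of the client mean) and the function names are illustrative assumptions, with plain federated averaging standing in for the aggregation step:

```python
import numpy as np

def select_models(losses, k=1.0):
    """Hypothetical 'normal distribution' selection rule: keep the
    indices of clients whose validation loss is no more than k
    standard deviations above the mean across clients."""
    losses = np.asarray(losses, dtype=float)
    mu, sigma = losses.mean(), losses.std()
    return [i for i, loss in enumerate(losses) if loss <= mu + k * sigma]

def fed_avg(weights_list):
    """Federated averaging: element-wise mean of the selected
    clients' weight tensors, layer by layer."""
    return [np.mean(np.stack(layer), axis=0) for layer in zip(*weights_list)]

# Example round: client 3 reports an outlier loss and is dropped,
# then the remaining models are averaged into the global model.
losses = [0.21, 0.25, 0.23, 0.90]
selected = select_models(losses)            # [0, 1, 2]
client_weights = [
    [np.array([1.0, 2.0])],                 # client 0, one layer
    [np.array([3.0, 4.0])],                 # client 1
    [np.array([2.0, 3.0])],                 # client 2
    [np.array([9.0, 9.0])],                 # client 3 (excluded)
]
global_model = fed_avg([client_weights[i] for i in selected])
```

In a deployment, the averaged weights would then be broadcast back to the registered clients over the OCF IoTivity channel, and the next training round would begin.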