With the development of the power Internet of Things (IoT), the traditional centralized computing paradigm has become ill-suited to many power business scenarios, such as power load forecasting, substation defect detection, and demand-side response. How to perform efficient and reliable machine learning tasks without violating user data privacy has therefore attracted considerable attention from industry. Blockchain-based federated learning (FL), proposed as a decentralized and distributed learning framework for building privacy-enhanced IoT systems, is receiving growing attention from scholars. The framework offers decentralization, scalability, and data privacy, but its consensus mechanism consumes significant computational resources. Moreover, the number of model parameters has grown dramatically, inflating the volume of transmitted data and imposing a vast communication overhead. To reduce this overhead, we propose an FL framework built on a directed acyclic graph (DAG)-based blockchain, which achieves efficient and trusted sharing of FL models. We design an adaptive k-means-based model compression method that compresses the FL model and reduces the communication overhead of each training round. We also optimize the original accuracy-based tips selection algorithm and propose a tips selection algorithm based on multi-factor evaluation. Simulation results show that, compared with previous work, the proposed method reduces the total communication bytes of the blockchain-based federated learning system while preserving the accuracy of the FL model.
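The k-means compression idea can be illustrated with a minimal sketch: cluster a layer's weights into k centroids and transmit only the centroid codebook plus per-weight 1-byte indices instead of 4-byte floats. This is a hypothetical illustration of k-means weight quantization in general; the paper's adaptive scheme (e.g., how k is chosen per round) is not specified in the abstract, and all function names here are assumptions.

```python
import numpy as np

def kmeans_quantize(weights, k=16, iters=20, seed=0):
    """Quantize a weight tensor with 1-D k-means (illustrative sketch only).

    Returns a codebook of k centroids and uint8 indices, so each weight
    costs 1 byte on the wire instead of 4 (plus a k-float codebook).
    Requires k <= 256 so indices fit in uint8.
    """
    rng = np.random.default_rng(seed)
    w = weights.ravel().astype(np.float64)
    centroids = rng.choice(w, size=k, replace=False)  # init from the data
    for _ in range(iters):
        # assign each weight to its nearest centroid
        idx = np.abs(w[:, None] - centroids[None, :]).argmin(axis=1)
        # move each centroid to the mean of its cluster (skip empty clusters)
        for j in range(k):
            members = w[idx == j]
            if members.size:
                centroids[j] = members.mean()
    idx = np.abs(w[:, None] - centroids[None, :]).argmin(axis=1)
    return centroids, idx.astype(np.uint8)

def dequantize(centroids, idx, shape):
    """Reconstruct the (lossy) weight tensor from codebook and indices."""
    return centroids[idx].reshape(shape)
```

With float32 weights, a 64x32 layer shrinks from 8192 bytes to 2048 index bytes plus a 16-entry codebook, at the cost of a small reconstruction error per weight.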