Abstract

In recent years, the need to protect user privacy and data security while enabling multi-party data cooperation has grown increasingly strong. Privacy computing is widely regarded as a fundamental solution; it is developing rapidly and gaining unprecedented popularity. Federated learning is currently the most popular product in privacy computing. With its lightweight technology route and deployment, it has become a mainstream solution and the product of choice for many privacy computing applications; for example, in the medical field it enables advanced medical image segmentation while protecting patient privacy. However, federated learning still needs to improve its communication efficiency. This paper introduces FedAvg, the most basic federated learning algorithm, including its basic process, advantages, and disadvantages. We analyze algorithms that optimize federated learning by adding local computation, by compressing communication (including model compression, periodic averaging with quantization compression, and gradient compression), and by overlapping computation with communication, and we compare the strengths and weaknesses of representative algorithms for each approach. The paper concludes with a summary and an outlook on future work.
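To make the FedAvg process mentioned above concrete, the following is a minimal sketch of one possible implementation, not the paper's own code: each client runs a few steps of local gradient descent on its private data, and the server averages the returned models weighted by client sample counts. The model (simple linear regression), the synthetic client data, and helper names such as local_update and fedavg_round are assumptions introduced here for illustration only.

    import numpy as np

    rng = np.random.default_rng(0)

    def local_update(w, X, y, lr=0.01, local_epochs=5):
        # Run a few epochs of local gradient descent on one client's data
        # (the "added local computation" the abstract refers to corresponds
        # to increasing local_epochs to reduce communication rounds).
        w = w.copy()
        for _ in range(local_epochs):
            grad = 2 * X.T @ (X @ w - y) / len(y)  # mean-squared-error gradient
            w -= lr * grad
        return w

    def fedavg_round(w_global, clients):
        # One communication round: every client trains locally, then the
        # server averages the returned models weighted by sample count.
        total = sum(len(y) for _, y in clients)
        w_new = np.zeros_like(w_global)
        for X, y in clients:
            w_local = local_update(w_global, X, y)
            w_new += (len(y) / total) * w_local
        return w_new

    # Synthetic clients whose local data is drawn around a shared true model.
    w_true = np.array([2.0, -1.0])
    clients = []
    for _ in range(4):
        X = rng.normal(size=(50, 2))
        y = X @ w_true + 0.1 * rng.normal(size=50)
        clients.append((X, y))

    w = np.zeros(2)
    for _ in range(20):
        w = fedavg_round(w, clients)
    print("estimated weights after 20 rounds:", w)

In this sketch, the only data exchanged per round is the model parameters, which is why the communication-efficiency techniques surveyed in the paper (model, quantization, and gradient compression, and overlapping computation with communication) focus on shrinking or hiding that exchange.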
