Abstract

The FedFTG plug-in effectively mitigates the knowledge forgetting caused by direct server-side model aggregation in Federated Learning. However, FedFTG risks compromising client privacy and incurs additional transmission costs. This paper therefore introduces four methods to enhance the privacy and communication efficiency of FedFTG: the Mixing Neural Network Layers (MixNN) method, which defends against various kinds of inference attack; the Practical Secure Aggregation strategy, which uses cryptography to protect transmitted data; the Federated Dropout model, which reduces downstream (server-to-client) communication pressure; and the Deep Gradient Compression (DGC) method, which substantially compresses the gradient. Experimental results show that MixNN provides privacy protection without affecting model accuracy; Practical Secure Aggregation saves communication cost on large data vectors while protecting privacy; Federated Dropout reduces communication consumption by up to 28×; and DGC compresses the gradient by up to 600× while maintaining the same accuracy. Applying these methods to FedFTG would therefore greatly improve its privacy and communication efficiency, making distributed training more secure and convenient for users and making federated training easier to realize on mobile devices.
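To make the gradient-compression claim concrete, the following is a minimal sketch of DGC-style top-k gradient sparsification with local residual accumulation: only the largest-magnitude entries are transmitted, and the untransmitted remainder is accumulated locally for later rounds. This is an illustrative NumPy sketch under assumed simplifications (it omits the momentum correction and gradient clipping of the full DGC method); the names sparsify_gradient and accumulate_residual are our own, not from the paper.

```python
import numpy as np

def sparsify_gradient(grad, ratio=0.001):
    """Keep only the largest-magnitude `ratio` fraction of entries,
    sending (indices, values) instead of the dense gradient.
    A 0.1% ratio roughly corresponds to the ~600x compression
    reported for DGC."""
    k = max(1, int(grad.size * ratio))
    flat = grad.ravel()
    # Indices of the k largest-magnitude entries.
    idx = np.argpartition(np.abs(flat), -k)[-k:]
    return idx, flat[idx]

def accumulate_residual(grad, idx):
    """DGC-style local accumulation: entries that were not
    transmitted stay in a residual buffer and are added to the
    next round's gradient, so small updates are delayed, not lost."""
    residual = grad.copy().ravel()
    residual[idx] = 0.0
    return residual.reshape(grad.shape)

# Toy usage on a random "gradient".
rng = np.random.default_rng(0)
g = rng.normal(size=(1000, 100))
idx, vals = sparsify_gradient(g, ratio=0.001)
residual = accumulate_residual(g, idx)
print(f"sent {idx.size} of {g.size} entries "
      f"({g.size / idx.size:.0f}x compression)")
```

In practice the residual buffer is kept per client and added to each freshly computed gradient before sparsification, which is what lets aggressive ratios preserve accuracy.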
