Abstract

Federated Learning is a distributed machine learning paradigm that advocates training on decentralized data. However, collaboratively developing a central model incurs substantial communication and computation overhead, which becomes a bottleneck. We propose a method that overcomes this bottleneck while preserving both the privacy of the participants and the classification accuracy. Our method achieves significant speedups compared to existing methods that employ Homomorphic Encryption. Even under pessimistic assumptions, we achieve a 4.81x speedup for classification on the ImageNet dataset with an AlexNet architecture, without compromising participant privacy or accuracy.
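
To illustrate where this overhead comes from, the sketch below shows the kind of additively homomorphic aggregation commonly used in HE-based federated learning, written with the third-party python-paillier (phe) library. The variable names and toy updates are illustrative assumptions, not the paper's actual protocol; real deployments must encrypt, transmit, and sum millions of parameters per round, which is where the communication and computation cost accumulates.

```python
# Minimal sketch of additively homomorphic aggregation for federated
# learning. Assumes the third-party python-paillier ("phe") library;
# the setup and names below are illustrative, not the paper's protocol.
from phe import paillier

# Keypair held by the decrypting party; key management in a real FL
# deployment is considerably more involved.
public_key, private_key = paillier.generate_paillier_keypair(n_length=2048)

# Toy model updates from three clients. Real updates contain millions
# of entries, each of which must be encrypted individually.
client_updates = [
    [0.10, -0.20, 0.05],
    [0.30, 0.10, -0.15],
    [-0.05, 0.25, 0.40],
]

# Each client encrypts its update before sending it to the server.
encrypted_updates = [
    [public_key.encrypt(w) for w in update] for update in client_updates
]

# The server sums ciphertexts without seeing any plaintext update:
# Paillier is additively homomorphic, so Enc(a) + Enc(b) = Enc(a + b).
encrypted_sum = encrypted_updates[0]
for update in encrypted_updates[1:]:
    encrypted_sum = [acc + w for acc, w in zip(encrypted_sum, update)]

# Only the key holder can recover the aggregated (averaged) update.
n_clients = len(client_updates)
averaged = [private_key.decrypt(c) / n_clients for c in encrypted_sum]
print(averaged)  # -> approximately [0.1167, 0.05, 0.1]
```

Per-parameter public-key encryption and ciphertext expansion are what make this style of secure aggregation expensive at scale, which is the cost our method targets.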
