Abstract

Federated learning (FL) is a form of distributed machine learning, but the limited communication resources of Internet of Things devices make it inefficient to train neural network models in the FL setting. In this paper, we propose two new efficient federated learning algorithms: the static quantization federated averaging algorithm (SQFedAvg) and the dynamic quantization federated averaging algorithm (DQFedAvg). Each consists of two stages: the model is first optimized locally on the client side, and the updated parameters are then quantized and sent to the server, which reduces computation, storage, and communication costs and speeds up the training process. Finally, two models, a convolutional neural network (CNN) and a two-layer perceptron (2NN), together with two standard benchmark datasets (CIFAR-10, MNIST), are used to verify the effectiveness and efficiency of the proposed federated learning algorithms.
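
To make the quantize-then-communicate step concrete, the following is a minimal sketch of one server round with quantized client updates, written in Python with NumPy. The uniform 8-bit quantizer, the function names, and the plain averaging rule are illustrative assumptions for exposition only; they are not the paper's exact SQFedAvg/DQFedAvg procedures (for instance, a static scheme might fix the bit width across rounds, while a dynamic scheme would adapt it, but the abstract does not specify the rule).

    import numpy as np

    def quantize(update, num_bits=8):
        # Uniform quantization of a client's parameter update to 2**num_bits levels.
        # (Illustrative quantizer; the paper's SQFedAvg/DQFedAvg schemes may differ.)
        lo, hi = float(update.min()), float(update.max())
        scale = (hi - lo) / (2 ** num_bits - 1) if hi > lo else 1.0
        q = np.round((update - lo) / scale).astype(np.uint8)
        return q, lo, scale

    def dequantize(q, lo, scale):
        # Server-side reconstruction of an approximate update from the quantized message.
        return q.astype(np.float32) * scale + lo

    def server_round(global_weights, client_updates, num_bits=8):
        # Each client sends only a quantized update; the server dequantizes and averages,
        # then applies the mean update to the global model (FedAvg-style aggregation).
        decoded = [dequantize(*quantize(u, num_bits)) for u in client_updates]
        return global_weights + np.mean(decoded, axis=0)

    # Example: three clients send small local updates for a toy weight vector.
    w = np.zeros(4, dtype=np.float32)
    updates = [0.1 * np.random.randn(4).astype(np.float32) for _ in range(3)]
    w = server_round(w, updates, num_bits=8)

The communication saving in this sketch comes from sending one byte per parameter plus two scalars (the minimum and the scale) instead of full-precision floats; lowering num_bits trades reconstruction accuracy for further bandwidth reduction.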
