Abstract
Federated learning, a distributed machine learning framework that protects data privacy, has recently received increasing attention. In federated learning, a shared global model is obtained through parameter exchange, which requires frequent parameter communication during training. The limited bandwidth of the IoT and edge devices on which federated learning is deployed further degrades communication and learning efficiency. In this paper, we present an enhanced federated learning technique based on a feature-aligned filter selection method. We argue that the training gap between the global model and the local model on each node should be tracked during training, so that the locally contributing parameters can be selected, improving communication efficiency while preserving the performance of the shared global model. Specifically, the geometric median of each layer in the global model is adopted as the criterion for selecting important filters in the local model, and only the selected parameters are exchanged with other nodes to achieve efficient communication. Results under a variety of experimental settings demonstrate that the proposed scheme effectively improves communication efficiency while maintaining the performance of the global model. Compared with state-of-the-art methods, communication efficiency improves by up to 6.5×.
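The core idea of the selection criterion can be sketched as follows: compute the geometric median of the filters in each layer of the global model (via Weiszfeld's iteratively reweighted averaging), then rank the corresponding local filters by their distance to that median and keep only the top-k for communication. This is a minimal illustrative sketch, not the paper's implementation: the function names, the use of NumPy, and the choice to treat larger distance from the global median as "more important" are all assumptions for illustration.

```python
import numpy as np

def geometric_median(points, n_iter=100, eps=1e-8):
    """Weiszfeld's algorithm for the geometric median of row vectors."""
    y = points.mean(axis=0)  # initialize at the centroid
    for _ in range(n_iter):
        d = np.maximum(np.linalg.norm(points - y, axis=1), eps)
        w = 1.0 / d  # inverse-distance weights
        y_new = (w[:, None] * points).sum(axis=0) / w.sum()
        if np.linalg.norm(y_new - y) < eps:
            break
        y = y_new
    return y

def select_filters(global_layer, local_layer, k):
    """Pick k local filters using the global layer's geometric median.

    global_layer, local_layer: arrays of shape (out_channels, ...),
    e.g. conv weights (out_channels, in_channels, kh, kw).
    Assumption: filters farther from the global geometric median are
    treated as carrying more local contribution and are selected.
    """
    g = global_layer.reshape(global_layer.shape[0], -1)
    l = local_layer.reshape(local_layer.shape[0], -1)
    gm = geometric_median(g)
    dist = np.linalg.norm(l - gm, axis=1)
    return np.argsort(dist)[-k:]  # indices of the k farthest filters
```

Only the selected filter indices and their weights would then be uploaded, which is where the communication savings come from; the ratio k / out_channels controls the efficiency-accuracy trade-off.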