Abstract

Federated learning enables local devices to jointly train a server model while keeping the data decentralized and private. Because the annotator on the server cannot access the data, all local data must be labeled by alternative annotation techniques, so there is little guarantee that the labels are correct. Under this noisy-label setting, local models form inconsistent class decision boundaries and their weights severely diverge, both of which are serious problems in federated learning. To address these problems, we introduce a novel federated learning scheme in which the server cooperates with local models by interchanging class-wise centroids. The server aligns the class-wise centroids, i.e., the central features of the local data on each device, and broadcasts the aligned centroids to selected clients every communication round. Updating local models with the aligned centroids yields consistent class decision boundaries among local models, even when the noise distributions of the clients' data differ from one another. Furthermore, we introduce a sample selection approach to filter out data with noisy labels and a label correction method to adjust the labels of noisy instances. Our experimental results show that our approach is noticeably effective for federated learning with noisy labels.
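
To make the centroid-exchange idea concrete, below is a minimal NumPy sketch, not the paper's exact procedure: the function names (compute_local_centroids, align_centroids, select_small_loss) are illustrative, a count-weighted per-class average stands in for the paper's alignment step, and the small-loss rule is a common noisy-label heuristic assumed here for the sample selection step.

```python
import numpy as np

def compute_local_centroids(features, labels, num_classes):
    """Per-class mean feature vectors ("centroids") on one client.

    features: (n, d) array of penultimate-layer features
    labels:   (n,)   array of (possibly noisy) class labels
    Returns (num_classes, d) centroids and (num_classes,) sample counts;
    classes absent on this client yield zero centroids and zero counts.
    """
    centroids = np.zeros((num_classes, features.shape[1]))
    counts = np.zeros(num_classes)
    for c in range(num_classes):
        mask = labels == c
        counts[c] = mask.sum()
        if counts[c] > 0:
            centroids[c] = features[mask].mean(axis=0)
    return centroids, counts

def align_centroids(client_centroids, client_counts):
    """Server-side alignment, sketched here as a count-weighted average
    of each class centroid across clients.

    client_centroids: list of (num_classes, d) arrays, one per client
    client_counts:    list of (num_classes,) per-class sample counts
    """
    stacked = np.stack(client_centroids)             # (k, C, d)
    weights = np.stack(client_counts).astype(float)  # (k, C)
    totals = np.maximum(weights.sum(axis=0, keepdims=True), 1.0)
    weights = weights / totals                       # normalize per class
    return np.einsum("kc,kcd->cd", weights, stacked)

def select_small_loss(losses, keep_ratio=0.8):
    """Small-loss sample selection: keep the keep_ratio fraction of
    samples with the lowest loss, treating the rest as likely noisy."""
    k = max(1, int(len(losses) * keep_ratio))
    return np.argsort(losses)[:k]

# Toy usage: two clients, three classes, 4-D features.
rng = np.random.default_rng(0)
cents, cnts = zip(*[
    compute_local_centroids(rng.normal(size=(20, 4)),
                            rng.integers(0, 3, size=20), 3)
    for _ in range(2)
])
global_centroids = align_centroids(list(cents), list(cnts))  # (3, 4)
```

Weighting by per-class sample counts keeps a class's aligned centroid from being dominated by clients that barely observe that class; the paper's actual alignment rule may differ from this simple average.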
