Abstract
We develop a novel framework for personalized federated learning (PFL) based on a decoupled version of knowledge distillation (DKD). Unlike traditional PFL methods, the proposed PFL-DKD creates a dynamically connected network among local clients and categorizes them according to their knowledge, storage, and computational capabilities. Decoupling knowledge distillation into target-class (TC) and latent-class (LC) components enables knowledge-rich clients to transfer their expertise efficiently to knowledge-poor clients. We further extend PFL-DKD to PFL-FDKD by introducing "logit fusion", which seamlessly aggregates knowledge and experience from neighboring clients. Both our theoretical analysis and extensive experiments show that PFL-DKD outperforms existing centralized and decentralized PFL approaches, making significant strides in mitigating the challenges posed by heterogeneous data and system configurations. Our implementation details and codebase are available at PFL-DKD.
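To make the two ingredients named above concrete, here is a minimal PyTorch sketch. It assumes the TC/LC split follows the standard decoupled-KD formulation (a binary target-vs-rest term plus a term over the remaining classes) and that logit fusion is a weighted average of neighbors' logits; the function names `dkd_loss` and `fuse_neighbor_logits`, the weights `alpha`/`beta`, and the temperature `T` are illustrative assumptions, not the paper's exact definitions.

```python
import torch
import torch.nn.functional as F


def dkd_loss(student_logits, teacher_logits, target, alpha=1.0, beta=8.0, T=4.0):
    """Decoupled KD: a target-class (TC) term over the binary
    (target vs. rest) distribution plus a term over the remaining
    classes (the abstract's latent class, LC). alpha, beta, and T
    are illustrative defaults, not values from the paper."""
    num_classes = student_logits.size(1)
    gt_mask = F.one_hot(target, num_classes).float()   # 1 at the true class
    other_mask = 1.0 - gt_mask

    p_s = F.softmax(student_logits / T, dim=1)
    p_t = F.softmax(teacher_logits / T, dim=1)

    # TC term: KL between binary (target vs. non-target) distributions.
    ps_tc = torch.stack([(p_s * gt_mask).sum(1), (p_s * other_mask).sum(1)], dim=1)
    pt_tc = torch.stack([(p_t * gt_mask).sum(1), (p_t * other_mask).sum(1)], dim=1)
    tc_loss = F.kl_div(ps_tc.clamp_min(1e-8).log(), pt_tc,
                       reduction="batchmean") * T ** 2

    # LC term: KL over the non-target classes, re-normalized after pushing
    # the target logit down with a large negative offset.
    log_ps_lc = F.log_softmax(student_logits / T - 1000.0 * gt_mask, dim=1)
    pt_lc = F.softmax(teacher_logits / T - 1000.0 * gt_mask, dim=1)
    lc_loss = F.kl_div(log_ps_lc, pt_lc, reduction="batchmean") * T ** 2

    return alpha * tc_loss + beta * lc_loss


def fuse_neighbor_logits(neighbor_logits, weights=None):
    """Hypothetical 'logit fusion' step: a weighted average of the logits
    received from neighboring clients, usable as the teacher signal above."""
    stacked = torch.stack(neighbor_logits)             # (n_neighbors, B, C)
    if weights is None:
        weights = torch.full((len(neighbor_logits),), 1.0 / len(neighbor_logits))
    return (weights.view(-1, 1, 1) * stacked).sum(dim=0)
```

Under these assumptions, one round at a knowledge-poor client would fuse the logits shared by its knowledge-rich neighbors with `fuse_neighbor_logits` and distill from the fused signal via `dkd_loss`.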