Abstract

Federated learning is an emerging technology that enables multiple clients to cooperatively train an intelligent diagnostic model while preserving data privacy. However, federated diagnostic models still suffer a performance drop when deployed on entirely unseen clients outside the federation. To address this issue, a Federated Distillation Domain Generalization (FDDG) framework is proposed for machinery fault diagnosis. The core idea is to let individual clients access multi-client data distributions in a privacy-preserving manner and to further exploit domain invariance to enhance model generalization. A novel diagnostic knowledge-sharing mechanism based on knowledge distillation equips each client with multiple generators that synthesize fake data during local model training. A low-rank decomposition method is then applied to the generated and real data to mine domain-invariant features, strengthening the model's resistance to domain shift. Extensive experiments on two rotating machines demonstrate that the proposed FDDG improves accuracy by 3% over state-of-the-art methods.
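
The abstract names two mechanisms: generator-based knowledge sharing via distillation, and low-rank decomposition for mining domain invariance. The sketch below illustrates both ideas in PyTorch under stated assumptions; the generator architecture, the distillation temperature, and the rank threshold are all hypothetical choices for illustration, not the authors' implementation.

```python
# Hypothetical sketch of the two mechanisms described in the abstract.
# All architectural choices here are illustrative assumptions.
import torch
import torch.nn as nn

class Generator(nn.Module):
    """Maps noise plus a class label to a fake feature vector (assumed design)."""
    def __init__(self, noise_dim=64, num_classes=10, feat_dim=256):
        super().__init__()
        self.embed = nn.Embedding(num_classes, noise_dim)
        self.net = nn.Sequential(
            nn.Linear(noise_dim, 128), nn.ReLU(),
            nn.Linear(128, feat_dim),
        )

    def forward(self, z, y):
        # Condition the noise on the class embedding, then map to features.
        return self.net(z * self.embed(y))

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """Standard knowledge-distillation loss: match softened distributions."""
    p_t = torch.softmax(teacher_logits / T, dim=1)
    log_p_s = torch.log_softmax(student_logits / T, dim=1)
    return nn.functional.kl_div(log_p_s, p_t, reduction="batchmean") * T * T

def low_rank_invariant(features, rank=8):
    """Truncated SVD of a feature batch: keep only the top singular
    directions as a crude proxy for the domain-invariant subspace."""
    U, S, Vh = torch.linalg.svd(features, full_matrices=False)
    S_trunc = torch.zeros_like(S)
    S_trunc[:rank] = S[:rank]
    return U @ torch.diag(S_trunc) @ Vh

if __name__ == "__main__":
    gen = Generator()
    z = torch.randn(32, 64)
    y = torch.randint(0, 10, (32,))
    fake = gen(z, y)                       # synthetic features from a peer's generator
    real = torch.randn(32, 256)            # stand-in for real local features
    mixed = torch.cat([fake, real], 0)
    invariant = low_rank_invariant(mixed)  # low-rank projection of the batch
    print(invariant.shape)                 # torch.Size([64, 256])
```

In such a scheme, each client would mix fake samples drawn from other clients' generators with its own real data, distill its local model against the shared knowledge, and apply the low-rank step to suppress client-specific (domain-specific) feature components.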
