Abstract

Federated learning is an emerging technology that enables multiple clients to cooperatively train an intelligent diagnostic model while preserving data privacy. However, in practical deployments, federated diagnostic models still suffer a performance drop when applied to entirely unseen clients outside the federation. To address this issue, a Federated Distillation Domain Generalization (FDDG) framework is proposed for machinery fault diagnosis. The core idea is to let individual clients access multi-client data distributions in a privacy-preserving manner and to further exploit domain invariance to enhance model generalization. A novel diagnostic knowledge-sharing mechanism based on knowledge distillation is designed, in which multiple generators synthesize fake data to augment the training of local models. A low-rank decomposition method is then applied to the generated and real data to mine domain invariance, strengthening the model's resistance to domain shift. Extensive experiments on two rotating machines demonstrate that the proposed FDDG achieves a 3% improvement in accuracy over state-of-the-art methods.
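
The two ingredients the abstract names, generator-based knowledge sharing and low-rank invariance mining, can be pictured with a minimal PyTorch sketch. Everything below is an illustrative assumption rather than the paper's actual implementation: the `Generator` and `Classifier` architectures, the choice to generate in feature space, the nuclear-norm surrogate in `low_rank_invariance_loss`, the 0.1 loss weight, and all dimensions are hypothetical placeholders.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical dimensions, chosen only for the sketch.
FEATURE_DIM, NUM_CLASSES, NOISE_DIM = 64, 10, 16

class Generator(nn.Module):
    """A per-client conditional generator, shared across the federation
    in place of raw data (here it synthesizes feature vectors)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(NOISE_DIM + NUM_CLASSES, 128), nn.ReLU(),
            nn.Linear(128, FEATURE_DIM),
        )

    def forward(self, z, y_onehot):
        return self.net(torch.cat([z, y_onehot], dim=1))

class Classifier(nn.Module):
    """A local diagnostic model split into encoder and classification head."""
    def __init__(self, in_dim):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, FEATURE_DIM), nn.ReLU())
        self.head = nn.Linear(FEATURE_DIM, NUM_CLASSES)

    def encode(self, x):
        return self.encoder(x)

def low_rank_invariance_loss(features, rank=4):
    """Low-rank surrogate: penalize singular values beyond the top `rank`,
    pushing pooled cross-domain features toward a shared low-rank subspace."""
    s = torch.linalg.svdvals(features)
    return s[rank:].sum()

def local_step(model, real_x, real_y, peer_generators, optimizer):
    """One local update on real data plus synthetic data drawn from peers'
    generators, with a low-rank penalty on the pooled feature matrix."""
    optimizer.zero_grad()
    feats, labels = [model.encode(real_x)], [real_y]
    for gen in peer_generators:  # distilled knowledge from other clients
        z = torch.randn(real_x.size(0), NOISE_DIM)
        y = torch.randint(0, NUM_CLASSES, (real_x.size(0),))
        feats.append(gen(z, F.one_hot(y, NUM_CLASSES).float()))
        labels.append(y)
    all_feats, all_y = torch.cat(feats), torch.cat(labels)
    loss = F.cross_entropy(model.head(all_feats), all_y)
    loss = loss + 0.1 * low_rank_invariance_loss(all_feats)
    loss.backward()
    optimizer.step()
    return loss.item()
```

In the full framework each client would also distill its own generator and upload it in place of raw data each communication round; that server-side round-trip is omitted here for brevity.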
