Abstract

Industrial robots generate monitoring data rich in sensitive information, which often makes enterprises reluctant to share it and impedes the use of such data in fault diagnosis modeling. Dataset distillation (DD) is an effective approach for condensing a large dataset into a smaller, synthesized form that focuses solely on fault-related features, facilitating secure and efficient data transfer for diagnostic purposes. However, achieving satisfactory fault diagnosis accuracy with distilled data is challenging because of the computational complexity of the distillation process. To address this problem, this article proposes a modified KernelWarehouse (MKW) network-based DD method to achieve accurate fault diagnosis with the distilled dataset. In this algorithm, DD first generates distilled training and testing datasets, and an MKW-based network is then trained on these distilled datasets. Specifically, MKW reduces network complexity by dividing static kernels into disjoint kernel cells, which are computed as linear mixtures of cells drawn from a shared warehouse. An experimental study on a real-world robotic dataset demonstrates the effectiveness of the proposed approach. The experimental results indicate that the proposed method achieves a fault diagnosis accuracy of 86.3% when trained only on distilled data.
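To make the kernel-cell idea concrete, the following is a minimal PyTorch-style sketch of a convolutional layer whose kernel is assembled as linear mixtures of cells from a shared warehouse. All names, shapes, and the attention head below are illustrative assumptions for exposition only, not the paper's actual MKW implementation.

```python
# Minimal sketch of warehouse-based kernel mixing (illustrative only; not the paper's MKW code).
import torch
import torch.nn as nn
import torch.nn.functional as F

class WarehouseConv2d(nn.Module):
    """Conv layer whose static kernel is split into disjoint kernel cells,
    each computed as a linear mixture of cells from a shared warehouse."""
    def __init__(self, in_ch, out_ch, k=3, num_cells=8, cell_out=4, cell_in=4):
        super().__init__()
        assert out_ch % cell_out == 0 and in_ch % cell_in == 0
        self.in_ch, self.out_ch, self.k = in_ch, out_ch, k
        self.cell_out, self.cell_in = cell_out, cell_in
        # Shared warehouse: num_cells small kernel cells of shape (cell_out, cell_in, k, k).
        self.warehouse = nn.Parameter(0.02 * torch.randn(num_cells, cell_out, cell_in, k, k))
        # Number of disjoint cells the full (out_ch, in_ch, k, k) kernel is partitioned into.
        self.n_slots = (out_ch // cell_out) * (in_ch // cell_in)
        # Lightweight attention head producing per-slot mixture weights from the input.
        self.attn = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(in_ch, self.n_slots * num_cells),
        )

    def forward(self, x):
        b = x.size(0)
        # Mixture weights over warehouse cells for every kernel-cell slot.
        w = self.attn(x).view(b, self.n_slots, -1).softmax(dim=-1)
        # Simplification: average over the batch so one kernel is built per forward pass.
        w = w.mean(dim=0)                                    # (n_slots, num_cells)
        # Each kernel cell is a linear mixture of the shared warehouse cells.
        cells = torch.einsum('sn,noihw->soihw', w, self.warehouse)
        go, gi = self.out_ch // self.cell_out, self.in_ch // self.cell_in
        # Reassemble the disjoint cells into the full convolution kernel.
        kernel = (cells.view(go, gi, self.cell_out, self.cell_in, self.k, self.k)
                        .permute(0, 2, 1, 3, 4, 5)
                        .reshape(self.out_ch, self.in_ch, self.k, self.k))
        return F.conv2d(x, kernel, padding=self.k // 2)

# Usage sketch: drop-in replacement for a standard conv layer in a diagnostic CNN.
layer = WarehouseConv2d(in_ch=16, out_ch=32)
out = layer(torch.randn(4, 16, 64, 64))   # -> (4, 32, 64, 64)
```

Because many kernel cells share one small warehouse, the number of independent kernel parameters is reduced, which is the complexity-reduction mechanism the abstract attributes to MKW; the batch-averaged attention above is a simplification made only to keep the sketch short.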