Abstract

Deep transfer learning has been widely applied to the remaining useful life prediction of rolling bearings because it reduces the dependence on massive labeled data and robustly extracts domain-adaptive features. Most existing studies perform remaining useful life prediction across different working conditions within the same machine; consequently, these methods are unsuitable for data with severe distribution drifts. Only a few studies investigate remaining useful life prediction across different devices, and their success depends on the prerequisite that a large amount of labeled data is available. In brief, the applicability and prediction performance of these models are significantly constrained in industrial scenarios. To address these drawbacks, a multi-source adversarial online knowledge distillation approach is proposed for rolling bearing remaining useful life prediction across machines. A dual knowledge transfer mechanism, comprising multi-level domain adaptation and online knowledge distillation with dynamic weighting, is designed within an ensemble learning architecture. The proposed approach enables more comprehensive extraction and transfer of prognostic knowledge across multiple working conditions and different machines, thereby improving the performance of remaining useful life prediction. Experiments on two open-access rolling bearing datasets demonstrate the effectiveness and superiority of the proposed approach in terms of prediction accuracy.
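The abstract does not detail the dynamic weighting scheme, but the idea of online knowledge distillation with dynamically weighted source models can be sketched as follows. This is an illustrative assumption, not the paper's actual method: the function names (`dynamic_weights`, `ensemble_teacher`, `distillation_loss`), the softmax-over-negative-errors weighting, and the MSE-based distillation for RUL regression are all hypothetical choices made for the example.

```python
import numpy as np

def dynamic_weights(source_errors, temperature=1.0):
    # Hypothetical scheme: softmax over negative validation errors,
    # so source models with lower error receive higher ensemble weight.
    logits = -np.asarray(source_errors, dtype=float) / temperature
    logits -= logits.max()  # numerical stability
    w = np.exp(logits)
    return w / w.sum()

def ensemble_teacher(source_preds, weights):
    # Weighted combination of per-source RUL predictions -> soft teacher target.
    return np.tensordot(weights, np.asarray(source_preds, dtype=float), axes=1)

def distillation_loss(student_pred, teacher_pred, target, alpha=0.5):
    # Blend a supervised term (where labels exist) with an MSE
    # distillation term against the ensemble teacher.
    sup = np.mean((student_pred - target) ** 2)
    dist = np.mean((student_pred - teacher_pred) ** 2)
    return alpha * sup + (1 - alpha) * dist
```

In an online setting, the weights would be recomputed each training step as the source models and the student are updated jointly, so the ensemble teacher tracks whichever sources currently transfer best to the target machine.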
