Abstract

Intelligent fault diagnosis models based on transfer learning can achieve cross-domain fault identification with small training samples. Existing cross-domain diagnosis models assume that the source and target data share the same health condition label space. In practice, however, new fault types can occur at the application stage, which is usually defined as an open-set fault diagnosis problem. In this study, a one-stage self-supervised momentum contrastive learning model (OSSMCL) is proposed for open-set cross-domain fault diagnosis. A momentum encoder based on self-supervised contrastive learning is designed to capture the distinguishing features between sample pairs. Unlike common self-supervised pipelines, the proposed method is a one-stage framework that incorporates a meta-learning paradigm, through which OSSMCL learns to identify new faults from a few labeled samples in the target domain. The one-stage framework achieves cross-domain fault diagnosis directly without network fine-tuning, which reduces the risk of overfitting under limited target data. The effectiveness of the proposed model is evaluated on three open-set cross-domain fault diagnosis experiments. Compared with state-of-the-art methods, the proposed model obtains higher diagnostic accuracy in the open-set scenario with small training samples.
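Momentum contrastive learning of the kind the abstract describes generally pairs a query encoder with a slowly updated momentum (key) encoder and trains on a contrastive objective over sample pairs. The sketch below is illustrative only and is not the paper's implementation: the function names, the exponential-moving-average coefficient, and the temperature are assumptions chosen to show the standard MoCo-style recipe.

```python
import numpy as np

def momentum_update(key_params, query_params, m=0.999):
    """EMA update of the momentum (key) encoder's weights from the
    query encoder's weights: theta_k <- m*theta_k + (1-m)*theta_q.
    (m=0.999 is an assumed, typical value, not the paper's setting.)"""
    return [m * k + (1.0 - m) * q for k, q in zip(key_params, query_params)]

def info_nce_loss(q, k_pos, k_negs, tau=0.07):
    """InfoNCE contrastive loss for one L2-normalised query vector `q`
    against one positive key and a bank of negative keys.
    Lower loss when `q` aligns with the positive key."""
    l_pos = q @ k_pos / tau        # similarity to the positive key
    l_neg = k_negs @ q / tau       # similarities to the negative keys
    logits = np.concatenate(([l_pos], l_neg))
    # cross-entropy with the positive placed at index 0
    return -logits[0] + np.log(np.exp(logits).sum())
```

In this recipe the momentum encoder supplies slowly drifting keys, so the loss pulls a query toward its augmented positive pair and pushes it from the negatives, which is the mechanism for learning distinguishing features between sample pairs.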

