Abstract

In recent years, the Transformer has become an effective tool for fault diagnosis, but training a Transformer model usually requires a sufficient amount of labeled data. In practice, however, only a few labeled samples can be obtained from the actual industrial process, and labeling a large quantity of training samples is costly. To reduce the demand for labeled training samples, this paper proposes a mask self-supervised learning-based Transformer (MSFormer) for bearing fault diagnosis of multistage centrifugal fans in petrochemical units under the condition of limited samples. In mask self-supervised learning, unlabeled samples are used to mine robust representations of fault signals and the latent relationships between sub-sequences, yielding a pre-trained model with well-generalized parameters. A few labeled samples are then used for supervised fine-tuning, giving MSFormer the discriminative ability to identify different bearing fault types. The effectiveness of the proposed method is validated on a multistage centrifugal fan dataset and the Case Western Reserve University motor bearing dataset. The experimental results demonstrate that MSFormer effectively reduces the number of labeled training samples required and, compared with state-of-the-art methods, achieves superior diagnostic performance under the condition of limited labeled samples.
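The two-stage scheme described above (masked pretraining on unlabeled signals, then supervised fine-tuning on a few labeled samples) can be sketched as follows. This is a minimal illustration, not the paper's architecture: the patch length, layer counts, mask ratio, and head designs are all hypothetical assumptions, since the abstract does not specify them.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

class MSFormerSketch(nn.Module):
    """Hypothetical sketch of a masked self-supervised Transformer.

    The abstract does not give the exact design; all dimensions and
    heads here are illustrative assumptions."""

    def __init__(self, patch_len=32, d_model=64, n_classes=4):
        super().__init__()
        self.embed = nn.Linear(patch_len, d_model)        # sub-sequence -> token
        self.mask_token = nn.Parameter(torch.zeros(d_model))
        enc_layer = nn.TransformerEncoderLayer(d_model, nhead=4,
                                               batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, num_layers=2)
        self.recon_head = nn.Linear(d_model, patch_len)   # pretraining head
        self.cls_head = nn.Linear(d_model, n_classes)     # fine-tuning head

    def forward(self, patches, mask=None):
        tok = self.embed(patches)                         # (B, T, d_model)
        if mask is not None:                              # pretraining path:
            tok[mask] = self.mask_token                   # hide masked sub-sequences
            h = self.encoder(tok)
            return self.recon_head(h)                     # reconstruct raw patches
        h = self.encoder(tok)                             # fine-tuning path
        return self.cls_head(h.mean(dim=1))               # class logits

# Stage 1: self-supervised pretraining on unlabeled vibration patches.
x = torch.randn(8, 10, 32)          # (batch, sub-sequences, patch length)
mask = torch.rand(8, 10) < 0.3      # mask roughly 30% of sub-sequences
model = MSFormerSketch()
recon = model(x, mask)
pretrain_loss = nn.functional.mse_loss(recon[mask], x[mask])

# Stage 2: supervised fine-tuning with a few labeled samples.
y = torch.randint(0, 4, (8,))       # fault-type labels (illustrative)
logits = model(x)
finetune_loss = nn.functional.cross_entropy(logits, y)
```

In such a scheme, the encoder weights learned in stage 1 are reused in stage 2, so only the small classification head must be learned from the limited labeled data.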
