Abstract

The transformer model is gradually being studied and applied in bearing fault diagnosis tasks, as it can overcome the feature extraction defects caused by long-term dependencies in convolutional neural networks (CNNs) and recurrent neural networks (RNNs). To optimize the structure of existing transformer-like methods and improve diagnostic accuracy, we propose a novel method based on a multiscale time-frequency sparse transformer (MTFST) in this paper. First, a novel tokenizer based on the short-time Fourier transform (STFT) is designed, which processes 1D raw signals into 2D discrete time-frequency sequences in the embedding space. Second, a sparse self-attention mechanism is designed to eliminate the feature mapping defect of the naive self-attention mechanism. Then, a novel encoder-decoder structure is presented: multiple encoders extract the hidden features of the different time-frequency sequences obtained by STFT with different window widths, and the decoder remaps the deep information and connects to the classifier for discriminating fault types. The proposed method is tested on the XJTU-SY bearing dataset and a self-made experiment rig dataset, and the following work is conducted. The influences of hyperparameters on diagnostic accuracy and the number of parameters are analysed in detail. The weights of the attention mechanism (AM) are visualized and analysed to study interpretability, which partly explains the working pattern of the network. In comparison tests with existing CNN, RNN, and transformer models, the diagnostic accuracy of the different methods is statistically analysed, feature vectors are visualized via the t-distributed stochastic neighbor embedding (t-SNE) method, and the proposed MTFST obtains the best accuracy and feature distribution. The results demonstrate the effectiveness and superiority of the proposed method for bearing fault diagnosis.
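The multiscale tokenization step described above can be sketched as follows. This is an illustrative reconstruction, not the paper's implementation: the window widths, sampling rate, and the use of magnitude spectrograms as token embeddings are assumptions for demonstration purposes only.

```python
# Sketch of a multiscale STFT tokenizer: a 1D raw vibration signal is
# converted into several 2D time-frequency token sequences, one per STFT
# window width. Window widths and sampling rate are hypothetical.
import numpy as np
from scipy.signal import stft


def multiscale_stft_tokens(signal, fs, window_widths=(64, 128, 256)):
    """Return a list of 2D arrays, one per window width.

    Each array has shape (num_frames, num_freq_bins): every time frame
    of the magnitude spectrogram becomes one token whose embedding is
    its vector of frequency-bin magnitudes.
    """
    token_sets = []
    for nperseg in window_widths:
        _, _, Zxx = stft(signal, fs=fs, nperseg=nperseg)
        tokens = np.abs(Zxx).T  # shape: (num_frames, nperseg // 2 + 1)
        token_sets.append(tokens)
    return token_sets


# Example: a 1-second synthetic signal sampled at 12.8 kHz.
fs = 12800
t = np.arange(fs) / fs
x = np.sin(2 * np.pi * 500 * t) + 0.1 * np.random.randn(fs)
tokens = multiscale_stft_tokens(x, fs)
for tok in tokens:
    print(tok.shape)
```

A wider window yields finer frequency resolution but fewer, coarser time frames, which is why feeding each scale to its own encoder (as the abstract describes) lets the model capture complementary time-frequency detail.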
