Abstract

As a representative deep learning method, Transformers have recently shown great prowess in intelligent fault diagnosis, offering powerful feature extraction and modeling. However, their high computational demand and limited robustness hinder industrial application. Therefore, this paper proposes an innovative Neural-Transformer to realize high-precision, robust fault diagnosis at low computational cost. First, a two-dimensional representation method, the frequency-slice wavelet transform (FSWT), is introduced to reflect the dynamic characteristics and frequency-component variations of signals, enhancing the fault identifiability of vibration signals. Second, a separable multiscale spiking tokenizer (SMST) is developed to project time-frequency inputs of multiple scales onto spike features with a fixed patch size, ensuring consistency in feature extraction and improving the recognizability of fault-specific frequencies. Subsequently, a multi-head spatiotemporal spiking self-attention (MHSSSA) mechanism is constructed, which abandons costly multiplication operations while still attending to key fine-grained time-frequency features at a global range. Experimental cases validate the advantages of the Neural-Transformer over baseline and state-of-the-art methods on one public dataset and two real-world datasets. In particular, the proposed method consumes only 0.65 mJ of energy to achieve an optimal diagnostic accuracy of 93.14% on a real-world dataset.
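The key idea behind multiplication-free spiking attention can be illustrated with a minimal NumPy sketch. This is not the paper's MHSSSA implementation; the tensor shapes, the firing threshold, and the Heaviside re-binarization step are illustrative assumptions. The point it demonstrates is that when query, key, and value tensors are binary spike trains, the matrix products in attention reduce to integer counting, i.e. pure accumulate (AC) operations rather than multiply-accumulate (MAC) operations:

```python
import numpy as np

rng = np.random.default_rng(0)
T, N, D = 4, 8, 16  # time steps, tokens, feature dim (illustrative sizes)

# Binary spike tensors standing in for Q, K, V (entries in {0, 1})
Q = (rng.random((T, N, D)) > 0.5).astype(np.int64)
K = (rng.random((T, N, D)) > 0.5).astype(np.int64)
V = (rng.random((T, N, D)) > 0.5).astype(np.int64)

def spiking_attention(Q, K, V, threshold=6):
    # With 0/1 entries, Q @ K^T merely counts co-active features per
    # token pair, so on neuromorphic hardware it needs no multipliers.
    scores = Q @ K.transpose(0, 2, 1)   # (T, N, N) integer counts
    out = scores @ V                    # again pure accumulation
    # A Heaviside step stands in for the spiking neuron that
    # re-binarizes the output into spikes (threshold is an assumption).
    return (out >= threshold).astype(np.int64)

out = spiking_attention(Q, K, V)
print(out.shape)  # (4, 8, 16)
```

Because no softmax or floating-point scaling is involved, the whole operation stays in the integer/spike domain, which is the source of the energy savings the abstract reports.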
