Abstract

To address the shortcomings of traditional fault diagnosis methods, namely that they fail to represent the temporal correlation between signals, achieve low recognition accuracy under complex working conditions and noise interference, and require too many parameters, a bearing fault diagnosis method based on a mixed attention mechanism (MAM) and a depthwise separable dilated convolutional neural network (DSDCNN) is proposed. Firstly, a Markov transition field encoding method is used to transform the original one-dimensional vibration signal into a two-dimensional feature image with temporal correlation. Secondly, a depthwise separable dilated convolution algorithm is presented, which combines the low computational complexity of depthwise separable convolution with the ability of dilated convolution to expand the receptive field without increasing the number of parameters. Then, the MAM is designed so that the model captures the feature dependencies of the feature map in both the spatial and channel dimensions, and the MAM-DSDCNN model is constructed. Finally, the fault diagnosis performance of the proposed model is verified on two different data sets. The results show that the average recognition accuracy of MAM-DSDCNN reaches 99.63% under variable load conditions, 99.42% under variable speed conditions, and 94.26% in a noisy environment with a signal-to-noise ratio of 0 dB, which proves that the model has higher recognition accuracy, stronger generalization, and better noise immunity than other deep learning algorithms.
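
As a rough illustration of the encoding step, the sketch below shows one common way to compute a Markov transition field in Python with NumPy. The function name, the quantile-binning scheme, and the `n_bins` parameter are assumptions for this example and are not taken from the paper:

```python
import numpy as np

def markov_transition_field(signal, n_bins=8):
    """Encode a 1-D signal as a Markov transition field (MTF) image.

    Each sample is quantized into one of `n_bins` amplitude bins
    (hypothetical choice for this sketch); the empirical first-order
    transition probabilities between bins are then spread over every
    pair of time steps, yielding an N x N image that preserves the
    temporal correlation of the original signal.
    """
    # Quantile binning: assign each sample to an amplitude bin 0..n_bins-1.
    edges = np.quantile(signal, np.linspace(0, 1, n_bins + 1)[1:-1])
    states = np.digitize(signal, edges)

    # First-order Markov transition matrix W, W[i, j] = P(bin j | bin i).
    W = np.zeros((n_bins, n_bins))
    for s, t in zip(states[:-1], states[1:]):
        W[s, t] += 1
    W /= np.maximum(W.sum(axis=1, keepdims=True), 1)

    # MTF: M[i, j] is the transition probability from the bin of x_i
    # to the bin of x_j, for every pair of time indices (i, j).
    return W[states[:, None], states[None, :]]
```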
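
The convolution block can be sketched in PyTorch as a dilated depthwise convolution followed by a pointwise convolution; the class name, channel sizes, and the batch-norm/ReLU ordering are assumptions for this minimal example rather than the paper's exact architecture:

```python
import torch
import torch.nn as nn

class DepthwiseSeparableDilatedConv(nn.Module):
    """Depthwise separable convolution with a dilated depthwise stage.

    The 3x3 depthwise convolution uses dilation to enlarge the receptive
    field without adding parameters; the 1x1 pointwise convolution then
    mixes channels, keeping the parameter count far below that of a
    standard convolution with the same receptive field.
    """
    def __init__(self, in_ch, out_ch, dilation=2):
        super().__init__()
        self.depthwise = nn.Conv2d(
            in_ch, in_ch, kernel_size=3, padding=dilation,
            dilation=dilation, groups=in_ch, bias=False)
        self.pointwise = nn.Conv2d(in_ch, out_ch, kernel_size=1, bias=False)
        self.bn = nn.BatchNorm2d(out_ch)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        return self.act(self.bn(self.pointwise(self.depthwise(x))))
```

With `padding=dilation` and a 3x3 kernel, the spatial size is preserved, so blocks with increasing dilation rates can be stacked to grow the receptive field at constant parameter cost.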
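
The abstract states only that the MAM attends over both the channel and spatial dimensions; one widely used realization of that idea is a CBAM-style module, sketched below under that assumption (the module structure, reduction ratio, and 7x7 spatial kernel are illustrative, not the paper's confirmed design):

```python
class MixedAttention(nn.Module):
    """Channel attention followed by spatial attention (CBAM-style sketch)."""
    def __init__(self, channels, reduction=8):
        super().__init__()
        # Shared MLP (1x1 convs) for channel attention; requires
        # channels >= reduction so the bottleneck stays non-empty.
        self.channel_mlp = nn.Sequential(
            nn.Conv2d(channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1))
        self.spatial_conv = nn.Conv2d(2, 1, kernel_size=7, padding=3)

    def forward(self, x):
        # Channel attention: squeeze spatial dims via avg and max pooling.
        avg = self.channel_mlp(x.mean(dim=(2, 3), keepdim=True))
        mx = self.channel_mlp(x.amax(dim=(2, 3), keepdim=True))
        x = x * torch.sigmoid(avg + mx)
        # Spatial attention: pool across channels, then a 7x7 conv.
        s = torch.cat([x.mean(dim=1, keepdim=True),
                       x.amax(dim=1, keepdim=True)], dim=1)
        return x * torch.sigmoid(self.spatial_conv(s))
```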
