Abstract

Multi-scale convolutional neural network structures consisting of parallel convolution paths with different kernel sizes have been developed to extract features from multiple temporal scales and applied to fault diagnosis of rotating machines. However, when features from all temporal scales are used to the same extent inside the network, good diagnostic performance may not be guaranteed, because features at scales weakly related to faults can degrade the learned representation. To address this issue, this paper presents a novel architecture, called a multi-scale path attention residual network, that further enhances the feature representational ability of a multi-scale structure. The multi-scale path attention residual network adopts a path attention module after a multi-scale dilated convolution layer, assigning different weights to features from different convolution paths. In addition, the network stacks multi-scale attention residual blocks to continuously extract meaningful multi-scale features and relationships between scales. The effectiveness of the proposed method is verified on a helical gearbox vibration dataset and a permanent magnet synchronous motor current dataset. The results show that the proposed multi-scale path attention residual network improves the feature learning ability of the multi-scale structure and achieves better fault diagnosis performance.
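
As a rough illustration of the architecture summarized above, the sketch below shows one possible multi-scale path attention block in PyTorch: parallel dilated convolutions over different temporal scales, a learned weight per path, and a residual connection. The class name, channel sizes, dilation rates, and the pooling-based attention design are assumptions made for illustration, not the authors' exact implementation.

```python
# Minimal sketch of a multi-scale path attention residual block (illustrative only).
import torch
import torch.nn as nn


class MultiScalePathAttentionBlock(nn.Module):
    """Parallel dilated 1-D convolutions followed by learned per-path weights."""

    def __init__(self, in_channels: int, out_channels: int, dilations=(1, 2, 4)):
        super().__init__()
        # One convolution path per dilation rate (i.e., per temporal scale).
        self.paths = nn.ModuleList([
            nn.Sequential(
                nn.Conv1d(in_channels, out_channels, kernel_size=3,
                          padding=d, dilation=d),
                nn.BatchNorm1d(out_channels),
                nn.ReLU(inplace=True),
            )
            for d in dilations
        ])
        # Path attention (assumed design): pool each path's feature map, then a
        # small fully connected network produces one softmax weight per path.
        self.attention = nn.Sequential(
            nn.Linear(len(dilations) * out_channels, out_channels),
            nn.ReLU(inplace=True),
            nn.Linear(out_channels, len(dilations)),
            nn.Softmax(dim=1),
        )
        # 1x1 convolution on the shortcut when channel counts differ.
        self.shortcut = (
            nn.Conv1d(in_channels, out_channels, kernel_size=1)
            if in_channels != out_channels else nn.Identity()
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, in_channels, length)
        feats = [path(x) for path in self.paths]            # per-scale features
        pooled = torch.cat([f.mean(dim=2) for f in feats], dim=1)
        weights = self.attention(pooled)                     # (batch, n_paths)
        fused = sum(w.view(-1, 1, 1) * f
                    for w, f in zip(weights.unbind(dim=1), feats))
        return torch.relu(fused + self.shortcut(x))          # residual connection


if __name__ == "__main__":
    block = MultiScalePathAttentionBlock(in_channels=1, out_channels=16)
    signal = torch.randn(8, 1, 1024)   # e.g., a batch of vibration segments
    print(block(signal).shape)         # torch.Size([8, 16, 1024])
```

Stacking several such blocks, as the abstract describes, would let later blocks reweight scale-specific features extracted by earlier ones.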
