Abstract

Sensors mounted on high-speed trains provide large amounts of data that support condition monitoring of the train. However, it is challenging to determine whether a fault originates from a mechanical component or from the sensors themselves. The main difficulties are the imbalance between normal and fault data and the insufficient exploitation of the inherent correlation among multi-source data. In this paper, we propose a Bayesian convolutional neural network (CNN)-based fusion framework to improve the identification of sensor faults. The framework uses wavelet time–frequency maps to extract abnormal features, employs a Bayesian CNN to obtain spatial features from each individual sensor, fuses the multi-source features with a bidirectional long short-term memory (BiLSTM) network, and strengthens the resulting spatial and temporal features with an attention mechanism. The enhanced representation then enables precise identification of sensor faults. The proposed feature-level fusion framework and its attention mechanism help uncover the inherent correlation among sensors and filter out irrelevant information. Results indicate that the proposed method achieves 95.4% accuracy, outperforming methods that rely on feature extraction from single-source sensors by 7.8%.
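To make the described pipeline concrete, the sketch below outlines one possible realization of the fusion architecture in PyTorch: per-sensor CNN features from wavelet scalograms, BiLSTM fusion across sensors, attention weighting, and a fault classifier. It is an illustrative sketch only; the layer sizes, the class name `BayesianCNNBiLSTMFusion`, and the use of Monte Carlo dropout as a stand-in for the paper's Bayesian CNN are assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn


class BayesianCNNBiLSTMFusion(nn.Module):
    """Illustrative sketch: Bayesian-style CNN per sensor, BiLSTM fusion,
    attention pooling, fault classifier. Layer sizes are assumptions."""

    def __init__(self, n_classes=2, feat_dim=64, hidden=64, mc_dropout=0.25):
        super().__init__()
        # Per-sensor CNN over a wavelet time-frequency map (1 x H x W scalogram).
        # Dropout2d kept active at test time (MC dropout) is a common
        # approximation of a Bayesian CNN; the paper's exact formulation may differ.
        self.cnn = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.Dropout2d(mc_dropout),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.Dropout2d(mc_dropout),
            nn.AdaptiveAvgPool2d(1),
        )
        self.proj = nn.Linear(32, feat_dim)
        # BiLSTM fuses the per-sensor features treated as a sequence.
        self.bilstm = nn.LSTM(feat_dim, hidden, batch_first=True, bidirectional=True)
        # Attention scores re-weight the fused spatio-temporal features.
        self.attn = nn.Linear(2 * hidden, 1)
        self.classifier = nn.Linear(2 * hidden, n_classes)

    def forward(self, x):
        # x: (batch, n_sensors, 1, H, W) -- one scalogram per sensor
        b, s = x.shape[:2]
        feats = self.cnn(x.flatten(0, 1)).flatten(1)      # (b*s, 32)
        feats = self.proj(feats).view(b, s, -1)           # (b, s, feat_dim)
        fused, _ = self.bilstm(feats)                     # (b, s, 2*hidden)
        weights = torch.softmax(self.attn(fused), dim=1)  # attention over sensors
        context = (weights * fused).sum(dim=1)            # (b, 2*hidden)
        return self.classifier(context)


# Usage: 4 samples, 6 sensor channels, 64x64 wavelet scalograms (all assumed sizes).
model = BayesianCNNBiLSTMFusion()
logits = model(torch.randn(4, 6, 1, 64, 64))
print(logits.shape)  # torch.Size([4, 2])
```

The attention layer here pools across the sensor dimension so that informative channels dominate the fused representation, mirroring the abstract's claim that the attention mechanism filters out irrelevant information.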
