Abstract

Long-term remaining useful life (RUL) prediction is essential for the maintenance of safety-critical engineering assets. Deep learning (DL) models, especially Transformer-based models, have achieved outstanding performance in long-term RUL prediction. However, existing Transformer models neglect the impact of discrepancy loss during model training. The accumulation of this discrepancy loss during inference hampers the generalization of the prediction model, resulting in an overfitting problem. To address this problem, this paper proposes a Bayesian Adversarial Probsparse Transformer (BAPT) model for long-term RUL prediction. Firstly, adversarial learning is leveraged to mitigate the accumulated discrepancy loss caused by varying working conditions in long-term prediction, thus diminishing error accumulation. Secondly, Probsparse multi-head attention is adopted to enhance the efficiency of feature extraction: it focuses on the most significant degradation features in long time series, reducing the computational complexity. Lastly, a Bayesian neural network is introduced to quantify the uncertainty in RUL prediction. The effectiveness of the proposed model is verified on two commercial aircraft turbofan engine datasets. The results indicate that the BAPT model outperforms existing state-of-the-art models in long-term RUL prediction.
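The abstract does not define the Probsparse attention mechanism itself. As a rough illustration of the general idea behind this family of sparse attention (ranking queries by a max-minus-mean sparsity score and letting only the top-u queries attend to the keys, with the remaining queries falling back to the mean of the values), the following is a minimal single-head NumPy sketch; it is an assumption-laden simplification (no key sampling, no multi-head split) and not the paper's actual implementation.

```python
import numpy as np

def probsparse_attention(Q, K, V, u):
    """Simplified ProbSparse-style attention (single head, no key sampling).

    Q, K, V: arrays of shape (L, d). Only the top-u "active" queries, ranked by
    a max-minus-mean sparsity score, attend to all keys; the remaining "lazy"
    queries are replaced by the mean of V.
    """
    L, d = Q.shape
    scores = Q @ K.T / np.sqrt(d)                         # (L, L) scaled dot products
    sparsity = scores.max(axis=1) - scores.mean(axis=1)   # query sparsity measurement
    top = np.argsort(-sparsity)[:u]                       # u most informative queries

    out = np.tile(V.mean(axis=0), (L, 1))                 # lazy queries: mean of values
    attn = np.exp(scores[top])                            # softmax over keys for active queries
    attn /= attn.sum(axis=1, keepdims=True)
    out[top] = attn @ V
    return out

# Toy usage on a random sequence of degradation features (hypothetical shapes)
rng = np.random.default_rng(0)
L, d = 64, 16
Q = K = V = rng.standard_normal((L, d))
print(probsparse_attention(Q, K, V, u=8).shape)           # (64, 16)
```

Computing full attention only for u << L queries is what yields the complexity reduction the abstract refers to; the choice of u and the treatment of the remaining queries here are illustrative assumptions.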
