Abstract
Accurately predicting the remaining useful life (RUL) of rocket engine components has become crucial for ensuring stable and safe operation under extreme working environments. However, current RUL prediction approaches based on convolutional and recurrent frameworks lack effective feature extraction methods for modeling long-term dependencies, resulting in limited accuracy and generalizability. To address this issue, we propose an end-to-end temporal Transformer augmented with an autocorrelated attention mechanism for RUL prediction of turbopump bearings. The Transformer module is adopted as the backbone of the proposed framework to model long-term dependencies in the raw signals. To further enhance predictive capability, we develop a self-attention mechanism based on autocorrelation calculation, which extracts and aggregates feature representations through similarity comparison at the sub-series level. Furthermore, we utilize convolutional layers with residual links to capture internal detail features, compensating for the Transformer's limitations in capturing local information. The proposed framework is evaluated on a life-cycle rocket engine bearing dataset, and the experimental results demonstrate its effectiveness and superiority in RUL prediction.
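The autocorrelation-based attention described above can be illustrated with a minimal sketch. The snippet below is an assumption about the general technique, not the authors' implementation: it computes series autocorrelation efficiently via FFT (Wiener–Khinchin theorem), selects the top-k time delays as the most similar sub-series shifts, and aggregates time-rolled copies of the series weighted by a softmax over their correlation scores. The function name, `top_k` parameter, and single-channel NumPy setting are all illustrative simplifications.

```python
import numpy as np

def autocorrelation_attention(x, top_k=3):
    """Illustrative autocorrelation-style aggregation over a 1-D series.

    Computes lag autocorrelation via FFT, picks the top-k lags (sub-series
    delays with the highest similarity), and aggregates time-shifted copies
    of the series weighted by a softmax over their correlation scores.
    """
    L = len(x)
    # Wiener-Khinchin: autocorrelation = inverse FFT of the power spectrum.
    # Zero-padding to 2L gives the linear (non-circular) autocorrelation.
    f = np.fft.rfft(x, n=2 * L)
    acf = np.fft.irfft(f * np.conj(f))[:L]
    # Skip lag 0 (trivial self-match) when selecting delays.
    lags = np.argsort(acf[1:])[::-1][:top_k] + 1
    scores = acf[lags]
    # Softmax over correlation scores -> aggregation weights.
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    # Aggregate rolled sub-series weighted by their correlation.
    out = sum(w * np.roll(x, -int(tau)) for w, tau in zip(weights, lags))
    return out, lags

# Usage: for a periodic signal, the dominant period should appear
# among the selected delays.
t = np.arange(64)
x = np.sin(2 * np.pi * t / 16)
out, lags = autocorrelation_attention(x)
print(sorted(lags))
```

In a full model this operation would replace the dot-product score in each attention head and act on learned query/key projections per channel; the sketch keeps only the delay-selection and aggregation idea.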