Abstract

Extracting meaningful features from multivariate time-series data remains challenging, since it must capture both the correlations between pairs of sensors and the temporal information within each time series. Meanwhile, large-scale industrial systems have evolved into data-rich environments, driving the rapid development and deployment of deep learning for machine Remaining Useful Life (RUL) prediction. RUL characterizes a system's behavior over its lifetime, that is, from the last inspection until the system's performance deteriorates beyond a certain threshold. RUL prediction has been addressed using Long Short-Term Memory (LSTM) networks and Convolutional Neural Networks (CNNs), particularly in complex tasks involving high-dimensional nonlinear data. The main focus, however, has been on simulated degradation data. In 2021, a new realistic run-to-failure turbofan engine degradation dataset was released that differs significantly from the simulation datasets. The key difference is that the flight duration varies from cycle to cycle, so existing deep learning techniques are ineffective at predicting RUL on real-world degradation data. We present a Self-Attention Transformer-Based Encoder model to address this problem. The encoder, together with a time-stamp encoding layer, extracts features from multiple sensors across time stamps in parallel. Self-attention enables efficient processing of long sequences and focuses on the key elements of the input time series; in the proposed Transformer model it is used to access global characteristics from diverse time-series representations. We conduct experiments on turbofan engine degradation data with variable-length inputs under real-world flight conditions. Empirical results indicate that the proposed approach estimates the RUL of turbofan engines effectively.
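To make the described architecture concrete, the sketch below shows one possible way to combine a time-stamp (positional) encoding with a self-attention Transformer encoder for RUL regression over variable-length sensor sequences. This is not the authors' implementation; the class name, layer sizes, and pooling strategy are illustrative assumptions.

```python
# Minimal sketch (assumed design, not the paper's code) of a Transformer-encoder
# RUL regressor for multivariate sensor sequences of variable length.
import torch
import torch.nn as nn


class RULTransformerEncoder(nn.Module):
    def __init__(self, n_sensors: int, d_model: int = 64, n_heads: int = 4,
                 n_layers: int = 2, max_len: int = 1000):
        super().__init__()
        # Project each time step's sensor vector into the model dimension.
        self.input_proj = nn.Linear(n_sensors, d_model)
        # Learned time-stamp (positional) embedding added to every step.
        self.pos_emb = nn.Embedding(max_len, d_model)
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=n_layers)
        # Regression head maps the pooled sequence representation to a scalar RUL.
        self.head = nn.Linear(d_model, 1)

    def forward(self, x: torch.Tensor, pad_mask: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, n_sensors); pad_mask: (batch, seq_len), True where padded.
        positions = torch.arange(x.size(1), device=x.device)
        h = self.input_proj(x) + self.pos_emb(positions)
        h = self.encoder(h, src_key_padding_mask=pad_mask)
        # Mean-pool only over valid (non-padded) time steps.
        valid = (~pad_mask).unsqueeze(-1).float()
        pooled = (h * valid).sum(dim=1) / valid.sum(dim=1).clamp(min=1.0)
        return self.head(pooled).squeeze(-1)


if __name__ == "__main__":
    model = RULTransformerEncoder(n_sensors=14)
    x = torch.randn(8, 200, 14)                       # padded batch of flight cycles
    pad_mask = torch.zeros(8, 200, dtype=torch.bool)  # no padding in this toy batch
    print(model(x, pad_mask).shape)                   # torch.Size([8])
```

The padding mask lets self-attention ignore padded steps, which is one common way to handle the variable flight durations mentioned above.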
