Abstract
Deep learning is well suited to modeling complex industrial data for soft sensors, owing to its powerful feature-representation capability. Given the nonlinear and dynamic nature of the process industry, the key challenge for soft sensor technology is to effectively mine dynamic information from long sequences and accurately extract quality-relevant features. A dual temporal attention mechanism-based convolutional long short-term memory network (DTA-ConvLSTM) under an encoder-decoder framework is proposed as a soft sensor model to acquire quality-relevant dynamic features from sequential data. Considering that process variables influence the prediction differently at different time steps and spatial locations, ConvLSTM and a temporal self-attention mechanism serve as the encoder, adaptively fusing spatiotemporal features and capturing the long-term dynamics of the process so as to track the trends of industrial variables. Furthermore, a quality-driven temporal attention mechanism is employed throughout decoding to dynamically select relevant features and more accurately track quality changes. The encoder-decoder model analyses the interactions between process and quality variables by incorporating dual-sequence dynamic information, thereby improving prediction performance. The validity and superiority of DTA-ConvLSTM were validated in two industrial case studies: a debutanizer column and a sulfur recovery unit. Compared with a traditional LSTM model, the proposed model achieved substantial improvements, with R² reaching 97.3% and 94.9% and root mean squared error reduced to 0.122 and 0.022, respectively.
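To make the core idea concrete, below is a minimal NumPy sketch of how a temporal attention mechanism weights a sequence of encoder hidden states by their relevance to a decoder query, yielding a context vector for prediction. Dot-product scoring and the function name `temporal_attention` are illustrative assumptions; the abstract does not specify the exact scoring function used in DTA-ConvLSTM.

```python
import numpy as np

def temporal_attention(H, q):
    """Attend over encoder states H of shape (T, d) given a query q of shape (d,).

    Dot-product scoring is assumed here for illustration; the actual
    attention scoring in DTA-ConvLSTM may differ.
    """
    scores = H @ q                 # (T,) relevance score for each time step
    scores = scores - scores.max() # numerical stability for the softmax
    weights = np.exp(scores)
    weights = weights / weights.sum()  # attention weights sum to 1
    context = weights @ H          # (d,) weighted combination of encoder states
    return weights, context

# Toy example: 5 time steps, 4-dimensional hidden states
rng = np.random.default_rng(0)
H = rng.standard_normal((5, 4))
q = rng.standard_normal(4)
weights, context = temporal_attention(H, q)
```

In the proposed model this mechanism is applied twice: self-attention over the ConvLSTM encoder states to capture long-term process dynamics, and quality-driven attention during decoding, where the query is derived from the quality-variable sequence.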