Abstract

The last two years have seen groundbreaking advances in natural language processing (NLP) with the advent of applications such as ChatGPT, Codex, and ChatSonic. This revolution is powered by cutting-edge transformer models that leverage multi-head attention mechanisms, positional encoding, and highly efficient transfer learning. Despite these remarkable advances, work remains to fully realize the practical applicability of transformers to chemical systems. We are therefore excited to present our latest work, which highlights the immense potential of transformers for non-trivial multivariate time-series prediction tasks with high-value implications in process monitoring, control, and optimization. Specifically, we demonstrated the impressive prediction capabilities of first-generation time-series transformers (TSTs) by developing and testing TSTs and benchmarking them against existing models. We further highlighted the practical applicability of TSTs by developing a first-of-its-kind TST-based model predictive controller (MPC). More importantly, the current work provides a concrete foundation for exploring promising new directions, such as large-scale TSTs that leverage transfer learning to model new process equipment, and plant-level multisource aggregative cognitive models for fault prognosis and prevention. We are excited to see what the future holds as we continue to push the boundaries of what is possible with these ‘transformer-tive’ technologies.
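As a concrete illustration of the architecture components named above (multi-head attention and positional encoding applied to multivariate time series), the following is a minimal PyTorch sketch of an encoder-only time-series transformer that maps a window of past process measurements to a multistep forecast. All class names, hyperparameters, and design choices here are illustrative assumptions for exposition, not the authors' implementation.

    # Minimal sketch of a time-series transformer (TST) for multivariate
    # prediction. Illustrative only; not the model described in the paper.
    import math
    import torch
    import torch.nn as nn

    class SinusoidalPositionalEncoding(nn.Module):
        """Standard sinusoidal positional encoding (Vaswani et al., 2017)."""
        def __init__(self, d_model: int, max_len: int = 512):
            super().__init__()
            position = torch.arange(max_len).float().unsqueeze(1)   # (max_len, 1)
            div_term = torch.exp(torch.arange(0, d_model, 2).float()
                                 * (-math.log(10000.0) / d_model))
            pe = torch.zeros(max_len, d_model)
            pe[:, 0::2] = torch.sin(position * div_term)
            pe[:, 1::2] = torch.cos(position * div_term)
            self.register_buffer("pe", pe)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # x: (batch, seq_len, d_model); add position information per step
            return x + self.pe[: x.size(1)]

    class TimeSeriesTransformer(nn.Module):
        """Encoder-only transformer mapping a lookback window of process
        measurements to a multistep forecast of the same variables."""
        def __init__(self, n_vars: int, horizon: int, d_model: int = 64,
                     nhead: int = 4, num_layers: int = 2):
            super().__init__()
            self.embed = nn.Linear(n_vars, d_model)   # per-step input projection
            self.pos = SinusoidalPositionalEncoding(d_model)
            layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
            self.encoder = nn.TransformerEncoder(layer, num_layers)
            self.head = nn.Linear(d_model, n_vars * horizon)
            self.horizon, self.n_vars = horizon, n_vars

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # x: (batch, lookback, n_vars) -> forecast (batch, horizon, n_vars)
            h = self.encoder(self.pos(self.embed(x)))
            return self.head(h[:, -1]).view(-1, self.horizon, self.n_vars)

    # Usage: forecast 10 future steps of 5 process variables from 48 past steps.
    model = TimeSeriesTransformer(n_vars=5, horizon=10)
    y_hat = model(torch.randn(8, 48, 5))              # -> shape (8, 10, 5)

In an MPC setting, a learned model of this form could serve as the predictive plant model, with manipulated inputs included among the input variables; the specifics of the TST-based MPC developed in this work are detailed in the full text.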
