Abstract
Transformer-based methods have been widely applied to long-term multivariate time series forecasting, achieving significant improvements in extending the forecasting horizon. However, their performance can degrade severely when abrupt trend shifts and seasonal fluctuations arise in long-term time series. We identify two bottlenecks in previous Transformer architectures: (1) a decomposition module that lacks robustness and (2) the trend-shifting problem. Both cause the trend prediction to follow a different distribution from the ground truth in long-term multivariate series forecasting. To address these bottlenecks, we design Robformer, a novel decomposition-based Transformer consisting of three new inner modules that enhance the predictability of Transformers. Concretely, we renew the decomposition module and add a seasonal component adjustment module to handle non-stationary series. Further, we propose a novel inner trend forecasting architecture inspired by polynomial fitting, which outperforms previous designs in accuracy and robustness. Our empirical studies show that Robformer achieves relative improvements of 17% and 10% over the state-of-the-art Autoformer and FEDformer baselines under a fair long-term multivariate setting on six benchmarks, covering five mainstream time series forecasting applications: energy, economics, traffic, weather, and disease. The code will be released upon publication.
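To make the two named ideas concrete, the following is a minimal sketch, not Robformer's actual modules: it pairs a moving-average series decomposition in the style of Autoformer/FEDformer with a low-order polynomial extrapolation of the extracted trend. The kernel size and polynomial degree are illustrative assumptions.

```python
import numpy as np

def decompose(x: np.ndarray, kernel: int = 25):
    """Split a 1-D series into a trend (moving average) and a seasonal residual."""
    pad = kernel // 2
    # Replicate the endpoints so the moving average keeps the series length.
    padded = np.concatenate([np.repeat(x[0], pad), x, np.repeat(x[-1], pad)])
    trend = np.convolve(padded, np.ones(kernel) / kernel, mode="valid")
    seasonal = x - trend
    return trend, seasonal

def extrapolate_trend(trend: np.ndarray, horizon: int, degree: int = 2):
    """Fit a low-order polynomial to the trend and predict `horizon` future steps."""
    t = np.arange(len(trend))
    coeffs = np.polyfit(t, trend, deg=degree)
    future_t = np.arange(len(trend), len(trend) + horizon)
    return np.polyval(coeffs, future_t)

if __name__ == "__main__":
    # Toy non-stationary series: a drifting trend plus a daily-like cycle and noise.
    rng = np.random.default_rng(0)
    t = np.arange(200)
    series = 0.01 * t**1.5 + np.sin(2 * np.pi * t / 24) + 0.1 * rng.standard_normal(200)
    trend, seasonal = decompose(series)
    trend_forecast = extrapolate_trend(trend, horizon=48)
    print(trend_forecast[:5])
```

In the paper's setting, the decomposition and trend extrapolation are learned inner modules of the Transformer rather than the fixed NumPy operations above; the sketch only illustrates the decompose-then-fit-trend principle the abstract describes.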