Abstract

Multivariate time series forecasting aims to predict time series data comprising several linked variables or characteristics, and is frequently used in stock forecasting, energy forecasting, and similar domains. The difficult task is to exploit more historical data when forecasting future values while improving the ability to mine relationships between and within sequences. Existing methods neglect the weighted influence of different neighbors and relation types on a sequence instance, as well as the semantic information of the instance itself, making inter-instance correlation measures inaccurate. Meanwhile, the length of the sequence input is limited and the variable features in a multivariate sequence are treated equally, so short-term and multi-feature interference cannot be eliminated, causing the learned time series features to deviate from the real ones. In this work, we propose a Multivariate Time Series Forecasting model that emphasizes Relationships between and within sequences (MTSFR). It uses the BERT model to characterize the textual attributes of instances and construct basic semantic embeddings, applies instance-level and relation-level attention to model topological relations among different instances, computes the cross-correlation of the multivariate sequences within each instance and performs attention weighting on the multivariate sequence encoding, and adopts the Transformer model to realize trend prediction for long-term multivariate series. Experimental results show that the F1 score of our approach reaches 68.50% and 74.66% on the CSI300 and S&P500 data sets respectively, both superior to the state-of-the-art techniques. Furthermore, the model is suited to small-scale sequence relationship modeling and is effective at handling long-term multivariate sequence forecasting problems.
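The intra-instance step described above — computing the cross-correlation of the variables within a multivariate sequence and using it to attention-weight the sequence encoding — can be illustrated with a minimal NumPy sketch. The function name, the use of Pearson correlation, and the softmax weighting are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_correlation_attention(series):
    """Attention-weight a multivariate sequence by inter-variable correlation.

    series: array of shape (T, D) — one instance's sequence of D variables.
    Returns an array of shape (T, D) where each variable's channel is a
    correlation-weighted mixture of all variables' channels.
    """
    # Pairwise Pearson correlation between the D variable columns -> (D, D)
    corr = np.corrcoef(series, rowvar=False)
    # Normalize each variable's correlations into attention weights
    weights = softmax(corr, axis=-1)
    # Mix the variable channels according to the attention weights
    return series @ weights.T
```

Variables that correlate strongly with a given channel thus contribute more to its re-encoded values, while weakly related channels are down-weighted rather than treated equally.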
