Abstract

Representation learning for multivariate time series is an important and challenging task that supports downstream applications such as time-series search, trend analysis, and forecasting. In practice, unsupervised learning is strongly preferred because labels are sparse. Most existing studies learn representations of individual subseries independently and ignore the relationships among different subseries, which can cause downstream tasks to fail in certain situations. This study proposes an unsupervised representation learning model for multivariate time series that accounts for high-level semantics. Specifically, we introduce a covariance computed by a Gaussian process into the self-attention mechanism to reveal high-level semantic features of the subseries, and we design a novel unsupervised method to learn representations of multivariate time series. Moreover, to handle input subseries of variable length, a temporal pyramid pooling (TPP) method is applied to construct fixed-length input vectors. Experimental results show that our model has substantial advantages over other semantics-based representation learning models and performs well on various downstream tasks.
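The abstract does not specify the authors' exact TPP configuration, but the general idea of temporal pyramid pooling is standard: max-pool a variable-length series over progressively finer temporal segments and concatenate the results, yielding a vector whose length depends only on the pyramid levels and the feature dimension. The sketch below assumes max pooling and pyramid levels (1, 2, 4); both are illustrative choices, not details taken from the paper.

```python
import numpy as np

def temporal_pyramid_pooling(x, levels=(1, 2, 4)):
    """Pool a variable-length series x of shape (T, d) into a
    fixed-length vector of length d * sum(levels), independent of T.

    levels: number of equal temporal segments at each pyramid level
    (an assumed configuration, not the paper's).
    """
    T, d = x.shape
    pooled = []
    for n in levels:
        # Split the time axis into n roughly equal segments.
        bounds = np.linspace(0, T, n + 1).astype(int)
        for i in range(n):
            # Guard against empty segments when T < n.
            seg = x[bounds[i]:max(bounds[i] + 1, bounds[i + 1])]
            pooled.append(seg.max(axis=0))  # max-pool within the segment
    return np.concatenate(pooled)
```

Because the output length is fixed, subseries of different lengths (e.g. T = 13 and T = 50 with d = 3 both map to a 21-dimensional vector here) can be fed to the same downstream encoder.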
