Abstract

In real-world scenarios, multivariate time series often contain missing values, which degrade downstream time series analysis. Time series imputation techniques are therefore used to compensate for these missing values. Existing methods focus on investigating temporal correlations, cross-variable correlations, and bidirectional dynamics of time series, and most rely on recurrent neural networks (RNNs) to capture temporal dependency. However, RNN-based models suffer from slow speed and high complexity when modeling long-term dependency. While some self-attention-based models without any recurrent structures can handle long-term dependency with parallel computing, they do not fully learn and exploit correlations across the temporal and cross-variable dimensions. To address these limitations, we propose a novel dual-branch cross-dimensional self-attention-based imputation (DCSAI) model for multivariate time series, which performs global and auxiliary cross-dimensional analyses when imputing missing values. In particular, the model contains masked multi-head self-attention-based encoders aligned with auxiliary generators to obtain global and auxiliary correlations in two dimensions, and these correlations are then combined into one final representation through three weighted combinations. Extensive experiments show that our model outperforms state-of-the-art baselines on three real-world public datasets under various missing rates. Furthermore, ablation study results demonstrate the efficacy of each component of the model.
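The full architecture is specified in the paper itself; purely as an illustrative sketch of the cross-dimensional self-attention idea summarized above, the following PyTorch module applies self-attention along the temporal dimension in one branch and along the cross-variable dimension in the other, then merges the two views with a learned weight before filling in only the missing entries. All module names, shapes, and the single-weight combination here are assumptions for illustration, not the authors' implementation (which uses masked multi-head encoders with auxiliary generators and three weighted combinations).

```python
import torch
import torch.nn as nn

class CrossDimensionalSelfAttention(nn.Module):
    """Hypothetical sketch of cross-dimensional self-attention for imputation:
    one branch attends across time steps, the other across variables."""

    def __init__(self, n_steps: int, n_vars: int, d_model: int = 64, n_heads: int = 4):
        super().__init__()
        self.temporal_proj = nn.Linear(n_vars, d_model)   # embed each time step
        self.variable_proj = nn.Linear(n_steps, d_model)  # embed each variable's series
        self.temporal_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.variable_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.temporal_out = nn.Linear(d_model, n_vars)
        self.variable_out = nn.Linear(d_model, n_steps)
        # Learned scalar weight for combining the two branches (an assumption;
        # the paper combines representations through three weighted combinations).
        self.alpha = nn.Parameter(torch.tensor(0.5))

    def forward(self, x: torch.Tensor, mask: torch.Tensor) -> torch.Tensor:
        # x, mask: (batch, n_steps, n_vars); mask is 1 where observed, 0 where missing.
        x = x * mask  # zero out missing entries before attending

        # Temporal branch: tokens are time steps, attention mixes across time.
        t = self.temporal_proj(x)                      # (batch, n_steps, d_model)
        t, _ = self.temporal_attn(t, t, t)
        t = self.temporal_out(t)                       # (batch, n_steps, n_vars)

        # Cross-variable branch: tokens are variables, attention mixes across variables.
        v = self.variable_proj(x.transpose(1, 2))      # (batch, n_vars, d_model)
        v, _ = self.variable_attn(v, v, v)
        v = self.variable_out(v).transpose(1, 2)       # (batch, n_steps, n_vars)

        # Weighted combination of the two views; keep observed values as-is.
        est = self.alpha * t + (1 - self.alpha) * v
        return mask * x + (1 - mask) * est

# Usage: impute a toy batch with roughly 30% of values missing.
x = torch.randn(8, 48, 10)                  # (batch, steps, variables)
mask = (torch.rand_like(x) > 0.3).float()
model = CrossDimensionalSelfAttention(n_steps=48, n_vars=10)
imputed = model(x, mask)
print(imputed.shape)                        # torch.Size([8, 48, 10])
```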
