Recovering missing values plays a significant role in practical time series applications. How to replace missing data and build dependency relations from an incomplete sample set remains a challenge. Previous research has found that the residual network (ResNet) helps to form a deep network and to cope with the degradation problem through shortcut connections. The gated recurrent unit (GRU) simplifies the network model and reduces training parameters via an update gate that takes the place of the forget gate and output gate in long short-term memory (LSTM). Inspired by these findings, we observe that a shortcut connection together with the mean of the globally revealed information can model the relationship among missing items, the previously revealed information, and the overall revealed information. Hence, we design an imputation network that applies a decay factor to the shortcut connection and to the mean of the globally revealed information in a GRU, called the decay residual mean imputation GRU (DRMI-GRU). We introduce a decay residual mean unit (DRMU), which takes full advantage of the previous and global revealed information to model incomplete time series; the decay factor balances the previous long-term dependencies against all non-missing values in the sample set. In addition, a mask unit is designed to indicate whether each data point is missing. Extensive empirical comparisons with existing imputation algorithms on real-world data and public datasets with different ratios of missing data verify the performance of our model.
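The abstract does not give the DRMU update equations, but the core idea of combining a mask unit, the previous revealed value (shortcut connection), and the mean of all revealed values can be illustrated with a minimal sketch. Everything here is an assumption for illustration: `gamma` stands in for the learned decay factor as a fixed scalar, and `drmu_impute_step`/`impute_series` are hypothetical names, not the authors' code.

```python
def drmu_impute_step(x_t, m_t, x_prev, global_mean, gamma=0.5):
    """One illustrative imputation step in the spirit of the DRMU.

    m_t == 1 means x_t is observed; m_t == 0 means it is missing.
    A missing value is replaced by a gamma-weighted blend of the
    previous value (shortcut connection) and the mean of all
    revealed (non-missing) values in the series.
    """
    if m_t == 1:          # mask unit: value is present, keep it
        return x_t
    # decay factor gamma balances previous dependency vs. global mean
    return gamma * x_prev + (1.0 - gamma) * global_mean


def impute_series(xs, mask, gamma=0.5):
    """Left-to-right imputation of a 1-D series (illustrative sketch)."""
    revealed = [x for x, m in zip(xs, mask) if m == 1]
    global_mean = sum(revealed) / len(revealed) if revealed else 0.0
    out, prev = [], global_mean  # fall back to the mean before the first observation
    for x_t, m_t in zip(xs, mask):
        filled = drmu_impute_step(x_t, m_t, prev, global_mean, gamma)
        out.append(filled)
        prev = filled
    return out
```

For example, `impute_series([1, 0, 3, 0, 5], [1, 0, 1, 0, 1])` has a revealed mean of 3, so the two missing entries are filled with `0.5*1 + 0.5*3 = 2.0` and `0.5*3 + 0.5*3 = 3.0`. In the actual model the decay factor is learned inside the GRU rather than fixed.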