Self-supervised learning (SSL) has been widely researched in recent years. In particular, generative self-supervised learning methods have achieved remarkable success in many AI domains, such as MAE in computer vision, the well-known BERT and GPT in natural language processing, and GraphMAE in graph learning. However, in the context of time series analysis, not only is work along this line limited, but its performance has also not reached the potential demonstrated in other fields. To fill this gap, we propose a simple and elegant masked autoencoder for time series representation learning. Firstly, unlike most existing work, which uses the Transformer as the backbone, we build our model on neural ordinary differential equations, which possess excellent mathematical properties. Compared with the positional encoding in the Transformer, modeling evolution patterns continuously can better capture temporal dependencies. Secondly, a timestamp-wise masking strategy is designed to work with the autoencoder to avoid bias; it also reduces cross-imputation between variables, yielding more robust representations. Lastly, extensive experiments on two classical tasks demonstrate the superiority of our model over state-of-the-art methods.
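To make the idea concrete, the sketch below illustrates what a timestamp-wise masked autoencoder with an ODE-style encoder might look like. It is not the authors' implementation: the class names, latent dimension, masking ratio, and the fixed-step Euler integration of the latent dynamics are all illustrative assumptions standing in for the unspecified details of the paper.

```python
# Hypothetical sketch (not the authors' code): a timestamp-wise masked
# autoencoder whose encoder evolves a latent state with a simple
# fixed-step Euler ODE solver instead of positional encodings.
import torch
import torch.nn as nn


class ODEFunc(nn.Module):
    """dz/dt = f(z); a small MLP parameterizes the latent dynamics."""
    def __init__(self, latent_dim):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(latent_dim, latent_dim), nn.Tanh(),
                                 nn.Linear(latent_dim, latent_dim))

    def forward(self, z):
        return self.net(z)


class MaskedODEAutoencoder(nn.Module):
    def __init__(self, n_vars, latent_dim=32, steps_per_gap=4):
        super().__init__()
        self.embed = nn.Linear(n_vars, latent_dim)   # lift observations into latent space
        self.odefunc = ODEFunc(latent_dim)
        self.decode = nn.Linear(latent_dim, n_vars)  # reconstruct all variables
        self.steps_per_gap = steps_per_gap

    def forward(self, x, mask):
        # x: (batch, time, n_vars); mask: (batch, time), True where a timestamp is hidden
        z = torch.zeros(x.size(0), self.embed.out_features, device=x.device)
        recon = []
        dt = 1.0 / self.steps_per_gap
        for t in range(x.size(1)):
            # evolve the latent state continuously between timestamps (Euler steps)
            for _ in range(self.steps_per_gap):
                z = z + dt * self.odefunc(z)
            recon.append(self.decode(z))
            # update the state only from visible timestamps; masked ones stay hidden,
            # so every variable at a masked step must be predicted from the dynamics
            visible = (~mask[:, t]).float().unsqueeze(-1)
            z = z + visible * self.embed(x[:, t])
        return torch.stack(recon, dim=1)


# Toy usage: mask ~40% of timestamps and train on reconstruction of the masked steps only.
x = torch.randn(8, 50, 5)              # 8 series, 50 timestamps, 5 variables
mask = torch.rand(8, 50) < 0.4         # timestamp-wise mask shared across all variables
model = MaskedODEAutoencoder(n_vars=5)
recon = model(x, mask)
loss = ((recon - x) ** 2)[mask].mean() # loss computed on masked timestamps
loss.backward()
```

Because the mask hides entire timestamps rather than individual variable entries, no variable at a masked step can be trivially imputed from its co-observed neighbors, which is one plausible reading of the reduced cross-imputation the abstract mentions.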