Abstract

Time series classification and retrieval are two important tasks in time series analysis. Existing methods typically solve these two tasks separately, ignoring the information that could be shared between them. In this paper, we propose a deep multi-task representation learning method (MTRL) for time series classification and retrieval, which exploits both related supervised and unsupervised information. Specifically, supervised representation learning for the classification task tries to maximize inter-class variation and minimize intra-class variation. Unsupervised representation learning for the retrieval task aims at preserving pairwise dynamic time warping (DTW) distances. These two tasks can benefit from each other via shared networks, which consist of deep wavelet decomposition networks and residual networks. These networks can extract information hidden in different time and frequency domains, and can achieve easier information flow from the lowest level to the highest level than traditional convolutional neural networks. Furthermore, we propose a distance-weighted sampling strategy, which focuses on the more discriminative samples to achieve faster convergence and higher accuracy. Extensive experiments on UCR datasets demonstrate that MTRL outperforms the state-of-the-art methods.
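The retrieval objective above relies on pairwise DTW distances between series. As background, here is a minimal NumPy sketch of the classic DTW dynamic program (the standard textbook recurrence, not the authors' implementation):

```python
import numpy as np

def dtw_distance(x, y):
    """Dynamic time warping distance between two 1-D series,
    computed with the standard O(len(x) * len(y)) dynamic program."""
    n, m = len(x), len(y)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(x[i - 1] - y[j - 1])  # local (absolute) distance
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return cost[n, m]

# DTW aligns series non-linearly in time, so a shifted copy
# can still have zero distance:
print(dtw_distance([0, 0, 1, 2], [0, 1, 2]))  # → 0.0
```

Because warping absorbs temporal shifts and local stretching, an embedding trained to preserve these distances retrieves similarly shaped series even when they are misaligned in time.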
