Abstract

Semi-supervised learning is a powerful machine learning paradigm that enables model training when only part of the data is labeled. Unlike discrete data, time series data generally exhibit temporal relations, which can serve as a supervisory signal for learning from unlabeled time series in semi-supervised learning. However, existing semi-supervised time series classification (TSC) methods tend to ignore or under-explore this temporal relation structure and thus fail to fully exploit the unlabeled time series data. We therefore propose a Semi-supervised Time Series Classification Model with Self-supervised Learning (SSTSC), which treats self-supervised learning as an auxiliary task and jointly optimizes it with the main TSC task. Specifically, SSTSC performs the TSC task on the labeled time series data; for the unlabeled data, it splits each series into “past-anchor-future” segments and constructs positive/negative temporal relation samples from different segment combinations, so that accurately predicting these temporal relations captures higher-quality semantic context and provides a supervisory signal for the TSC task. Experimental results demonstrate that SSTSC outperforms the baselines across multiple evaluation perspectives.
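To make the “past-anchor-future” idea concrete, the sketch below illustrates one plausible way to build positive/negative temporal relation samples from an unlabeled series, as described in the abstract. It is not the authors' implementation: the function name, segment-length parameter, and the particular choice of breaking order by substituting a randomly located segment are all illustrative assumptions.

```python
import numpy as np

def sample_temporal_relation_pairs(series, seg_len, rng=None):
    """Illustrative sketch (not the authors' code): build one positive and one
    negative temporal relation sample from an unlabeled 1-D time series.

    A positive sample keeps the natural past -> anchor -> future order of three
    adjacent segments; a negative sample breaks that order by swapping in a
    segment drawn from elsewhere in the series (an assumed negative-sampling
    strategy, used here only for illustration).
    """
    rng = rng or np.random.default_rng()
    T = len(series)
    # choose an anchor position with room for a past and a future segment
    start = rng.integers(seg_len, T - 2 * seg_len)
    past = series[start - seg_len:start]
    anchor = series[start:start + seg_len]
    future = series[start + seg_len:start + 2 * seg_len]

    # positive: segments in their true temporal order, labeled 1
    positive = (np.concatenate([past, anchor, future]), 1)

    # negative: replace the future segment with a randomly located one,
    # so the temporal ordering relation no longer holds, labeled 0
    rand_start = rng.integers(0, T - seg_len)
    random_seg = series[rand_start:rand_start + seg_len]
    negative = (np.concatenate([past, anchor, random_seg]), 0)

    return positive, negative

# Example usage on a synthetic series
series = np.sin(np.linspace(0, 20, 300))
pos, neg = sample_temporal_relation_pairs(series, seg_len=30)
```

A binary classifier trained on such (sample, label) pairs would then supply the self-supervised temporal-relation loss that SSTSC jointly optimizes with the main TSC loss on the labeled data.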
