Abstract

Time-series classification approaches based on deep neural networks easily overfit UCR datasets, because many of those datasets contain only a few training samples per class (a few-shot problem). To alleviate this overfitting and further improve accuracy, we first propose label smoothing for InceptionTime (LSTime), which uses soft labels in addition to hard labels. Next, instead of manually tuning the soft labels as in LSTime, knowledge distillation for InceptionTime (KDTime) is proposed to generate soft labels automatically via a teacher model while compressing the inference model. Finally, to rectify soft labels that the teacher model predicts incorrectly, knowledge distillation with calibration for InceptionTime (KDCTime) is proposed, which contains two optional calibration strategies: KDC by translating (KDCT) and KDC by reordering (KDCR). Experimental results show that KDCTime achieves promising accuracy, while its inference time is orders of magnitude faster than that of state-of-the-art approaches.
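The two kinds of soft labels mentioned above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the smoothing constant `eps`, the temperature `T`, and the function names are assumptions for illustration only.

```python
import numpy as np

def smooth_labels(hard_labels, num_classes, eps=0.1):
    # Label smoothing (as in LSTime): replace a one-hot target with a
    # soft target putting 1 - eps on the true class and spreading eps
    # uniformly over the remaining classes. eps is hand-tuned.
    n = len(hard_labels)
    soft = np.full((n, num_classes), eps / (num_classes - 1))
    soft[np.arange(n), hard_labels] = 1.0 - eps
    return soft

def softmax(logits, T=1.0):
    # Numerically stable softmax with temperature T.
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kd_soft_targets(teacher_logits, T=2.0):
    # Knowledge distillation (as in KDTime): the teacher's
    # temperature-softened output is used as the soft label,
    # so no manual tuning of the label distribution is needed.
    return softmax(teacher_logits, T=T)

# Usage: three classes, two samples with true classes 0 and 2.
soft = smooth_labels(np.array([0, 2]), num_classes=3, eps=0.1)
kd = kd_soft_targets(np.array([[2.0, 0.5, 0.1], [0.2, 0.1, 3.0]]))
```

Each row of both `soft` and `kd` sums to 1, so either can replace the hard one-hot target in a cross-entropy loss.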
