Abstract

We propose to embed time series in a latent space where pairwise Euclidean distances (EDs) between samples are equal to pairwise dissimilarities in the original space, for a given dissimilarity measure. To this end, we use auto-encoder (AE) and encoder-only neural networks to learn elastic dissimilarity measures, e.g., dynamic time warping (DTW), that are central to time series classification (Bagnall et al., 2017). The learned representations are used in the context of one-class classification (Mauceri et al., 2020) on the datasets of the UCR/UEA archive (Dau et al., 2019). Using a 1-nearest neighbor (1NN) classifier, we show that the learned representations yield classification performance close to that of the raw data, but in a space of substantially lower dimensionality. This implies substantial savings in computational and storage requirements for nearest neighbor time series classification.
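
The core idea admits a short sketch. The code below is a minimal, illustrative encoder-only variant, not the authors' implementation: it assumes a precomputed pairwise DTW matrix, and the names (Encoder, distance_matching_loss, latent_dim) and the PyTorch architecture choices are assumptions made for illustration. The encoder maps each series to a low-dimensional vector, and the training loss penalizes the squared difference between latent Euclidean distances and the corresponding DTW dissimilarities within a batch.

```python
# Hedged sketch (not the authors' code): train an encoder so that pairwise
# Euclidean distances in the latent space approximate precomputed DTW
# dissimilarities. Encoder, distance_matching_loss and latent_dim are
# illustrative names, not from the paper.
import torch
import torch.nn as nn


class Encoder(nn.Module):
    def __init__(self, series_len: int, latent_dim: int = 16):
        super().__init__()
        # Simple fully connected encoder; the paper's exact architecture may differ.
        self.net = nn.Sequential(
            nn.Linear(series_len, 128),
            nn.ReLU(),
            nn.Linear(128, latent_dim),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, series_len) -> z: (batch, latent_dim)
        return self.net(x)


def distance_matching_loss(z: torch.Tensor, dtw_batch: torch.Tensor) -> torch.Tensor:
    """Mean squared error between latent Euclidean distances and DTW dissimilarities."""
    ed = torch.cdist(z, z, p=2)          # (batch, batch) pairwise Euclidean distances
    return ((ed - dtw_batch) ** 2).mean()


# Hypothetical training step, assuming X is a batch of raw series and D is the
# corresponding block of a precomputed pairwise DTW matrix:
#
#   encoder = Encoder(series_len=X.shape[1], latent_dim=16)
#   optimizer = torch.optim.Adam(encoder.parameters(), lr=1e-3)
#   loss = distance_matching_loss(encoder(X), D)
#   optimizer.zero_grad(); loss.backward(); optimizer.step()
```

After training, a 1NN classifier can operate on the low-dimensional latent vectors using plain Euclidean distance, which is where the computational and storage savings claimed in the abstract would come from.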
