Abstract

Time series classification is considered one of the most challenging problems in data mining and is widely applied across a broad range of fields, such as climate, finance, medicine and computer science. The main challenges of time series classification are selecting an appropriate representation (feature extraction) of the time series and choosing a similarity metric between time series. In contrast to traditional feature extraction methods, in this paper we focus on fusing global features, local features and the interactions between them, while preserving the temporal information of the local features. Based on this strategy, a highly comparative approach to univariate time series classification is introduced that uses covariance matrices as interpretable features. From the perspective of probability theory, each covariance matrix can be seen as a zero-mean Gaussian distribution. Our idea is to incorporate the covariance matrix into the framework of information geometry, which studies the geometric structures on manifolds of probability distributions. The space of covariance matrices is a statistical (Riemannian) manifold, and the geodesic distance is introduced to measure the similarity between them. Our method projects each distribution (covariance matrix) to a vector on the tangent space of the statistical manifold. Finally, classification is carried out in the tangent space, which is a Euclidean space. Concepts of a structural and a functional network are also presented, which provide an understanding of the properties of the dataset and guide further interpretation of the classifier. Experimental evaluation shows that the performance of the proposed approach exceeds that of several competitive methods on benchmark datasets from the UCR time series repository.
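As an illustration of the pipeline described above, the following is a minimal sketch (not the authors' implementation) of extracting a covariance matrix from a univariate series, mapping it to the tangent space of the SPD manifold, and classifying in that Euclidean space. The sliding-window feature construction, the use of the arithmetic mean as the reference point (the paper's method may use a Riemannian mean), and the plain upper-triangle vectorization are simplifying assumptions for illustration only.

```python
# Illustrative sketch only: covariance features, tangent-space projection,
# and a linear classifier. Requires NumPy, SciPy and scikit-learn.
import numpy as np
from scipy.linalg import logm, sqrtm, inv
from sklearn.linear_model import LogisticRegression

def covariance_feature(series, window=16):
    """Stack sliding windows of a univariate series and return their covariance."""
    windows = np.lib.stride_tricks.sliding_window_view(series, window)
    cov = np.cov(windows, rowvar=False)
    return cov + 1e-6 * np.eye(window)        # regularize so the matrix stays SPD

def log_map(cov, ref_inv_sqrt):
    """Affine-invariant log map of an SPD matrix at a chosen reference point."""
    s = logm(ref_inv_sqrt @ cov @ ref_inv_sqrt)
    iu = np.triu_indices_from(s)              # vectorize the symmetric log-matrix
    return np.real(s[iu])                     # (a simplification; an isometric
                                              #  embedding would reweight off-diagonals)

def fit_tangent_classifier(series_list, labels, window=16):
    covs = [covariance_feature(s, window) for s in series_list]
    ref = np.mean(covs, axis=0)               # assumed reference point: arithmetic mean
    ref_inv_sqrt = inv(sqrtm(ref))
    X = np.array([log_map(c, ref_inv_sqrt) for c in covs])
    clf = LogisticRegression(max_iter=1000).fit(X, labels)
    return clf, ref_inv_sqrt
```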
