Abstract

In recent years, time-series data have emerged in a variety of application domains, such as wireless sensor networks and surveillance systems. To identify the similarity between time series, the Euclidean distance and its variations are commonly used to quantify the differences between them. However, the Euclidean distance is limited by its inability to shift elastically along the time axis, which motivated the development of dynamic time warping (DTW) algorithms. While DTW algorithms have proven very useful in diverse applications such as speech recognition, their efficacy can be seriously affected by the resolution of the time-series data. At the same time, high-resolution time-series data may occupy a tremendous amount of main memory and storage space and slow down the DTW analysis procedure. This makes scaling up DTW analysis challenging, especially for in-memory data analytics platforms with limited non-volatile memory (NVM) space. In this work, we propose a strategy to downsample time-series data so as to significantly reduce their size without seriously affecting the precision of the results obtained by DTW algorithms. In other words, this work proposes a technique to remove unimportant details that are largely ignored by DTW algorithms anyway. The efficacy of the proposed technique is verified by a series of experimental studies, where the results are quite encouraging.
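To make the setting concrete, the sketch below pairs the textbook dynamic-programming DTW distance with a simple mean-pooling downsampler (a piecewise aggregate approximation). This is only an illustration of the general idea, not the paper's actual downsampling scheme; the function names, the pooling factor, and the synthetic signals are all assumptions introduced here.

```python
import numpy as np

def dtw_distance(a, b):
    """Classic O(len(a) * len(b)) dynamic-programming DTW distance."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            # Extend the cheapest of the three admissible warping steps.
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return cost[n, m]

def downsample(series, factor):
    """Mean-pool consecutive windows of `factor` samples.

    A naive piecewise aggregate approximation, used here only as a
    stand-in for the paper's (unspecified) downsampling strategy.
    """
    trimmed = series[: len(series) // factor * factor]
    return trimmed.reshape(-1, factor).mean(axis=1)

# Compare DTW on full-resolution vs. downsampled synthetic series.
rng = np.random.default_rng(0)
t = np.linspace(0, 4 * np.pi, 400)
x = np.sin(t) + 0.05 * rng.standard_normal(t.size)
y = np.sin(t + 0.5) + 0.05 * rng.standard_normal(t.size)

print("full resolution:", dtw_distance(x, y))
print("downsampled 4x: ", dtw_distance(downsample(x, 4), downsample(y, 4)))
```

Downsampling by a factor of k shrinks both inputs k-fold and, since DTW's dynamic program is quadratic in series length, cuts its time and memory cost by roughly k squared, which is the motivation for trading resolution for speed on memory-constrained platforms.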
