The behavior of a time series can be affected by many factors; changes in mean, variance, frequency, and autocorrelation are the most common. Change-Point Detection (CPD) aims to detect abrupt changes in the statistical characteristics of a time series, which benefits applications across many domains. As recently introduced CPD methodologies demonstrate, deep learning approaches have the potential to identify more subtle changes. However, due to improper handling of data and insufficient training, these methodologies generate more false alarms and are not efficient enough at detecting change-points. In real-time CPD algorithms, preprocessing plays a vital role in increasing efficiency and minimizing false-alarm rates. Preprocessing should therefore be part of the algorithm itself, yet in existing methods the data is preprocessed once up front and the whole dataset is then passed to the CPD algorithm. To address this issue, a new three-phase architecture is proposed in which all phases, from preprocessing to CPD, operate adaptively. The phases are integrated into a pipeline, allowing the algorithm to work in real time. Experiments on real-world and artificially generated datasets show that the proposed strategy performs consistently well on the reported performance metrics. This work also addresses the normalization of non-stationary data for deep learning approaches, and a recursive version of singular spectrum analysis is introduced to reduce noise and outliers in the data. Combining adaptive preprocessing with deep-learning-based CPD techniques is shown to significantly improve detection performance.
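The abstract does not give implementation details, but for orientation, a minimal sketch of the two preprocessing ideas it names, adaptive normalization for non-stationary data and SSA-based denoising, might look like the following. The forgetting factor `alpha`, window sizes, class and function names, and the stubbed `cpd_model` hook are all illustrative assumptions, and the SSA shown is the standard batch formulation rather than the recursive variant introduced in the paper.

```python
import numpy as np

class AdaptiveNormalizer:
    """Streaming z-score normalization with exponential forgetting.

    Illustrative sketch: tracks a running mean/variance so normalization
    adapts to non-stationary data, instead of using global statistics
    computed over the whole dataset up front. `alpha` is an assumed
    forgetting factor, not a value from the paper.
    """

    def __init__(self, alpha=0.01, eps=1e-8):
        self.alpha, self.eps = alpha, eps
        self.mean, self.var = 0.0, 1.0

    def update(self, x):
        self.mean = (1 - self.alpha) * self.mean + self.alpha * x
        self.var = (1 - self.alpha) * self.var + self.alpha * (x - self.mean) ** 2
        return (x - self.mean) / np.sqrt(self.var + self.eps)


def ssa_denoise(window, L=20, rank=2):
    """Textbook (batch) singular spectrum analysis over one window.

    The paper introduces a *recursive* SSA variant; this is only the
    standard version: embed into a trajectory matrix, truncate the SVD,
    and reconstruct by diagonal (Hankel) averaging.
    """
    x = np.asarray(window, dtype=float)
    N = len(x)
    K = N - L + 1
    X = np.column_stack([x[i:i + L] for i in range(K)])   # trajectory matrix
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    Xk = (U[:, :rank] * s[:rank]) @ Vt[:rank]             # rank-k approximation
    rec, counts = np.zeros(N), np.zeros(N)
    for j in range(K):                                    # Hankel averaging
        rec[j:j + L] += Xk[:, j]
        counts[j:j + L] += 1
    return rec / counts


# Toy pipeline: normalize each sample as it arrives, denoise a sliding
# window, and hand the cleaned window to a CPD model (stubbed here).
rng = np.random.default_rng(0)
signal = np.concatenate([rng.normal(0, 1, 300), rng.normal(3, 1, 300)])
norm = AdaptiveNormalizer()
buffer = []
for x in signal:
    buffer.append(norm.update(x))
    if len(buffer) >= 100:
        clean = ssa_denoise(buffer[-100:])
        # cpd_model.score(clean)  # a deep CPD detector would consume this
        buffer = buffer[-100:]
```

In the architecture the abstract describes, steps like these would run as phases of a single pipeline, so the normalization statistics and the decomposition adapt online rather than being computed over the full dataset in advance.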