Abstract

Multivariate time series anomaly detection has made significant progress and has been studied in many fields. One of the main difficulties in analyzing time-series data is the complex nonlinear dependencies across multiple time steps and multiple variables, which makes detecting anomalies in these data challenging. Although many studies have used classical attention mechanisms to model the temporal patterns of the data, few have combined multiple attention mechanisms to analyze both the temporal characteristics and the feature correlations of the data. We therefore propose an autocorrelation and attention mechanism-based anomaly detection (ACAM-AD) framework that models the complex dependencies of the data in both the temporal and feature dimensions by combining an autocorrelation mechanism derived from the Autoformer model, which outperforms the self-attention mechanism, with a multi-head graph attention network and a dot-product attention mechanism. An autoregressive model runs in parallel with the neural network, and a sparse autocorrelation mechanism and a sparse graph attention network are used to reduce model complexity. Experiments on public datasets show that the model is effective and performs better than the baseline models.
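To make the described combination of branches concrete, the sketch below is a minimal, illustrative PyTorch rendering of the idea, not the authors' implementation: it fuses an Autoformer-style autocorrelation aggregation, attention over features (variables treated as nodes of a fully connected graph, standing in for the multi-head graph attention network), dot-product attention over time steps, and a parallel linear autoregressive branch. All names (`autocorrelation_aggregate`, `ACAMSketch`), hyperparameters, and the next-step-forecasting setup are assumptions; the sparse variants mentioned in the abstract are omitted.

```python
# Illustrative sketch only (not the authors' code); assumes PyTorch >= 1.9.
import torch
import torch.nn as nn


def autocorrelation_aggregate(x, top_k=3):
    """Autoformer-style autocorrelation: score lags via the power spectrum
    (Wiener-Khinchin), keep the top-k lags, and aggregate time-delayed
    copies of x weighted by their correlation. x: (batch, time, channels)."""
    b, t, d = x.shape
    freq = torch.fft.rfft(x, dim=1)
    corr = torch.fft.irfft(freq * torch.conj(freq), n=t, dim=1)  # (b, t, d)
    lag_scores = corr.mean(dim=(0, 2))                           # (t,)
    # Exclude lag 0 (trivially maximal here) and keep the best remaining lags.
    weights, lags = torch.topk(lag_scores[1:], top_k)
    lags = lags + 1
    weights = torch.softmax(weights, dim=0)
    out = torch.zeros_like(x)
    for w, lag in zip(weights, lags):
        out = out + w * torch.roll(x, shifts=-int(lag), dims=1)
    return out


class ACAMSketch(nn.Module):
    """Toy fusion of the three attention branches plus a linear AR head."""
    def __init__(self, n_features, window, d_model=64, heads=4):
        super().__init__()
        self.embed = nn.Linear(n_features, d_model)
        # Dot-product attention over time steps.
        self.temporal_attn = nn.MultiheadAttention(d_model, heads, batch_first=True)
        # Attention over features: each variable is a node whose "features"
        # are its values in the window; attention weights act as learned edges.
        self.feature_attn = nn.MultiheadAttention(window, heads, batch_first=True)
        self.feature_proj = nn.Linear(n_features, d_model)
        self.decoder = nn.Linear(3 * d_model, n_features)
        # Parallel linear autoregressive branch on the raw window.
        self.ar = nn.Linear(window, 1)

    def forward(self, x):                        # x: (batch, window, n_features)
        h = self.embed(x)                        # (b, w, d_model)
        t_out, _ = self.temporal_attn(h, h, h)   # temporal dependencies
        a_out = autocorrelation_aggregate(h)     # period-based dependencies
        nodes = x.transpose(1, 2)                # (b, n_features, window)
        f_out, _ = self.feature_attn(nodes, nodes, nodes)
        f_out = self.feature_proj(f_out.transpose(1, 2))   # back to (b, w, d_model)
        fused = torch.cat([t_out, a_out, f_out], dim=-1)   # (b, w, 3 * d_model)
        nn_pred = self.decoder(fused[:, -1])                # next-step forecast
        ar_pred = self.ar(x.transpose(1, 2)).squeeze(-1)    # linear AR forecast
        return nn_pred + ar_pred                            # (b, n_features)


# Usage: score a window by how far the forecast deviates from the observation.
model = ACAMSketch(n_features=25, window=100)
windows = torch.randn(8, 100, 25)          # batch of sliding windows
forecast = model(windows)                  # (8, 25) next-step predictions
# anomaly_score = (forecast - observed_next_values).abs().sum(dim=-1)
```

The parallel `nn.Linear` branch mirrors the abstract's point that an autoregressive model is run alongside the neural network, so the purely linear part of the signal does not have to be captured by the attention branches.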
