Abstract
Unsupervised anomaly detection in multivariate time series is a formidable challenge. Existing strategies focus on capturing intrinsic patterns along the temporal dimension and producing a better representation of the input series. An anomaly detection model then typically derives a criterion for distinguishing anomalies, most often the distance between the learned representations and the original data. Anomaly Transformer is a recent model that achieves substantial advancements by introducing a novel criterion, the association discrepancy. This criterion leverages the observation that normal time points attend mainly to their adjacent neighbors, whereas anomalous points lose this locality. However, Anomaly Transformer considers only the associations between time points and neglects the associations between sensors. In addition, the association discrepancy, computed between the inter-point attention scores and a Gaussian prior, can sometimes be insufficient. To achieve better performance, we propose the spatial information enhanced transformer (SiET), a novel paradigm that integrates graph neural networks into the original Anomaly Transformer framework, yielding a high-dimensional association discrepancy. Theoretical proofs are provided to verify the reliability of our model. Furthermore, SiET achieves results comparable to the state of the art on five unsupervised time-series anomaly detection benchmarks.
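The sketch below illustrates the association-discrepancy idea referenced above, assuming the usual formulation in which a prior association is built from a Gaussian kernel over relative positions, the series association is taken from self-attention weights, and the per-point discrepancy is their symmetrized KL divergence. All names (`prior_association`, `association_discrepancy`, `sigma`) and tensor shapes are illustrative and not drawn from the paper itself.

```python
import math
import torch
import torch.nn.functional as F

def prior_association(sigma: torch.Tensor, seq_len: int) -> torch.Tensor:
    """Gaussian prior over relative distances |i - j|, one row per time point.

    sigma: (batch, seq_len) positive scales, one per time point.
    returns: (batch, seq_len, seq_len) row-stochastic prior association.
    """
    idx = torch.arange(seq_len, device=sigma.device, dtype=sigma.dtype)
    dist = (idx.unsqueeze(0) - idx.unsqueeze(1)).abs()            # (L, L)
    sigma = sigma.unsqueeze(-1)                                   # (B, L, 1)
    kernel = torch.exp(-0.5 * (dist / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))
    return kernel / kernel.sum(dim=-1, keepdim=True)              # normalize rows

def association_discrepancy(series_assoc: torch.Tensor,
                            prior_assoc: torch.Tensor) -> torch.Tensor:
    """Symmetrized KL divergence between series and prior associations.

    series_assoc, prior_assoc: (batch, seq_len, seq_len), row-stochastic.
    returns: (batch, seq_len) discrepancy per time point; small values mean the
    attention already concentrates on adjacent neighbors (normal behavior).
    """
    eps = 1e-8
    kl_sp = (series_assoc * (series_assoc.add(eps).log() - prior_assoc.add(eps).log())).sum(-1)
    kl_ps = (prior_assoc * (prior_assoc.add(eps).log() - series_assoc.add(eps).log())).sum(-1)
    return kl_sp + kl_ps

# Toy usage: attention weights from any transformer layer would play the role
# of `series_assoc`; here both inputs are random row-stochastic matrices.
if __name__ == "__main__":
    B, L = 2, 16
    series = F.softmax(torch.randn(B, L, L), dim=-1)
    prior = prior_association(torch.rand(B, L) + 0.5, L)
    print(association_discrepancy(series, prior).shape)  # torch.Size([2, 16])
```

SiET's contribution, as stated in the abstract, is to extend this point-wise criterion with sensor-level (spatial) associations learned by a graph neural network; the sketch covers only the temporal part.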