Abstract

Anomaly detection over log sequences is an essential task for intelligent system operation and fault diagnosis. In a log sequence, adjacent logs exhibit local correlations, while distant logs carry long-range dependencies; fully mining both kinds of information during modeling helps improve detection performance. At the same time, log sequences contain redundant information and noise that contribute nothing to detection and may even degrade it. Existing log sequence anomaly detection methods do not account for these issues when constructing their models. In this paper, we propose LSADNET, an unsupervised log sequence anomaly detection network based on local information extraction and a globally sparse Transformer. LSADNET applies multi-layer convolution to capture the local correlations between adjacent logs and uses a Transformer to learn the global dependencies among distant logs. In addition, we propose a globally sparse Transformer that improves the self-attention mechanism so that it adaptively retains important information and discards irrelevant information in the log sequence. Furthermore, based on the co-occurrence patterns of log templates, we derive a transfer value between log templates and apply it to log vectorization. Extensive experiments on two public datasets confirm that LSADNET outperforms state-of-the-art methods.
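For concreteness, the sketch below illustrates the two mechanisms the abstract names: multi-layer convolution over adjacent log embeddings for local correlation, and a self-attention whose score matrix is sparsified so that only the strongest interactions survive. The abstract does not specify the actual formulation, so the top-k masking used here, the module names (SparseSelfAttention, LSADNetBlock), and all hyperparameters are illustrative assumptions, not LSADNET's definitive implementation.

```python
# Minimal sketch (not the authors' code): multi-layer convolution for local
# correlation plus a globally sparse self-attention, assuming top-k score
# masking as the sparsification rule. All names and sizes are hypothetical.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseSelfAttention(nn.Module):
    """Self-attention that keeps only the k largest scores per query row."""
    def __init__(self, d_model: int, k_keep: int):
        super().__init__()
        self.qkv = nn.Linear(d_model, 3 * d_model)
        self.out = nn.Linear(d_model, d_model)
        self.k_keep = k_keep
        self.scale = d_model ** -0.5

    def forward(self, x):                        # x: (batch, seq_len, d_model)
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        scores = q @ k.transpose(-2, -1) * self.scale    # (batch, seq, seq)
        # Globally sparse step: mask everything below each row's top-k, so
        # attention adaptively drops irrelevant (noisy) log positions.
        topk = scores.topk(min(self.k_keep, scores.size(-1)), dim=-1).values
        threshold = topk[..., -1:]               # smallest retained score
        scores = scores.masked_fill(scores < threshold, float("-inf"))
        return self.out(F.softmax(scores, dim=-1) @ v)

class LSADNetBlock(nn.Module):
    """Conv layers for local correlation + sparse attention for global deps."""
    def __init__(self, d_model: int = 64, k_keep: int = 8):
        super().__init__()
        self.local = nn.Sequential(              # multi-layer 1-D convolution
            nn.Conv1d(d_model, d_model, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv1d(d_model, d_model, kernel_size=3, padding=1),
            nn.ReLU(),
        )
        self.global_attn = SparseSelfAttention(d_model, k_keep)

    def forward(self, x):                        # x: (batch, seq_len, d_model)
        local = self.local(x.transpose(1, 2)).transpose(1, 2)
        return self.global_attn(local)

# Usage: 16 sequences of 50 log-template vectors, 64-dimensional each.
out = LSADNetBlock()(torch.randn(16, 50, 64))
print(out.shape)                                 # torch.Size([16, 50, 64])
```

Top-k thresholding is one established way to make self-attention "retain important information adaptively" (as in the Explicit Sparse Transformer); whether LSADNET uses this exact rule is an assumption here.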
