Abstract

Anomaly detection in time-series data is a significant research problem with applications across multiple domains. Unsupervised anomaly detection in particular is fundamental to developing intelligent automated systems. Existing work in this field has primarily focused on dimensionality-reduction or regression-based approaches that annotate data against a static threshold. Researchers in fields such as Natural Language Processing (NLP) and Computer Vision (CV) have achieved considerable improvements by incorporating attention into prediction tasks. In this work, we propose an attention-based bi-directional long short-term memory (Attention-Bi-LSTM) network for anomaly detection on time-series data; the attention mechanism assigns optimal weights to instances in sequential data. We evaluate the proposed approach on the entirety of the popularly used Numenta Anomaly Benchmark (NAB). Additionally, we contribute new baselines on the NAB with recent models such as REBM, DAGMM, LSTM-ED, and Donut, which have not previously been applied to it.
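The weighting idea mentioned above can be illustrated in isolation. The following is a minimal sketch, not the paper's implementation: it assumes the per-timestep hidden states of a Bi-LSTM encoder are already available as a matrix, and shows how a learned attention vector (here `w`, a hypothetical parameter) produces one softmax-normalized weight per instance and a weighted summary of the sequence.

```python
import numpy as np

def attention_pool(hidden_states, w):
    """Softmax attention over timesteps.

    hidden_states: (T, d) array, assumed to be the outputs of a
                   Bi-LSTM encoder (not implemented here).
    w:             (d,) learned attention parameter (hypothetical).
    Returns the attended context vector (d,) and the per-timestep
    weights (T,), which sum to 1.
    """
    scores = hidden_states @ w            # unnormalized relevance per timestep
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()              # softmax: one weight per instance
    context = weights @ hidden_states     # weighted sum over timesteps
    return context, weights

# Toy usage with random states standing in for encoder outputs.
rng = np.random.default_rng(0)
H = rng.normal(size=(5, 4))               # 5 timesteps, hidden size 4
w = rng.normal(size=4)
context, alpha = attention_pool(H, w)
```

In a full model the weighted context would feed a downstream scoring layer, and an anomaly would be flagged when the resulting score deviates from normal behavior.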
