Abstract
Traditional anomaly detection methods for time series data often struggle with inherent uncertainties such as noise and missing values. Current approaches mostly focus on quantifying epistemic uncertainty and ignore data-dependent uncertainty. Accounting for noise in the data is important, however, because it can lead to more robust anomaly detection and a better ability to distinguish real anomalies from anomalous patterns caused by noise. In this paper, we propose LSTMAE-UQ (Long Short-Term Memory Autoencoder with Aleatoric and Epistemic Uncertainty Quantification), a novel approach that incorporates both aleatoric (data noise) and epistemic (model) uncertainty for more robust anomaly detection. The model combines the strengths of LSTM networks for capturing complex temporal relationships with autoencoders for unsupervised anomaly detection, and quantifies both uncertainties using Monte Carlo (MC) Dropout, a Bayesian posterior approximation method, enabling the model to better recognize noise. Our experimental results across several real-world datasets show that accounting for uncertainty increases robustness to noise and point outliers, making predictions more reliable for longer periodic sequential data.
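The sketch below illustrates the general recipe named in the abstract: an LSTM autoencoder whose dropout layers stay active at inference (MC Dropout), with a variance head for aleatoric uncertainty and the spread across stochastic forward passes as epistemic uncertainty. It is not the authors' reference implementation; the layer sizes, the heteroscedastic variance head, the Kendall-and-Gal-style uncertainty split, and the anomaly score are assumptions for illustration.

```python
# Minimal sketch (assumed architecture, not the paper's exact implementation)
# of an LSTM autoencoder with MC Dropout for aleatoric + epistemic uncertainty.
import torch
import torch.nn as nn


class LSTMAutoencoderUQ(nn.Module):
    def __init__(self, n_features, hidden_size=64, dropout=0.2):
        super().__init__()
        self.encoder = nn.LSTM(n_features, hidden_size, batch_first=True)
        self.decoder = nn.LSTM(hidden_size, hidden_size, batch_first=True)
        self.dropout = nn.Dropout(dropout)            # kept active at test time
        self.mean_head = nn.Linear(hidden_size, n_features)    # reconstruction
        self.logvar_head = nn.Linear(hidden_size, n_features)  # aleatoric term

    def forward(self, x):
        # x: (batch, seq_len, n_features)
        _, (h, _) = self.encoder(x)
        # Repeat the final hidden state as decoder input for every time step.
        z = self.dropout(h[-1]).unsqueeze(1).repeat(1, x.size(1), 1)
        dec, _ = self.decoder(z)
        dec = self.dropout(dec)
        return self.mean_head(dec), self.logvar_head(dec)


@torch.no_grad()
def mc_dropout_score(model, x, n_samples=30):
    """Run stochastic forward passes and split predictive uncertainty."""
    model.train()  # keep dropout stochastic at inference (MC Dropout)
    means, variances = [], []
    for _ in range(n_samples):
        mu, logvar = model(x)
        means.append(mu)
        variances.append(logvar.exp())
    means = torch.stack(means)                       # (n_samples, batch, seq, feat)
    epistemic = means.var(dim=0)                     # spread of predictive means
    aleatoric = torch.stack(variances).mean(dim=0)   # learned data-noise variance
    recon_error = (means.mean(dim=0) - x).pow(2)
    # Example anomaly score: reconstruction error normalized by total uncertainty.
    return recon_error / (epistemic + aleatoric + 1e-6)
```

In this sketch the model would be trained with a Gaussian negative log-likelihood on the reconstruction (mean and log-variance), so that the variance head learns the data-dependent noise while MC Dropout captures model uncertainty.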