Abstract

The detection of events in time series is an important task in several areas of knowledge where operations monitoring is essential. Experts often face the complex task of choosing the most appropriate event detection method for a time series, and there is a demand for benchmarking different methods to guide this choice. For this purpose, standard classification accuracy metrics are usually adopted. However, they are insufficient for a qualitative analysis of a method's tendency to precede or delay event detections. Such an analysis is valuable for applications in which tolerance for "close" detections matters, rather than a focus only on exact ones. In this context, this paper proposes a more comprehensive event detection benchmark process, including an analysis of the temporal bias of detection methods. For that, metrics based on the time distance between event detections and identified events (detection delay) are adopted. Computational experiments were conducted using real-world and synthetic datasets from Yahoo Labs and resources from the Harbinger framework for event detection. Adopting the proposed detection delay-based metrics helped obtain a complete overview of the performance and general behavior of detection methods.
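To make the notion of detection delay concrete, the following is a minimal sketch (not the paper's or Harbinger's implementation) of one plausible way to compute it: for each true event, find the nearest detection in time and record the signed difference. The function name `detection_delays` and the sample timestamps are illustrative assumptions.

```python
def detection_delays(events, detections):
    """Hypothetical sketch: signed delay (detection time minus event time)
    from each true event to its nearest detection. A positive mean suggests
    the method tends to lag events; a negative mean, to precede them."""
    return [min((d - e for d in detections), key=abs) for e in events]

# Illustrative data: true event positions and a method's detections.
events = [10, 50, 90]
detections = [12, 48, 95]
delays = detection_delays(events, detections)
mean_delay = sum(delays) / len(delays)
print(delays, mean_delay)  # per-event delays and the average temporal bias
```

Under this sketch, a mean delay near zero with small spread would indicate a temporally unbiased detector, which is the kind of qualitative behavior the proposed benchmark aims to surface alongside standard accuracy metrics.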
