Abstract

Change-point analysis is the task of detecting abrupt, significant changes in the underlying model of a signal or time series. Change-point detection methods typically require two inputs: the maximum number of segments to search for and the minimum segment length. However, there is no objective way to pre-specify these two parameters; their appropriate values depend largely on the particular application. Within this framework, a recursive optimization algorithm is developed that explores and fine-tunes these two input parameters and optimally segments a time series. The resulting multiple change-point detection technique therefore addresses a wide class of real-life problems in which identifying optimal level shifts in a time series is the main goal. Extensive simulation results are presented, and a real-life example illustrates the implementation of the developed scheme in practice and demonstrates its capabilities. Concluding remarks and suggestions for future research are also provided.
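For concreteness, the following is a minimal sketch of the standard setting the abstract refers to: an exact dynamic-programming segmentation that takes the two inputs in question, a number of change points and a minimum segment length, as fixed parameters. This is not the paper's recursive algorithm; the function and parameter names (`optimal_segmentation`, `n_bkps`, `min_size`) are illustrative only, and a piecewise-constant least-squares cost is assumed for the example.

```python
import numpy as np

def _seg_cost(cs, cs2, i, j):
    # l2 cost of fitting a constant mean to x[i:j] (within-segment SSE).
    n = j - i
    s = cs[j] - cs[i]
    return (cs2[j] - cs2[i]) - s * s / n

def optimal_segmentation(x, n_bkps, min_size):
    # Exact DP: minimize total within-segment squared error with exactly
    # `n_bkps` change points and every segment at least `min_size` long.
    x = np.asarray(x, dtype=float)
    n = len(x)
    cs = np.concatenate(([0.0], np.cumsum(x)))
    cs2 = np.concatenate(([0.0], np.cumsum(x ** 2)))
    # C[k, t]: best cost of splitting x[:t] into k+1 segments;
    # back[k, t]: position of the last change point in that split.
    C = np.full((n_bkps + 1, n + 1), np.inf)
    back = np.zeros((n_bkps + 1, n + 1), dtype=int)
    for t in range(min_size, n + 1):
        C[0, t] = _seg_cost(cs, cs2, 0, t)
    for k in range(1, n_bkps + 1):
        for t in range((k + 1) * min_size, n + 1):
            for s in range(k * min_size, t - min_size + 1):
                c = C[k - 1, s] + _seg_cost(cs, cs2, s, t)
                if c < C[k, t]:
                    C[k, t], back[k, t] = c, s
    # Backtrack from the full series to recover the change points.
    bkps, t = [], n
    for k in range(n_bkps, 0, -1):
        t = back[k, t]
        bkps.append(t)
    return sorted(bkps)

# Toy usage: a mean-shift signal with true change points at 100 and 200.
rng = np.random.default_rng(0)
signal = np.concatenate([rng.normal(m, 1.0, 100) for m in (0.0, 5.0, 1.0)])
print(optimal_segmentation(signal, n_bkps=2, min_size=10))  # ~ [100, 200]
```

The sketch makes the abstract's point tangible: both `n_bkps` and `min_size` must be supplied up front, and different choices yield different segmentations, which is the gap the paper's recursive scheme is designed to close.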
