Given a time series in R^n with a piecewise constant mean and independent noise, we propose an exact dynamic programming algorithm for minimizing a least-squares criterion with a multiscale penalty that favors well-spread changepoints. This penalty was proposed by Verzelen et al. and achieves optimal rates for changepoint detection and changepoint localization in a non-asymptotic setting. Our proposed algorithm, Multiscale Functional Pruning Optimal Partitioning (Ms.FPOP), extends the functional pruning ideas of Rigaill and Maidstone et al. to multiscale penalties. For large signals (n ≥ 10^5) with sparse changepoints, Ms.FPOP is shown empirically to be quasi-linear and faster than the Pruned Exact Linear Time (PELT) method of Killick et al. applied to the multiscale penalty of Verzelen et al., which exhibits a quadratic slowdown in these cases. We propose an efficient implementation of Ms.FPOP, coded in C++ and interfaced with R, that can segment profiles of up to n = 10^6 in a matter of seconds. Our algorithm works for slightly more general multiscale penalties; in particular, it allows a minimum segment length to be imposed. Using simple simulations, we then show that when profiles are sufficiently large (n ≥ 10^4), Ms.FPOP with the multiscale penalty of Verzelen et al. is typically more powerful than optimizing a least-squares criterion with the BIC penalty of Yao, a criterion shown by Fearnhead and Rigaill to perform well across a wide range of scenarios. Supplementary materials for this article are available online.
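To make the penalized least-squares objective concrete, the following is a minimal sketch in R of the naive O(n^2) optimal-partitioning recursion for a piecewise-constant mean with a segment-length-dependent penalty of the schematic form beta + alpha * log(n / segment length). The function name multiscale_op, the default constants, and this exact penalty form are illustrative assumptions, not the calibrated penalty of Verzelen et al., and the quadratic double loop is precisely what the functional pruning in Ms.FPOP avoids.

multiscale_op <- function(y, alpha = 2.5, beta = 2.5) {
  # Naive O(n^2) optimal-partitioning recursion for a piecewise-constant mean.
  # Per-segment penalty beta + alpha * log(n / segment length) is a schematic
  # multiscale penalty; the constants here are illustrative assumptions.
  n <- length(y)
  csum  <- c(0, cumsum(y))    # prefix sums of y
  csum2 <- c(0, cumsum(y^2))  # prefix sums of y^2
  seg_cost <- function(s, t) {  # least-squares cost of one mean on y[(s+1):t]
    (csum2[t + 1] - csum2[s + 1]) - (csum[t + 1] - csum[s + 1])^2 / (t - s)
  }
  best <- c(0, rep(Inf, n))   # best[t + 1] = optimal penalized cost of y[1:t]
  last <- integer(n)          # last[t] = changepoint preceding t in the optimum
  for (t in 1:n) {
    for (s in 0:(t - 1)) {
      cand <- best[s + 1] + seg_cost(s, t) + beta + alpha * log(n / (t - s))
      if (cand < best[t + 1]) { best[t + 1] <- cand; last[t] <- s }
    }
  }
  tau <- integer(0); t <- n   # backtrack the optimal changepoints
  while (t > 0) { tau <- c(last[t], tau); t <- last[t] }
  tau[-1]                     # drop the leading 0 (start of the signal)
}

For instance, multiscale_op(c(rnorm(100), rnorm(100, mean = 3))) should typically recover a single changepoint near position 100. Ms.FPOP returns the exact minimizer of its multiscale criterion while pruning most candidate values of s, which is what yields the quasi-linear runtimes reported above.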