Summary

Hydrologic time series are often characterized by temporal changes that give rise to non-stationarity. When the distribution describing the data changes over time, it is important to detect these changes so that correct inferences can be drawn from the data. The Lombard test, a non-parametric rank-based test for detecting change points in the moments of a time series, has recently been used in the hydrologic literature to detect change points in the mean and variance. Little is known, however, about the performance of this test in detecting changes in variance, despite the potentially large impact such shifts can have when dealing with extremes. Here we address this issue in a Monte Carlo simulation framework. We consider a number of situations that can arise in hydrologic time series, examining how the results depend on the magnitude of the shift, the significance level, the sample size, and the location of the change point within the series. Analyses are performed for abrupt changes in variance occurring with and without shifts in the mean. The results show that the power of the test to detect change points in variance is low when the changes are small, is highest when the change point occurs close to the middle of the time series, and increases nonlinearly with sample size. Moreover, the power of the test is greatly reduced by the presence of change points in the mean. We propose removing the change in the mean before testing for change points in variance; simulation results demonstrate that this strategy effectively increases the power of the test. Finally, the Lombard test is applied to annual peak discharge records from 3686 U.S. Geological Survey stream-gaging stations across the conterminous United States, and the results are discussed in light of the insights gained from the simulations.
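The abstract contains no code, but the Monte Carlo power study it describes can be sketched. The Python fragment below is only an illustration under stated assumptions: it stands in for the Lombard test with a generic rank-based (Mood-score) CUSUM statistic whose significance is assessed by permutation, and all function and parameter names (simulate_power, sigma_ratio, tau_frac, remove_mean_shift) are hypothetical, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(42)

def mood_scores(x):
    """Centred, scaled Mood-type rank scores, sensitive to changes in scale."""
    n = len(x)
    ranks = np.argsort(np.argsort(x)) + 1          # ranks 1..n (no ties for continuous data)
    phi = (2 * ranks / (n + 1) - 1) ** 2           # Mood score function
    return (phi - phi.mean()) / phi.std()          # standardise to mean 0, unit variance

def max_cusum_stat(x):
    """Maximum standardised CUSUM of the scale scores over all candidate split points."""
    s = mood_scores(x)
    n = len(s)
    csum = np.cumsum(s)
    k = np.arange(1, n)
    return np.max(np.abs(csum[:-1]) / np.sqrt(k * (n - k) / n))

def permutation_pvalue(x, n_perm=499):
    """Permutation p-value for the scale-change statistic."""
    obs = max_cusum_stat(x)
    perm = np.array([max_cusum_stat(rng.permutation(x)) for _ in range(n_perm)])
    return (1 + np.sum(perm >= obs)) / (n_perm + 1)

def simulate_power(n=100, tau_frac=0.5, sigma_ratio=2.0, mean_shift=0.0,
                   remove_mean_shift=False, alpha=0.05, n_rep=200):
    """Monte Carlo rejection rate for an abrupt variance shift at fraction tau_frac of the series."""
    tau = int(n * tau_frac)
    rejections = 0
    for _ in range(n_rep):
        x = rng.normal(0.0, 1.0, n)
        x[tau:] *= sigma_ratio                     # abrupt shift in standard deviation
        x[tau:] += mean_shift                      # optional concurrent shift in the mean
        if remove_mean_shift:
            # idealised version of the proposed strategy: subtract segment means
            # before testing for a change in variance (here the change location is known)
            x[:tau] -= x[:tau].mean()
            x[tau:] -= x[tau:].mean()
        if permutation_pvalue(x) <= alpha:
            rejections += 1
    return rejections / n_rep
```

For example, comparing simulate_power(mean_shift=1.0) with simulate_power(mean_shift=1.0, remove_mean_shift=True) would illustrate, in this simplified setting, how removing the mean shift before testing can restore power to detect the variance change; varying n, tau_frac and sigma_ratio mirrors the sensitivity analyses described in the abstract.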