Abstract
This study proposes an empirical approach for accounting for lag-1 autocorrelation when detecting mean shifts in time series of white or red (first-order autoregressive) Gaussian noise, using the penalized maximal t test or the penalized maximal F test. The empirical approach is embedded in a stepwise testing algorithm, so that the resulting algorithms can be used to detect single or multiple changepoints in a time series. The detection power of the new algorithms is evaluated through Monte Carlo simulations, which show that they are both accurate and computationally fast in detecting single or multiple changepoints. Examples of their application to real climate data series (surface pressure and wind speed) are presented. An open-source software package (in R and FORTRAN) implementing the algorithms, along with a user manual, has been developed and made freely available online.
Keywords