Testing for a Moving Average Unit Root in Autoregressive Integrated Moving Average Models

Abstract
Test procedures for detecting overdifferencing or a moving average unit root in Gaussian autoregressive integrated moving average (ARIMA) models are proposed. The tests can be used when an autoregressive unit root is a serious alternative but the hypothesis of primary interest implies stationarity of the observed time series. This is the case, for example, when one wishes to test the null hypothesis that a multivariate time series is cointegrated with a given theoretical cointegration vector. A priori knowledge of the mean value of the observations turns out to be crucial for the derivation of our tests. In the special case where the differenced series follows a first-order moving average process, the proposed tests are exact and can be motivated by local optimality arguments. Specifically, when the mean value of the series is known a priori, we obtain a locally best invariant (LBI) test that is identical to a one-sided version of the Lagrange multiplier test. When the mean value is not known a priori, however, this test breaks down, and we derive a locally best invariant unbiased (LBIU) test instead. Having derived tests for this special case, we develop extensions to general ARIMA models. These extended tests are asymptotic, but under the null hypothesis they have the same limiting distributions as in the special case just mentioned: when the mean value is known a priori, an asymptotic χ₁² distribution is obtained, whereas when it is unknown, the limiting distribution agrees with that of the Cramér–von Mises goodness-of-fit test statistic.
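To make the first-order special case concrete, one standard parameterization (the notation here is ours, not taken from the paper) writes the differenced series as a first-order moving average,
\[
\Delta y_t = \varepsilon_t - \theta\,\varepsilon_{t-1}, \qquad \varepsilon_t \ \text{i.i.d.}\ N(0,\sigma^2),
\]
so that the moving average unit root hypothesis \(H_0\colon \theta = 1\) corresponds to overdifferencing: under \(H_0\) the differencing cancels against the MA unit root and the level series \(y_t\) is stationary, whereas under the alternative \(\theta < 1\) the level series retains an autoregressive unit root.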