Abstract
In this article I present a new approach to modeling the variability of financial time series more realistically. I develop a Markov-ARCH model that incorporates features of both Hamilton's switching-regime model and Engle's autoregressive conditional heteroscedasticity (ARCH) model to examine the issue of volatility persistence in the monthly excess returns of the three-month Treasury bill. The issue can be resolved by taking into account occasional shifts in the asymptotic variance of the Markov-ARCH process, which induce spurious persistence in the volatility process. I identify two periods during which there is a regime shift: 1974:2–1974:8, associated with the oil shock, and 1979:9–1982:8, associated with the Federal Reserve's policy change. The variance approached asymptotically in these two episodes is more than 10 times as high as the asymptotic variance for the remainder of the sample. I conclude that regime shifts have a greater impact on the properties of the data than ARCH effects do, and I cannot reject the null hypothesis of no ARCH effects within the regimes. These findings call into question previous empirical results that adopt an ARCH approach to modeling interest-rate dynamics at monthly or lower frequencies.
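A generic Markov-ARCH specification of the kind the abstract describes can be sketched as follows; this is an illustrative form combining a first-order Markov regime process with an ARCH(q) conditional variance, not necessarily the exact specification estimated in the article:

```latex
% Illustrative Markov-ARCH(q) sketch (assumed generic form;
% the article's estimated specification may differ).
% Excess-return innovation with a regime-dependent scale factor g_{S_t}:
u_t = \sqrt{g_{S_t}}\,\epsilon_t, \qquad S_t \in \{1,\dots,K\}
% Latent regime S_t follows a first-order Markov chain:
\Pr(S_t = j \mid S_{t-1} = i) = p_{ij}
% ARCH(q) conditional variance of the within-regime innovation \epsilon_t:
h_t = \alpha_0 + \sum_{i=1}^{q} \alpha_i\,\epsilon_{t-i}^2
```

Under this form, a shift in $g_{S_t}$ changes the asymptotic variance of $u_t$ across regimes, so apparent persistence in volatility can arise from the regime process even when the within-regime ARCH coefficients $\alpha_1,\dots,\alpha_q$ are zero.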