Abstract
Jones and Copas (1986) present theoretical and simulation results on the relative merits of a Stein predictor (Copas, 1983) and the ordinary least squares predictor in the usual linear multiple regression model, when certain distributional properties of the regressor variables arising in the past differ from those for which predictions are to be made. Here the analysis is extended to the practical situation in which the true regression parameters are unknown. A hypothesis testing procedure is developed to help determine whether the shrinkage or the least squares predictor is preferable in any given instance. This approach is used to explain some empirical evidence on the comparative merits of the two procedures recently given by Berk (1984).
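
A minimal simulation sketch of the comparison the abstract describes, assuming a simple Stein-type shrinkage of the OLS coefficients toward zero rather than the exact Copas (1983) predictor or the paper's hypothesis test; the shrinkage factor, the distributional shift of the future regressors, and all numerical settings are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 50, 6
beta = rng.normal(scale=0.3, size=p)          # true (unknown) coefficients
sigma = 1.0

# Past data: regressors drawn from one distribution
X = rng.normal(size=(n, p))
y = X @ beta + rng.normal(scale=sigma, size=n)

# Ordinary least squares fit
b_ols, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ b_ols
s2 = resid @ resid / (n - p)                  # residual variance estimate

# Illustrative Stein-type shrinkage toward zero:
# K = max(0, 1 - (p - 2) * s2 / (b' X'X b))
fit_ss = b_ols @ (X.T @ X) @ b_ols
K = max(0.0, 1.0 - (p - 2) * s2 / fit_ss)
b_stein = K * b_ols

# Future regressors: a shifted and rescaled distribution relative to the past
X_new = 1.5 * rng.normal(size=(1000, p)) + 0.5
y_new = X_new @ beta + rng.normal(scale=sigma, size=1000)

mse_ols = np.mean((y_new - X_new @ b_ols) ** 2)
mse_stein = np.mean((y_new - X_new @ b_stein) ** 2)
print(f"shrinkage factor K = {K:.3f}")
print(f"prediction MSE  OLS: {mse_ols:.3f}   Stein-type: {mse_stein:.3f}")
```

Repeating the simulation over many draws (or over different amounts of shift in the future regressors) shows when shrinkage helps and when least squares is preferable, which is the question the paper's testing procedure is designed to answer from the data alone.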
