Abstract
The asymptotic optimality of Mallows' $C_L$ and generalized cross-validation is demonstrated in the setting of ridge regression. An application is made to spline smoothing in nonparametric regression. A counterexample is given to help understand why GCV may sometimes fail to be asymptotically optimal. The coefficient of variation for the eigenvalues of the information matrix must be large in order to guarantee the optimality of GCV. The proof is based on the connection between GCV and Stein's unbiased risk estimate.
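For reference, the two criteria compared here are usually written as follows for a linear smoother $\hat{\mu}_\lambda = A(\lambda)\,y$; the notation ($A(\lambda)$, $n$, $\sigma^2$) is chosen for illustration and need not match the body of the paper:
% Standard forms of the two selection criteria (notation assumed, not
% taken from this paper): A(\lambda) is the n x n hat/smoother matrix,
% y the response vector, and \sigma^2 the noise variance.
\[
  C_L(\lambda) \;=\; \frac{1}{n}\bigl\|(I - A(\lambda))\,y\bigr\|^2
    \;+\; \frac{2\sigma^2}{n}\,\operatorname{tr} A(\lambda) \;-\; \sigma^2,
  \qquad
  \mathrm{GCV}(\lambda) \;=\;
    \frac{\tfrac{1}{n}\bigl\|(I - A(\lambda))\,y\bigr\|^2}
         {\bigl(\tfrac{1}{n}\operatorname{tr}(I - A(\lambda))\bigr)^2}.
\]
Minimizing either criterion over $\lambda$ selects the ridge (or smoothing) parameter; GCV has the practical advantage of not requiring knowledge of $\sigma^2$.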