Abstract

Ridge regression is a well-known technique for estimating the coefficients of a linear model. The method of regularization is a similar approach, commonly used to solve underdetermined linear equations from discrete, noisy data. When applying either technique, the choice of the smoothing (or regularization) parameter $h$ is crucial. Generalized cross-validation (GCV) and Mallows' $C_L$ are two popular methods for estimating a good value of $h$ from the data. Their asymptotic properties, such as consistency and asymptotic optimality, have been extensively studied [Craven and Wahba (1979); Golub, Heath and Wahba (1979); Speckman (1985)]. Very interesting convergence results for the actual (random) parameter given by GCV and $C_L$ have been shown by Li (1985, 1986). Recently, Girard (1987, 1989) has proposed fast randomized versions of GCV and $C_L$. The purpose of this paper is to show that the above convergence results also hold for these new methods.
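
To make the criteria concrete, here is a minimal NumPy sketch (not taken from the paper) of the exact GCV score for ridge regression and of a randomized variant in the spirit of Girard, where the trace of the influence matrix $A(h) = X(X'X + hI)^{-1}X'$ is replaced by a single-probe stochastic estimate. The function names, the Gaussian probe vector `w`, and the grid search are assumptions made for illustration; the practical speedup of the randomized version shows up when $A(h)$ can only be applied implicitly to vectors, not in this small dense example.

```python
import numpy as np

def hat_apply(X, h, v):
    """Apply the ridge influence matrix A(h) = X (X'X + h I)^{-1} X' to v."""
    p = X.shape[1]
    return X @ np.linalg.solve(X.T @ X + h * np.eye(p), X.T @ v)

def gcv(X, y, h):
    """Exact GCV score: V(h) = n * ||(I - A(h)) y||^2 / (n - tr A(h))^2."""
    n, p = X.shape
    resid = y - hat_apply(X, h, y)
    G = X.T @ X
    tr_A = np.trace(np.linalg.solve(G + h * np.eye(p), G))  # tr A(h), exact
    return n * (resid @ resid) / (n - tr_A) ** 2

def randomized_gcv(X, y, h, w):
    """Randomized GCV (illustrative): tr A(h) is replaced by the stochastic
    estimate n * w'A(h)w / (w'w), with w a fixed Gaussian probe vector."""
    n = X.shape[0]
    tr_est = n * (w @ hat_apply(X, h, w)) / (w @ w)
    resid = y - hat_apply(X, h, y)
    return n * (resid @ resid) / (n - tr_est) ** 2

# Usage sketch: pick h by minimizing each criterion over a grid,
# reusing one probe w across the whole grid so the randomized score
# is a single random function of h.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 10))
y = X @ rng.standard_normal(10) + rng.standard_normal(200)
w = rng.standard_normal(200)
grid = np.logspace(-3, 3, 25)
h_gcv = grid[np.argmin([gcv(X, y, h) for h in grid])]
h_rgcv = grid[np.argmin([randomized_gcv(X, y, h, w) for h in grid])]
```

The convergence results discussed in the paper concern the minimizer of such a random criterion: the claim is that the optimality properties established by Li for exact GCV and $C_L$ carry over to these randomized scores.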
