Abstract

Let $f_n^\lambda$ be the regularized solution of a general, linear operator equation, $Kf_0 = g$, from discrete, noisy data $y_i = g(x_i) + \varepsilon_i$, $i = 1, \ldots, n$, where $\varepsilon_i$ are uncorrelated random errors. We consider the prominent method of generalized cross-validation (GCV) for choosing the crucial regularization parameter $\lambda$. The practical GCV estimate $\hat{\lambda}_V$ and its "expected" counterpart $\lambda_V$ are defined as the minimizers of the GCV function $V(\lambda)$ and $EV(\lambda)$, respectively, where $E$ denotes expectation. We investigate the asymptotic performance of $\lambda_V$ with respect to each of the following loss functions: the risk, an $L_2$ norm on the output error $Kf_n^\lambda - g$, and a whole class of stronger norms on the input error $f_n^\lambda - f_0$. In the special cases of data smoothing and Fourier differentiation, it is known that as $n \to \infty$, $\lambda_V$ is asymptotically optimal (ao) with respect to the risk criterion. We show this to be true in general, and also extend it to the $L_2$ norm criterion. The asymptotic optimality is independent of the error variance, the ill-posedness of the problem, and the smoothness index of the solution $f_0$. For the input error criterion, it is shown that $\lambda_V$ is weakly ao for a certain class of $f_0$ if the smoothness of $f_0$ relative to the regularization space is not too high, but otherwise $\lambda_V$ is sub-optimal. This result is illustrated in the case of numerical differentiation.
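The abstract does not display the GCV function itself. For orientation, in the standard formulation from the GCV literature (Golub, Heath and Wahba, 1979), if $A(\lambda)$ denotes the $n \times n$ influence matrix mapping the data $y = (y_1, \ldots, y_n)^T$ to the fitted values $(Kf_n^\lambda(x_1), \ldots, Kf_n^\lambda(x_n))^T$, the GCV function is

$$V(\lambda) = \frac{\tfrac{1}{n}\,\bigl\| (I - A(\lambda))\, y \bigr\|^2}{\Bigl[ \tfrac{1}{n}\,\mathrm{tr}\bigl(I - A(\lambda)\bigr) \Bigr]^2},$$

and $\hat{\lambda}_V$ is its minimizer over $\lambda > 0$; taking the expectation over the noise gives $EV(\lambda)$, whose minimizer is $\lambda_V$.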
