Abstract

The problem of loss adaptation is investigated: given a fixed parameter space, the goal is to construct an estimator that adapts to the loss function, in the sense that it is optimal both globally and locally at every point. For the class of estimator sequences that achieve the minimax rate for estimating the entire function over a fixed Besov space, a lower bound is given on the performance for estimating the function at each point. When the global and local minimax rates of convergence differ, this bound exceeds the usual minimax rate for estimation at a point by a logarithmic factor. Conversely, a lower bound on the maximum global risk is given for estimators that attain the optimal minimax rate of convergence at every point. A key role in the proofs is played by an inequality concerning estimation in a two-parameter statistical problem, which can be regarded as a generalization of an inequality due to Brown and Low and may be of independent interest. Finally, a particular wavelet estimator is constructed that is globally optimal and attains the lower bound on the local risk provided by this inequality.
