Abstract

We study the problem of estimating θ from data Y ~ N(θ, σ²) under squared-error loss. We define three new scalar minimax problems in which the risk is weighted by the size of θ. Simple thresholding gives asymptotically minimax estimates in all three problems. We indicate the relationships of the new problems to each other and to two other neo-classical problems: the problems of the bounded normal mean and of the risk-constrained normal mean. Via the wavelet transform, these results have implications for adaptive function estimation in two settings: estimating functions of unknown type and degree of smoothness in a global ℓ2 norm; and estimating a function of unknown degree of local Hölder smoothness at a fixed point. In the latter setting, the scalar minimax results imply: Lepskii's result that it is not possible to adapt fully to the unknown degree of smoothness without incurring a performance cost; and that simple thresholding of the empirical wavelet transform gives an estimate of a function at a fixed point which is, to within constants, optimally adaptive to the unknown degree of smoothness.
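The adaptivity result rests on simple thresholding of the empirical wavelet transform. The sketch below is a minimal illustration of that idea, not the paper's exact construction: it assumes the PyWavelets package, a hypothetical function name, and the conventional threshold σ√(2 log n), used here only for concreteness.

```python
# Minimal sketch (assumptions: PyWavelets available, orthonormal DWT,
# i.i.d. Gaussian noise of known standard deviation sigma).
import numpy as np
import pywt

def threshold_estimate(y, sigma, wavelet="db4", level=None):
    """Estimate f from noisy samples y_i = f(t_i) + sigma * z_i by
    hard-thresholding the empirical wavelet coefficients."""
    n = len(y)
    coeffs = pywt.wavedec(y, wavelet, level=level)
    thresh = sigma * np.sqrt(2.0 * np.log(n))  # illustrative choice of threshold
    # Keep the coarse approximation; threshold only the detail coefficients.
    new_coeffs = [coeffs[0]] + [
        pywt.threshold(c, thresh, mode="hard") for c in coeffs[1:]
    ]
    return pywt.waverec(new_coeffs, wavelet)
```

Because the discrete wavelet transform is orthonormal, the noise level in the coefficient domain equals σ, so a single scalar threshold can be applied at every resolution level.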
