Abstract

When the unknown regression function of a single variable is known to have derivatives up to the $(\gamma+1)$th order bounded in absolute value by a common constant everywhere or a.e. (i.e., a $(\gamma+1)$th degree of smoothness), the minimax optimal rate of the mean integrated squared error (MISE) is stated as $\left(\frac{1}{n}\right)^{\frac{2\gamma+2}{2\gamma+3}}$ in the literature. This paper shows that: (i) if $n \leq \left(\gamma+1\right)^{2\gamma+3}$, the minimax optimal MISE rate is $\frac{\log n}{n\log(\log n)}$ and the optimal degree of smoothness to exploit is roughly $\max\left\{\frac{\log n}{2\log(\log n)},\, 1\right\}$; (ii) if $n > \left(\gamma+1\right)^{2\gamma+3}$, the minimax optimal MISE rate is $\left(\frac{1}{n}\right)^{\frac{2\gamma+2}{2\gamma+3}}$ and the optimal degree of smoothness to exploit is $\gamma+1$. The fundamental contribution of this paper is a set of metric entropy bounds we develop for smooth function classes. Some of our bounds are original, and some improve and/or generalize those in the literature (e.g., Kolmogorov and Tikhomirov, 1959). Our metric entropy bounds allow us to show phase transitions in the minimax optimal MISE rates associated with some commonly seen smoothness classes as well as non-standard smoothness classes, and they can also be of independent interest outside of nonparametric regression problems.
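As an illustrative sketch (not part of the paper), the two claimed rates, the phase-transition boundary $n = (\gamma+1)^{2\gamma+3}$, and the rough optimal smoothness $\max\{\log n / (2\log\log n),\, 1\}$ can be evaluated numerically to see which regime applies for a given $(n, \gamma)$; all function names below are hypothetical:

```python
import math

def boundary(gamma):
    # Phase-transition sample size: n = (gamma + 1)^(2*gamma + 3).
    # For n at or below this value, case (i) of the abstract applies.
    return (gamma + 1) ** (2 * gamma + 3)

def classical_rate(n, gamma):
    # Case (ii) rate: (1/n)^((2*gamma + 2)/(2*gamma + 3)).
    return (1 / n) ** ((2 * gamma + 2) / (2 * gamma + 3))

def log_regime_rate(n):
    # Case (i) rate: log(n) / (n * log(log(n))); requires n > e.
    return math.log(n) / (n * math.log(math.log(n)))

def optimal_smoothness(n):
    # Rough optimal degree of smoothness to exploit in case (i):
    # max{log(n) / (2 * log(log(n))), 1}.
    return max(math.log(n) / (2 * math.log(math.log(n))), 1.0)

# Example: with gamma = 1, the boundary is 2^5 = 32, so n = 1000
# falls into case (ii) and the classical rate n^(-4/5) applies.
if __name__ == "__main__":
    print(boundary(1), classical_rate(1000, 1), optimal_smoothness(1000))
```

This makes the phase transition concrete: for fixed $\gamma$, once $n$ exceeds $(\gamma+1)^{2\gamma+3}$ the classical polynomial rate takes over, while for smaller $n$ the $\frac{\log n}{n\log\log n}$ rate governs and the smoothness worth exploiting grows only like $\log n / (2\log\log n)$.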
