Abstract

We consider the problem of estimating an unknown $\theta\in {\mathbb{R}}^n$ from noisy observations under the constraint that $\theta$ belongs to certain convex polyhedral cones in ${\mathbb{R}}^n$. Under this setting, we prove bounds for the risk of the least squares estimator (LSE). The obtained risk bound behaves differently depending on the true sequence $\theta$, which highlights the adaptive behavior of the LSE. As special cases of our general result, we derive risk bounds for the LSE in univariate isotonic and convex regression. We study the risk bound in isotonic regression in greater detail: we show that the isotonic LSE converges at a whole range of rates, from $\log n/n$ (when $\theta$ is constant) to $n^{-2/3}$ (when $\theta$ is uniformly increasing in a certain sense). We argue that the bound serves as a benchmark for the risk of any estimator in isotonic regression by proving nonasymptotic local minimax lower bounds. We also prove an analogue of our bound under model misspecification, where the true $\theta$ is not necessarily nondecreasing.
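The isotonic LSE discussed above is the Euclidean projection of the observation vector onto the cone of nondecreasing sequences. As a minimal illustrative sketch (not part of the paper), it can be computed in linear time by the classical pool adjacent violators algorithm (PAVA); the function name `isotonic_lse` is ours:

```python
def isotonic_lse(y):
    """Least squares projection of y onto the nondecreasing cone
    in R^n, computed by the pool adjacent violators algorithm.

    Illustrative sketch; assumes squared-error loss and uniform weights.
    """
    # Each block stores [sum of values, number of points]; the fitted
    # value on a block is its mean.
    blocks = []
    for v in y:
        blocks.append([float(v), 1])
        # Pool adjacent blocks while their means violate monotonicity.
        while (len(blocks) > 1
               and blocks[-2][0] / blocks[-2][1] > blocks[-1][0] / blocks[-1][1]):
            total, count = blocks.pop()
            blocks[-1][0] += total
            blocks[-1][1] += count
    # Expand the block means back to a full-length fitted sequence.
    fit = []
    for total, count in blocks:
        fit.extend([total / count] * count)
    return fit
```

For example, `isotonic_lse([3, 1, 2])` pools all three observations into one block with mean 2, whereas an already nondecreasing input is returned unchanged.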
