Abstract

Let $Y\in\R^n$ be a random vector with mean $s$ and covariance matrix $\sigma^2P_n\tra{P_n}$, where $P_n$ is some known $n\times n$ matrix. We construct a statistical procedure to estimate $s$ as well as possible under a moment condition on $Y$ or a Gaussian hypothesis. Both cases are developed for known or unknown $\sigma^2$. Our approach is free from any prior assumption on $s$ and is based on non-asymptotic model selection methods. Given some collection of linear spaces $\{S_m, m\in\M\}$, we consider, for any $m\in\M$, the least-squares estimator $\hat{s}_m$ of $s$ in $S_m$. Considering a penalty function that is not linear in the dimensions of the $S_m$'s, we select some $\hat{m}\in\M$ in order to obtain an estimator $\hat{s}_{\hat{m}}$ with a quadratic risk as close as possible to the minimal one among the risks of the $\hat{s}_m$'s. Non-asymptotic oracle-type inequalities and minimax convergence rates are proved for $\hat{s}_{\hat{m}}$. Special attention is given to the estimation of a non-parametric component in additive models. Finally, we carry out a simulation study to illustrate the performance of our estimators in practice.
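As a sketch of the selection step described above (the precise penalty is given in the body of the paper, not in this abstract), the procedure has the generic penalized least-squares form

$$\hat{s}_m \in \operatorname*{arg\,min}_{t\in S_m}\|Y-t\|^2, \qquad \hat{m} \in \operatorname*{arg\,min}_{m\in\M}\left\{\|Y-\hat{s}_m\|^2 + \mathrm{pen}(m)\right\},$$

where $\|\cdot\|$ denotes the Euclidean norm on $\R^n$ and $\mathrm{pen}:\M\to\R_+$ is a penalty function, here non-linear in $\dim(S_m)$.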
