Abstract

We study the convergence of $Z$-estimators $\widehat{\theta}(\eta)\in \mathbb{R}^{p}$ whose objective function depends on a parameter $\eta$ belonging to a Banach space $\mathcal{H}$. Our results include uniform consistency over $\mathcal{H}$ and weak convergence in the space of bounded $\mathbb{R}^{p}$-valued functions defined on $\mathcal{H}$. When $\eta$ is a tuning parameter optimally selected at $\eta_{0}$, we provide conditions under which $\eta_{0}$ can be replaced by an estimate $\widehat{\eta}$ without affecting the asymptotic variance. Notably, these conditions involve no rate of convergence of $\widehat{\eta}$ to $\eta_{0}$, but require that the space in which $\widehat{\eta}$ takes its values not be too large in terms of bracketing metric entropy. In particular, we show that Nadaraya-Watson estimators satisfy this entropy condition. We highlight several applications of our results, and we study the case where $\eta$ is the weight function in weighted regression.
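The weighted-regression application mentioned above can be illustrated numerically. The sketch below (a hypothetical example, not the paper's construction) compares a weighted least-squares $Z$-estimator using the oracle weights $\eta_{0}(x)=1/\sigma^{2}(x)$ with the plug-in version using weights built from a Nadaraya-Watson estimate of the conditional variance; model, bandwidth, and sample size are all illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000
theta0 = 2.0
x = rng.uniform(0.5, 2.0, n)
sigma = 0.5 + x                       # heteroscedastic noise level (assumed form)
y = theta0 * x + sigma * rng.standard_normal(n)

def wls(w):
    # Z-estimator: root in theta of sum_i w(x_i) x_i (y_i - theta x_i) = 0
    return np.sum(w * x * y) / np.sum(w * x**2)

# Oracle tuning parameter eta_0: the true inverse-variance weights
theta_oracle = wls(1.0 / sigma**2)

# Plug-in eta_hat: Nadaraya-Watson smoothing of squared residuals
# from a preliminary unweighted fit (bandwidth h is a hypothetical choice)
res2 = (y - wls(np.ones(n)) * x) ** 2
h = 0.2
K = np.exp(-0.5 * ((x[:, None] - x[None, :]) / h) ** 2)
sigma2_hat = K @ res2 / K.sum(axis=1)
theta_plugin = wls(1.0 / sigma2_hat)

print(theta_oracle, theta_plugin)     # both should be close to theta0 = 2.0
```

In this toy setting both estimates target $\theta_{0}$, consistent with the claim that replacing $\eta_{0}$ by $\widehat{\eta}$ need not degrade the estimator when $\widehat{\eta}$ lives in a sufficiently small function class.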
