Abstract
In the random coefficients binary choice model, a binary variable equals 1 iff an index $X^\top\beta$ is positive. The vectors $X$ and $\beta$ are independent and belong to the sphere $\mathbb{S}^{d-1}$ in $\mathbb{R}^{d}$. We prove lower bounds on the minimax risk for estimation of the density $f_{\beta}$ over Besov bodies where the loss is a power of the $L^p(\mathbb{S}^{d-1})$ norm for $1\le p\le \infty$. We show that a hard thresholding estimator based on a needlet expansion with data-driven thresholds achieves these lower bounds up to logarithmic factors.
Highlights
In the random coefficients binary choice model, a binary variable equals 1 iff an index X⊤β is positive
Rates of convergence for the Lp-losses for 1 ≤ p ≤ ∞ over Sobolev ellipsoids based on the same Lp space are obtained under similar assumptions, for choices of the smoothing parameters that depend on unknown parameters of the Sobolev ellipsoids
We show that the estimator in [10] can be written as a plug-in of a linear needlet estimator
Summary
Discrete choice models (see, e.g., [21]) have applications in many areas, ranging from planning of public transportation and economics of industrial organizations to evaluation of public policies. In [10], rates of convergence for the Lp-losses for 1 ≤ p ≤ ∞ over Sobolev ellipsoids based on the same Lp space (as well as confidence intervals for the value of the density at a point, treatment of endogenous regressors, and of models where some coefficients are nonrandom) are obtained under similar assumptions, for choices of the smoothing parameters that depend on unknown parameters of the Sobolev ellipsoids. It is assumed in [10] that the support of β lies in an (unknown) hemisphere, namely, that there exists an (unknown) n in Sd−1 such that P(n⊤β > 0) = 1.
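As a minimal illustration of the data-generating process only (not of the needlet estimator), the model can be simulated by drawing X and β independently and uniformly on the sphere and setting the outcome to 1 iff the index X⊤β is positive. The uniform law for β is an assumption made here for simplicity; the paper only requires that the support of β lie in an unknown hemisphere.

```python
import numpy as np

def sample_sphere(n, d, rng):
    """Draw n points uniformly on the unit sphere S^{d-1} in R^d."""
    v = rng.standard_normal((n, d))
    return v / np.linalg.norm(v, axis=1, keepdims=True)

rng = np.random.default_rng(0)
n, d = 1000, 3  # sample size and ambient dimension (illustrative choices)

# Covariates X and random coefficients beta, drawn independently on S^{d-1}.
X = sample_sphere(n, d, rng)
beta = sample_sphere(n, d, rng)

# Binary outcome: Y = 1 iff the index X^T beta is positive.
Y = (np.sum(X * beta, axis=1) > 0).astype(int)

# Under the rotation-invariant draws above, P(Y = 1) = 1/2 by symmetry.
print(Y.mean())
```

Note that the uniform draw for β does not satisfy the hemisphere support condition of [10]; in practice β would be drawn from the unknown density f_β supported in a hemisphere, which is the object the thresholded needlet estimator targets.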