Abstract
This article studies inference in the high-dimensional linear regression model with outliers. Sparsity constraints are imposed on the vector of coefficients of the covariates. The number of outliers can grow with the sample size while their proportion tends to 0. We propose a two-step procedure for inference on the coefficients of a fixed subset of regressors. The first step is based on several square-root lasso ℓ1-norm penalized estimators, while the second step is the ordinary least squares estimator applied to a well-chosen regression. We establish the asymptotic normality of the two-step estimator. The proposed procedure is efficient in the sense that it attains the semiparametric efficiency bound when applied to the model without outliers under homoscedasticity. The approach is also computationally advantageous: it amounts to solving a finite number of convex optimization programs.
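The two-step logic can be illustrated with a minimal numerical sketch. This is not the paper's procedure: it substitutes a plain lasso (solved by ISTA proximal gradient) for the square-root lasso first stage, omits the outlier-robust features, and refits OLS on the selected support rather than on the paper's "well-chosen regression". All function names, the penalty level, and the data-generating setup are illustrative assumptions.

```python
import numpy as np

def lasso_ista(X, y, lam, n_iter=500):
    # Plain lasso via ISTA (proximal gradient) -- an illustrative stand-in
    # for the square-root lasso first-stage estimator.
    n, p = X.shape
    beta = np.zeros(p)
    # Lipschitz constant of the gradient of the smooth part (1/2n)||y - Xb||^2
    L = np.linalg.norm(X, 2) ** 2 / n
    for _ in range(n_iter):
        grad = X.T @ (X @ beta - y) / n
        z = beta - grad / L
        # Soft-thresholding (proximal operator of the l1 penalty)
        beta = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)
    return beta

def two_step(X, y, lam):
    # Step 1: sparse penalized estimate selects a set of active covariates.
    beta1 = lasso_ista(X, y, lam)
    support = np.flatnonzero(np.abs(beta1) > 1e-8)
    # Step 2: OLS refit on the selected covariates removes shrinkage bias.
    beta2 = np.zeros(X.shape[1])
    if support.size:
        beta2[support], *_ = np.linalg.lstsq(X[:, support], y, rcond=None)
    return beta2
```

The OLS refit in the second step is what makes standard asymptotic-normality arguments available for the coefficients of interest, since it undoes the bias introduced by the ℓ1 shrinkage of the first stage.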