Abstract

Least absolute deviation (LAD) is a robust estimator when the errors follow an asymmetric heavy-tailed distribution or contain outliers. To remain insensitive to such errors while selecting the truly important variables from a large number of predictors in linear regression, this paper introduces a two-stage variable selection method named relaxed LAD lasso, which combines least absolute deviation with the relaxed lasso to obtain robust sparse solutions in the presence of outliers or heavy-tailed errors. Compared with the lasso, this method is not only immune to the rapid growth of noise variables but also attains a better convergence rate of $O_p(n^{-1/2})$. In addition, we prove that the relaxed LAD lasso estimator is consistent in large samples; that is, it selects the truly important variables with probability converging to one. Simulation and empirical results further verify the strong performance of relaxed LAD lasso in terms of prediction accuracy and correct selection of informative variables under heavy-tailed error distributions.
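The sketch below illustrates the general two-stage idea behind a relaxed LAD lasso under common assumptions (it is not the paper's implementation): stage one fits a median regression with an L1 penalty to select predictors, and stage two refits the selected predictors with a lighter penalty phi*lambda. The function name, the thresholding rule, and the choice of scikit-learn's QuantileRegressor (whose pinball loss at quantile 0.5 equals the LAD loss) are illustrative assumptions.

```python
# Minimal sketch of a two-stage "relaxed LAD lasso" (illustrative, not the paper's code).
import numpy as np
from sklearn.linear_model import QuantileRegressor


def relaxed_lad_lasso(X, y, lam=0.1, phi=0.1):
    """Stage 1: LAD lasso (median regression + L1 penalty) selects predictors.
    Stage 2: refit the selected predictors with the relaxed penalty phi * lam."""
    n, p = X.shape
    # quantile=0.5 gives the least-absolute-deviation loss; alpha weights the L1 penalty.
    stage1 = QuantileRegressor(quantile=0.5, alpha=lam, solver="highs").fit(X, y)
    selected = np.flatnonzero(np.abs(stage1.coef_) > 1e-8)
    beta = np.zeros(p)
    if selected.size == 0:
        return beta, selected
    # Relaxed refit on the active set only, with a lighter penalty.
    stage2 = QuantileRegressor(quantile=0.5, alpha=phi * lam, solver="highs")
    stage2.fit(X[:, selected], y)
    beta[selected] = stage2.coef_
    return beta, selected


# Toy usage: sparse signal with heavy-tailed (Cauchy) noise.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 20))
beta_true = np.zeros(20)
beta_true[:3] = [2.0, -1.5, 1.0]
y = X @ beta_true + rng.standard_cauchy(200)
beta_hat, active = relaxed_lad_lasso(X, y, lam=0.05, phi=0.1)
print("selected indices:", active)
```

In this toy setting, lam and phi would in practice be chosen by cross-validation; the relaxation parameter phi controls how much the second-stage shrinkage is reduced relative to the selection stage.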
