Abstract

Huber’s criterion is a useful method for robust regression. The adaptive least absolute shrinkage and selection operator (lasso) is a popular technique for simultaneous estimation and variable selection; its adaptive weights allow the resulting estimator to enjoy the oracle properties. In this paper we propose to combine Huber’s criterion with an adaptive lasso penalty. The resulting regression technique is resistant to heavy-tailed errors and outliers in the response. Furthermore, we show that the estimator associated with this procedure enjoys the oracle properties. The approach is compared with the LAD-lasso, which is based on least absolute deviation loss with an adaptive lasso penalty. Extensive simulation studies demonstrate satisfactory finite-sample performance of the procedure, and a real example is analyzed for illustration purposes.

Highlights

  • Data subject to heavy-tailed errors or outliers, which may appear either in the response variables or in the predictors, are commonly encountered in applications

  • In [36], the authors propose to treat the problem of robust model selection by combining the least absolute deviation (LAD) loss and an adaptive lasso penalty

  • The proposed approach is compared with the LAD-lasso, which combines the LAD loss with an adaptive lasso penalty


Summary

Introduction

Data subject to heavy-tailed errors or outliers are commonly encountered in applications; these may appear either in the response variables or in the predictors. We consider here the regression problem with responses subject to heavy-tailed errors or outliers. In this case, the ordinary least squares (OLS) estimator is known to be inefficient. In a second attempt, [39] formulates a convex optimisation problem leading to an estimator that is consistent in variable selection: he assigns adaptive weights that penalize the coefficients differently in the l1 penalty, and calls this new penalty the adaptive lasso. In [36], the authors propose to treat the problem of robust model selection by combining the LAD loss and an adaptive lasso penalty. They obtain an estimator which is robust against outliers and enjoys a sparse representation.
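As a rough illustration of the objective described above (a sketch, not the authors' implementation), the Huber loss can be combined with an adaptive lasso penalty as follows; the tuning constant `c`, the weight exponent `gamma`, and the small stabilizer `eps` are illustrative choices, not values from the paper:

```python
import numpy as np

def huber_loss(r, c=1.345):
    """Huber's rho: quadratic for small residuals, linear for large ones."""
    a = np.abs(r)
    return np.where(a <= c, 0.5 * r ** 2, c * a - 0.5 * c ** 2)

def adaptive_lasso_penalty(beta, beta_init, lam, gamma=1.0, eps=1e-8):
    """Weighted l1 penalty: weights w_j = 1 / |beta_init_j|^gamma penalize
    small initial coefficients more heavily, driving them to zero."""
    w = 1.0 / (np.abs(beta_init) ** gamma + eps)
    return lam * np.sum(w * np.abs(beta))

def objective(beta, X, y, beta_init, lam, c=1.345):
    """Huber's criterion plus adaptive lasso penalty on the coefficients."""
    r = y - X @ beta
    return huber_loss(r, c).sum() + adaptive_lasso_penalty(beta, beta_init, lam)
```

A pilot estimator (e.g. an unpenalized robust fit) supplies `beta_init`; minimizing `objective` over `beta` then yields a sparse, outlier-resistant estimate.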

Lasso-type estimator
Robust lasso-type estimator
The Huber’s Criterion with adaptive lasso
Tuning parameter estimation
Some remarks on scale invariance
Theoretical properties
Simulation results
Models used for simulations
Prediction accuracy
Selection ability
Hyperparameter choices
Comparison results
A real example
Proof of Lemma 1
Proof of Lemma 3
Lemma 4 and its proof
Computations

