Abstract
The least absolute deviation (LAD) estimator of the coefficient vector in a linear regression is defined by minimizing the sum of the absolute values of the residuals. However, its loss function is not differentiable. In this study, we propose a convolution-type kernel-smoothed least absolute deviation (SLAD) estimator, obtained by smoothing the LAD objective function in the linear regression setting. Compared with the LAD estimator, the loss function of the SLAD estimator is asymptotically differentiable, and the resulting SLAD estimator can achieve a lower mean squared error. Furthermore, we establish several asymptotic properties of the SLAD method. Numerical studies and a real data analysis confirm that the proposed SLAD method performs remarkably well in finite samples.
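The abstract does not specify the kernel or the bandwidth rule, so the following is only a minimal sketch of a convolution-smoothed LAD objective, assuming a Gaussian kernel, a rule-of-thumb bandwidth, and simulated data; the names `slad_loss` and `slad_fit` are hypothetical, not the paper's implementation.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def slad_loss(beta, X, y, h):
    """Gaussian-kernel convolution-smoothed absolute loss, averaged over residuals."""
    r = y - X @ beta
    # Closed form of int |r - h*v| phi(v) dv = r*(2*Phi(r/h) - 1) + 2*h*phi(r/h);
    # smooth in r for any h > 0, and converging to |r| as h -> 0.
    return np.mean(r * (2.0 * norm.cdf(r / h) - 1.0) + 2.0 * h * norm.pdf(r / h))

def slad_fit(X, y, h):
    """Minimize the smoothed objective from an OLS warm start (illustrative choice)."""
    beta0 = np.linalg.lstsq(X, y, rcond=None)[0]
    res = minimize(slad_loss, beta0, args=(X, y, h), method="BFGS")
    return res.x

# Usage on simulated heavy-tailed data (assumed setup, not from the paper):
rng = np.random.default_rng(0)
n, p = 500, 3
X = rng.standard_normal((n, p))
beta_true = np.array([1.0, -2.0, 0.5])
y = X @ beta_true + rng.standard_t(df=2, size=n)
beta_hat = slad_fit(X, y, h=(p / n) ** 0.5)  # rule-of-thumb bandwidth (assumption)
print(beta_hat)
```

Because the smoothed loss is differentiable everywhere for any fixed bandwidth h > 0, gradient-based solvers such as BFGS apply directly, which is the practical advantage over the non-smooth LAD objective.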