Abstract

Least absolute deviation (LAD) regression is an important alternative to ordinary least squares (OLS) regression in linear models. A surprising result in Li and Duan (1989) showed that OLS can be used for dimension reduction in single-index models as long as the predictor distribution satisfies a global linear conditional mean assumption. The proposal in Li and Duan (1989) has two limitations. First, it is well known that OLS is sensitive to outliers and fails in the case of heavy-tailed error distributions. Second, the global linearity assumption for the predictor distribution can be violated when there is a nonlinear relationship among the predictors. To address these limitations, cluster-based LAD for dimension reduction is proposed in this article. By inheriting the benefit of LAD over OLS in linear models, our proposal becomes more robust to outliers and heavy-tailed error distributions. We also replace the global linearity assumption with the more flexible local linearity assumption through k-means clustering.
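To make the idea concrete, below is a minimal sketch of one way the cluster-based LAD strategy could be implemented, assuming k-means is used to partition the predictor space, LAD is computed as median (0.5-quantile) regression within each cluster, and the per-cluster slope vectors are combined by a size-weighted average. The function name `cluster_lad_direction` and all tuning choices here are illustrative assumptions, not the paper's exact estimator.

```python
# Hedged sketch of cluster-based LAD for estimating a single-index direction.
# Assumptions (not from the paper): k-means partitions the predictors, LAD is
# fit as quantile regression at q = 0.5 within each cluster, and slopes are
# averaged with cluster-size weights. The published estimator may differ.
import numpy as np
from sklearn.cluster import KMeans
import statsmodels.api as sm

def cluster_lad_direction(X, y, n_clusters=3, random_state=0):
    """Estimate a single-index direction by averaging within-cluster LAD slopes."""
    labels = KMeans(n_clusters=n_clusters, n_init=10,
                    random_state=random_state).fit_predict(X)
    directions, weights = [], []
    for c in range(n_clusters):
        mask = labels == c
        if mask.sum() <= X.shape[1] + 1:      # skip clusters too small to fit
            continue
        # LAD regression = quantile regression at the median (q = 0.5)
        fit = sm.QuantReg(y[mask], sm.add_constant(X[mask])).fit(q=0.5)
        directions.append(fit.params[1:])     # drop the intercept
        weights.append(mask.sum())
    beta_hat = np.average(directions, axis=0, weights=weights)
    return beta_hat / np.linalg.norm(beta_hat)  # direction identified up to scale

# Toy usage: heavy-tailed errors, the setting where OLS-based reduction struggles.
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 4))
beta_true = np.array([1.0, 2.0, 0.0, 0.0]) / np.sqrt(5.0)
y = np.sin(X @ beta_true) + 0.2 * rng.standard_t(df=1, size=500)
print(cluster_lad_direction(X, y, n_clusters=4))
```

The size-weighted averaging step is a simplification; within-cluster slopes in a single-index model are only identified up to scale, so a practical estimator would also need to align signs or pool the clusters more carefully.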
