Abstract

Since the time of Gauss, it has been generally accepted that $\ell_2$-methods of combining observations by minimizing sums of squared errors have significant computational advantages over earlier $\ell_1$-methods based on minimization of absolute errors advocated by Boscovich, Laplace and others. However, $\ell_1$-methods are known to have significant robustness advantages over $\ell_2$-methods in many applications, and related quantile regression methods provide a useful, complementary approach to classical least-squares estimation of statistical models. Combining recent advances in interior point methods for solving linear programs with a new statistical preprocessing approach for $\ell_1$-type problems, we obtain a 10- to 100-fold improvement in computational speeds over current (simplex-based) $\ell_1$-algorithms in large problems, demonstrating that $\ell_1$-methods can be made competitive with $\ell_2$-methods in terms of computational speed throughout the entire range of problem sizes. Formal complexity results suggest that $\ell_1$-regression can be made faster than least-squares regression for $n$ sufficiently large and $p$ modest.
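As context for the interior-point claim, here is a minimal sketch of how $\ell_1$-regression reduces to a linear program that an interior-point solver can handle. This is not the paper's algorithm or its preprocessing step; it uses SciPy's off-the-shelf HiGHS interior-point method, and the function name `l1_regression` is illustrative. Each residual $r_i = y_i - x_i^\top\beta$ is split as $r_i = u_i - v_i$ with $u_i, v_i \ge 0$, so that $|r_i| = u_i + v_i$ at the optimum.

```python
import numpy as np
from scipy.optimize import linprog

def l1_regression(X, y):
    """Solve min_beta sum_i |y_i - x_i' beta| as a linear program.

    Decision variables are (beta, u, v), where the residual is split
    as y_i - x_i'beta = u_i - v_i with u_i, v_i >= 0, so the objective
    sum_i (u_i + v_i) equals the sum of absolute residuals.
    """
    n, p = X.shape
    # Objective: zero weight on beta, unit weight on u and v.
    c = np.concatenate([np.zeros(p), np.ones(n), np.ones(n)])
    # Equality constraints: X beta + u - v = y.
    A_eq = np.hstack([X, np.eye(n), -np.eye(n)])
    b_eq = y
    # beta is free; u and v are nonnegative.
    bounds = [(None, None)] * p + [(0, None)] * (2 * n)
    # "highs-ipm" selects HiGHS's interior-point solver; this is a
    # generic substitute for the paper's specialized algorithm.
    res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs-ipm")
    return res.x[:p]

# Usage on synthetic heavy-tailed data, where the robustness of the
# l1 fit relative to least squares is most visible.
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(100), rng.normal(size=(100, 2))])
beta_true = np.array([1.0, 2.0, -0.5])
y = X @ beta_true + rng.standard_t(df=2, size=100)
print(l1_regression(X, y))
```

This dense formulation illustrates the LP structure only; the speedups reported in the abstract come from exploiting sparsity in the constraint matrix and from statistical preprocessing that shrinks the effective problem size, neither of which this sketch attempts.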
