Abstract

Although early advocates of absolute-error methods such as Boscovich, Laplace, and Edgeworth all suggested ingenious methods for minimizing sums of absolute errors in bivariate regression problems, it was not until the introduction of the simplex algorithm in the late 1940s, and the formulation of the l1 regression problem as a linear program somewhat later, that a practical, general method for computing absolute-error regression estimates became available. We have already seen that the linear programming formulation of quantile regression is an indispensable tool for understanding its statistical behavior. Like the Euclidean geometry of the least-squares estimator, the polyhedral geometry of minimizing weighted sums of absolute errors plays a crucial role in understanding these methods. This chapter begins with a brief account of the classical theory of linear programming, stressing its geometrical nature and introducing the simplex method. The simplex approach to computing quantile regression estimates is then described, and the special role of simplex-based methods for “sensitivity analysis” is emphasized. Parametric programming in a variety of quantile regression contexts is treated in Section 6.3. Section 6.4 describes some recent developments in computation that rely on “interior point” methods for solving linear programs. These techniques are especially valuable in large quantile regression applications, where the simplex approach becomes impractical. Further gains in computational efficiency are possible by preprocessing the linear programming problems, as described in Section 6.5. Interior point methods are also highly relevant for nonlinear quantile regression problems, a topic addressed in Section 6.6.
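The linear programming formulation mentioned above can be sketched concretely. In the standard formulation (not code from this chapter, but the well-known LP for the quantile regression objective), one splits each residual into positive and negative parts u and v and minimizes the asymmetrically weighted sum τ·1'u + (1−τ)·1'v subject to Xb + u − v = y. The helper name below is hypothetical; the sketch assumes SciPy's `linprog` with the HiGHS solver:

```python
import numpy as np
from scipy.optimize import linprog

def quantile_regression_lp(X, y, tau=0.5):
    """Illustrative sketch: fit quantile regression by solving the LP
        min  tau * 1'u + (1 - tau) * 1'v
        s.t. X b + u - v = y,  u >= 0,  v >= 0,  b free,
    where u and v are the positive and negative parts of the residuals."""
    n, p = X.shape
    # Decision variables ordered as (b, u, v): p + n + n entries.
    c = np.concatenate([np.zeros(p), tau * np.ones(n), (1 - tau) * np.ones(n)])
    A_eq = np.hstack([X, np.eye(n), -np.eye(n)])
    bounds = [(None, None)] * p + [(0, None)] * (2 * n)
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=bounds, method="highs")
    return res.x[:p]

# With an intercept-only design and tau = 0.5, the LP solution is the
# sample median, here 3.0 (the unique minimizer of absolute deviations).
X = np.ones((5, 1))
y = np.array([1.0, 2.0, 3.0, 4.0, 100.0])
beta = quantile_regression_lp(X, y, tau=0.5)
```

For problems of modest size a general-purpose LP solver like this suffices; the chapter's point is that specialized simplex and interior point methods exploit the problem's structure far more efficiently at scale.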
