Abstract

In solving robust linear regression problems, the parameter vector x and an additional parameter s that scales the residuals must be estimated simultaneously. A widely used method for doing so consists of first improving the scale parameter s for fixed x, and then improving x for fixed s by using a quadratic approximation to the objective function g. Since improving x is the expensive part of such algorithms, it makes sense to define the new scale s as a minimizer of g for fixed x. A strong global convergence analysis of this conceptual algorithm is given for a class of convex criterion functions and the so-called H- or W-approximations to g. Moreover, appropriate finite and iterative subalgorithms for minimizing g with respect to s are discussed. Furthermore, the possibility of transforming the robust regression problem into a nonlinear least-squares problem is considered. All algorithms described here were tested on a set of test problems, and their computational efficiency was compared with that of published algorithms.
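The alternating scheme the abstract describes can be illustrated with a minimal sketch, assuming Huber's criterion function, a fixed-point scale update (in the spirit of Huber's Proposal 2), and a weighted least-squares x-step. The function names, tuning constant, and update rules below are illustrative assumptions, not the paper's exact H- or W-approximation algorithm.

```python
# Minimal sketch of the alternating scheme: improve s for fixed x,
# then improve x for fixed s.  Assumptions: Huber's psi, a fixed-point
# scale update, and an IRLS step standing in for the H-/W-approximations.
import numpy as np

def huber_psi(u, c=1.345):
    """Huber's psi function: the identity clipped at +/- c."""
    return np.clip(u, -c, c)

def robust_fit(A, b, c=1.345, n_iter=100, tol=1e-8):
    """Alternately improve the scale s for fixed x and improve x
    for fixed s via one weighted least-squares step."""
    x = np.linalg.lstsq(A, b, rcond=None)[0]         # ordinary LS start
    r = b - A @ x
    s = max(np.median(np.abs(r)) / 0.6745, 1e-12)    # MAD start
    beta = 0.7102  # E[psi(Z)^2] for Z ~ N(0,1), c = 1.345 (consistency)
    for _ in range(n_iter):
        # Step 1: improve s for fixed x (fixed-point iteration).
        s *= np.sqrt(np.mean(huber_psi(r / s, c) ** 2) / beta)
        # Step 2: improve x for fixed s.  For Huber's psi the IRLS
        # weights are w(u) = psi(u)/u = min(1, c/|u|).
        u = r / s
        w = np.minimum(1.0, c / np.maximum(np.abs(u), 1e-12))
        sw = np.sqrt(w)
        x_new = np.linalg.lstsq(A * sw[:, None], b * sw, rcond=None)[0]
        done = np.linalg.norm(x_new - x) <= tol * (1.0 + np.linalg.norm(x))
        x = x_new
        r = b - A @ x
        if done:
            break
    return x, s

# Example: fit a line to data containing one gross outlier.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 30)
A = np.column_stack([np.ones_like(t), t])
b = 2.0 + 3.0 * t + 0.05 * rng.standard_normal(30)
b[5] += 10.0                                         # outlier
x_hat, s_hat = robust_fit(A, b)
print(x_hat, s_hat)
```

Updating s by a cheap scalar iteration before each x-step mirrors the abstract's observation that the x-update dominates the cost, so the scale can be refreshed essentially for free.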
