Abstract

The difficulty in computing the least median of squares (LMS) estimate in multiple linear regression is due to the nondifferentiability and many local minima of the objective function. Several approximate, but not exact, algorithms have been suggested. This paper presents a method for computing the exact value of the LMS estimate in multiple linear regression. The LMS estimate is a special case of the least quantile of squares (LQS) estimate, which minimizes the $q$th smallest squared residual for a given data set. For LMS, $q = [n/2] + [(p + 1)/2]$, where $[\,\cdot\,]$ is the greatest integer function, $n$ is the sample size, and $p$ is the number of columns in the $X$ matrix. The algorithm can compute a range of exact LQS estimates in multiple linear regression by considering $\binom{n}{p+1}$ possible $\theta$ values. It is based on the fact that each LQS estimate is the Chebyshev (or minimax) fit to some $q$-element subset of the data. This yields a surprisingly easy algorithm for computing the exact LQS estimates. These and other estimates are used to study the stability of the LMS estimate in several examples.
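The enumeration described above can be sketched in a few lines. The following is a minimal, hedged illustration (function names are ours, not the paper's): each candidate $\theta$ is the Chebyshev (minimax) fit to a $(p+1)$-element subset, computed here via a small linear program, and each candidate is scored by the $q$th smallest squared residual on the full data set.

```python
# Brute-force sketch of the exact LQS/LMS idea: enumerate the C(n, p+1)
# subsets of size p+1, take each subset's Chebyshev (minimax) fit as a
# candidate theta, and keep the candidate minimizing the q-th smallest
# squared residual over all n observations.  A sketch only; the paper's
# algorithm is more refined.
from itertools import combinations
import numpy as np
from scipy.optimize import linprog

def chebyshev_fit(X, y):
    """Minimax fit: minimize max_i |y_i - x_i @ theta| via an LP."""
    n, p = X.shape
    c = np.r_[np.zeros(p), 1.0]                # objective: minimize r
    A = np.r_[np.c_[ X, -np.ones(n)],          #  x_i @ theta - r <= y_i
              np.c_[-X, -np.ones(n)]]          # -x_i @ theta - r <= -y_i
    b = np.r_[y, -y]
    res = linprog(c, A_ub=A, b_ub=b, bounds=[(None, None)] * (p + 1))
    return res.x[:p]

def exact_lqs(X, y, q=None):
    n, p = X.shape
    if q is None:                              # LMS default: q = [n/2] + [(p+1)/2]
        q = n // 2 + (p + 1) // 2
    best_val, best_theta = np.inf, None
    for idx in combinations(range(n), p + 1):
        theta = chebyshev_fit(X[list(idx)], y[list(idx)])
        val = np.sort((y - X @ theta) ** 2)[q - 1]  # q-th smallest squared residual
        if val < best_val:
            best_val, best_theta = val, theta
    return best_theta, best_val
```

On a toy data set with nine points exactly on $y = 1 + 2x$ and one gross outlier, the LMS objective at the true line is zero, so this enumeration recovers the line exactly, whereas least squares is pulled toward the outlier.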
