Abstract

In this article we introduce a notion of depth in the regression setting. It provides the “rank” of any line (plane), rather than ranks of observations or residuals. In simple regression we can compute the depth of any line by a fast algorithm. For any bivariate dataset Z_n of size n there exists a line with depth at least n/3. The largest depth in Z_n can be used as a measure of linearity versus convexity. In both simple and multiple regression we introduce the deepest regression method, which generalizes the univariate median and is equivariant for monotone transformations of the response. Throughout, the errors may be skewed and heteroscedastic. We also consider depth-based regression quantiles. They estimate the quantiles of y given x, as do the Koenker-Bassett regression quantiles, but with the advantage of being robust to leverage outliers. We explore the analogies between depth in regression and in location, where Tukey's halfspace depth is a special case of our general definition. Also, Liu's simplicial depth can be extended to the regression framework.

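The abstract refers to a fast algorithm for computing the depth of any line in simple regression, but does not spell it out. As an illustration only, the following Python sketch (not taken from the paper) computes the regression depth of a candidate line y = a + b·x for a bivariate sample, using the characterization of depth as the smallest number of observations the line must pass when tilted to a vertical position about some point. The function name `regression_depth`, the O(n log n) split-point scan, and the handling of zero residuals and tied x-values are simplifying assumptions of this sketch, not the authors' implementation.

```python
import numpy as np

def regression_depth(a, b, x, y):
    """Regression depth of the candidate line y = a + b*x relative to the
    bivariate sample (x, y): the smallest number of observations the line
    must pass when tilted to a vertical position about some point.

    Simplified sketch (assumption): zero residuals count on both sides of
    the line, and split points are taken only between distinct x-values.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    order = np.argsort(x)
    xs = x[order]
    r = y[order] - (a + b * xs)          # residuals, sorted by x
    pos = (r >= 0).astype(int)           # observations on or above the line
    neg = (r <= 0).astype(int)           # observations on or below the line
    n = len(xs)
    cpos = np.concatenate(([0], np.cumsum(pos)))   # prefix counts of pos
    cneg = np.concatenate(([0], np.cumsum(neg)))   # prefix counts of neg
    depth = n
    for i in range(n + 1):               # candidate split between i-1 and i
        if 0 < i < n and xs[i - 1] == xs[i]:
            continue                     # keep tied x-values on one side
        left_pos, left_neg = cpos[i], cneg[i]
        right_pos, right_neg = cpos[n] - cpos[i], cneg[n] - cneg[i]
        # Tilting one way about the split passes the left points above the
        # line and the right points below it; tilting the other way passes
        # the complementary sets. The depth is the minimum over all splits.
        depth = min(depth, left_pos + right_neg, left_neg + right_pos)
    return depth

# Hypothetical usage: a line close to the bulk of the data should have a
# depth well above 0, while a poorly placed line has depth near 0.
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 50)
y = 2 + 3 * x + rng.normal(0, 1, 50)
print(regression_depth(2.0, 3.0, x, y))    # large depth for a good fit
print(regression_depth(50.0, -3.0, x, y))  # small depth for a bad fit
```

This brute-force scan over split points is only meant to make the depth definition concrete; the paper's result that some line always attains depth at least n/3 can be checked empirically by maximizing this function over candidate lines.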