Abstract

In their excellent elementary text Statistics, David Freedman and his coauthors [1] present the usual interpretation of the correlation of a bivariate data set as a measure of the vertical spread of the points about the least squares (regression) line. They then present examples to show that restricting the range of one variable (or of both variables) can reduce the absolute value of the correlation. In the examples presented, this phenomenon, sometimes called attenuation of correlation, is easily seen in scatterplots of the data: the points that remain after removing those outside the selected range appear to follow a weaker linear pattern than the original data set did. Freedman, Pisani, and Purves then pose the following problem [1, p. 154].
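The attenuation phenomenon described above can be illustrated with a small simulation. The sketch below is not from the paper: it assumes a bivariate normal population with correlation 0.8 and an arbitrary restriction of one variable to the upper half of its range, and the variable names and cutoff are illustrative only.

```python
import numpy as np

# Simulate a bivariate data set with population correlation 0.8
# (illustrative choice, not taken from the paper).
rng = np.random.default_rng(0)
n = 10_000
rho = 0.8
x = rng.standard_normal(n)
y = rho * x + np.sqrt(1 - rho**2) * rng.standard_normal(n)

# Sample correlation for the full data set.
r_full = np.corrcoef(x, y)[0, 1]

# Restrict the range of x to the upper half (x above its median);
# the surviving points typically show a weaker linear pattern.
keep = x > np.median(x)
r_restricted = np.corrcoef(x[keep], y[keep])[0, 1]

print(f"full r = {r_full:.3f}, restricted r = {r_restricted:.3f}")
```

On a typical run the restricted sample's correlation is noticeably smaller in absolute value than the full sample's, which is exactly the attenuation effect the abstract describes.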
