Abstract

Deary and Pagliari (1991) pointed out that Spearman had presented data showing the same pattern in correlations found by Detterman and Daniel (1989): correlations among tests are about twice as high for low-IQ subjects as for high-IQ subjects. Spearman suggested that this effect be called the “law of diminishing returns”. He thought that smarter people had more “g”. He also reasoned that, as in psychophysical effects, a constant increment would be less discriminable when added to a large base (high IQ) than to a small base (low IQ), and that these differences in discriminability caused the correlations to be lower in groups of high-IQ subjects. But Spearman was wrong on two counts. First, adding a constant to either or both members of pairs of observations has no effect on the resulting correlation, so size of base cannot produce differences in correlation. Second, if g is defined as whatever causes tests to be positively intercorrelated, then low-IQ subjects have more g than high-IQ subjects, because their test scores are more highly correlated. The amount of g, as Spearman interpreted it, thus represents stupidity, not intelligence. An alternative explanation for these effects is provided.
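The first counterargument rests on a standard property of the Pearson correlation: it is invariant under adding a constant to either variable. A minimal sketch, using made-up illustrative scores (not data from any of the cited studies):

```python
def pearson(xs, ys):
    """Plain Pearson product-moment correlation coefficient."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Hypothetical test scores for five subjects (illustration only).
scores_a = [90, 95, 100, 110, 120]
scores_b = [85, 98, 102, 108, 125]

r_original = pearson(scores_a, scores_b)
# Add a constant "base" increment to both variables, as in Spearman's
# psychophysical analogy; the correlation is unchanged.
r_shifted = pearson([a + 50 for a in scores_a],
                    [b + 50 for b in scores_b])

print(abs(r_original - r_shifted) < 1e-12)  # → True
```

Because centering removes any additive constant before the cross-products are computed, a larger "base" cannot by itself lower the correlations observed in high-IQ groups.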
