Abstract

It has been demonstrated that the worst performance rule (WPR) effect can occur as a result of statistical dependencies in the data. Here, we examine whether this might also be the case for Spearman's law of diminishing returns (SLODR). Two proposed SLODR criteria are the skewness of the estimated latent ability factor and the correlation between this latent ability and within-individual residual variance. Using four publicly available datasets, covering quite different dimensions of behavior, we show that both of these criteria are affected by the correlation between within-individual average performance and within-individual variance on the test scores. However, the influence of this correlation on the two criteria goes in opposite directions, which suggests that it might generally be difficult to obtain results that unambiguously support SLODR. These results might have far-reaching implications for the literature, to the extent that various research findings attributed to human cognitive functioning might in fact be due to trivial statistical dependencies in the data. This is an important issue to address in future research.

Highlights

  • In the present study, we investigate whether recent findings concerning the worst performance rule (WPR) generalize to Spearman's law of diminishing returns (SLODR).

  • The poor model fits are not necessarily a problem here, as the objective of the present study is to assess whether SLODR criteria are systematically influenced by the correlation between the within-individual mean (WIM) and the within-individual standard deviation (WISD), rather than to identify models that are good at predicting associations between performances.

  • The findings indicate that the skewness of latent ability scores and the correlation between these latent ability scores and within-individual residual variance tend to reflect the correlation between the participants' within-individual mean (WIM) and standard deviation (WISD) on the test scores (see the sketch after this list).
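The quantities referred to in these highlights can be illustrated with a short sketch. The following Python code is not the study's own analysis code: the toy data, the variable names, and the use of first-principal-component scores as a stand-in for a latent ability factor are illustrative assumptions. It computes the within-individual mean (WIM) and standard deviation (WISD) from trial-level scores, their correlation across participants, and the skewness of a simple ability estimate.

```python
import numpy as np
from scipy.stats import pearsonr, skew

rng = np.random.default_rng(0)
# Toy data: 500 participants, 40 trial-level scores each (purely illustrative).
scores = rng.normal(size=(500, 40))

wim = scores.mean(axis=1)            # within-individual mean (WIM)
wisd = scores.std(axis=1, ddof=1)    # within-individual standard deviation (WISD)

# Correlation between WIM and WISD across participants.
r_wim_wisd, _ = pearsonr(wim, wisd)
print(f"WIM-WISD correlation: {r_wim_wisd:.3f}")

# Crude ability proxy: participant scores on the first principal component of
# the centered participant-by-trial matrix (a stand-in for a latent ability factor).
centered = scores - scores.mean(axis=0)
u, s, vt = np.linalg.svd(centered, full_matrices=False)
ability = u[:, 0] * s[0]
if np.corrcoef(ability, wim)[0, 1] < 0:
    ability = -ability               # fix the arbitrary PC sign so higher means better performance

print(f"Skewness of ability proxy: {skew(ability):.3f}")
print(f"Ability-WISD correlation: {pearsonr(ability, wisd)[0]:.3f}")
```

In the study itself, ability is estimated with latent variable models rather than a principal component; the sketch is only meant to show which summary statistics the two SLODR criteria depend on.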



Introduction

We will investigate whether recent findings concerning the worst performance rule (WPR) generalize to Spearman's law of diminishing returns (SLODR). According to Spearman's (1927) law of diminishing returns, intelligence test scores tend to be more strongly g saturated and more highly intercorrelated among those with assumed low g than among those with higher g. This has been called the differentiation hypothesis (Garrett, 1946). Deary et al. (1996) divided Irish schoolchildren (N = 10,535) into high and low scorers based on their results on the Differential Aptitude Tests (DAT) verbal reasoning subtest and calculated the amount of variance in seven other subtests (numerical ability, abstract reasoning, clerical speed and accuracy, mechanical reasoning, space relations, spelling, and language usage) accounted for by a first unrotated principal component. If the division was based on space relations or numerical ability instead, the corresponding values were 46.2% and 40.9%, respectively, for high scorers and 52.9% and 46.2% for low scorers.
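As a rough illustration of the differentiation analysis just described, the following sketch splits a toy sample into low and high scorers on a reference subtest and compares the proportion of variance in the remaining subtests accounted for by a first unrotated principal component in each group. The simulated data, the function names, and the median split are assumptions made for illustration; Deary et al. (1996) worked with the actual DAT subtests and their own grouping procedure.

```python
import numpy as np

def pc1_variance_share(X):
    """Proportion of total variance in the columns of X accounted for by the
    first unrotated principal component (largest eigenvalue of the correlation
    matrix divided by the number of variables)."""
    corr = np.corrcoef(X, rowvar=False)
    eigvals = np.linalg.eigvalsh(corr)   # eigenvalues in ascending order
    return eigvals[-1] / corr.shape[0]

def differentiation_split(subtests, reference):
    """Median-split participants into low and high scorers on a reference
    subtest and return the PC1 variance share of the remaining subtests in
    each group (SLODR predicts a larger share in the low-scoring group)."""
    cutoff = np.median(reference)
    low, high = reference <= cutoff, reference > cutoff
    return pc1_variance_share(subtests[low]), pc1_variance_share(subtests[high])

# Toy example: eight correlated "subtests"; column 0 plays the role of the
# reference subtest (e.g., verbal reasoning), columns 1-7 the remaining ones.
rng = np.random.default_rng(1)
g = rng.normal(size=(2000, 1))                     # common factor
data = 0.7 * g + 0.7 * rng.normal(size=(2000, 8))  # arbitrary loadings and noise
low_share, high_share = differentiation_split(data[:, 1:], data[:, 0])
print(f"PC1 variance share, low scorers: {low_share:.3f}; high scorers: {high_share:.3f}")
```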

