Abstract

Past research has consistently shown that tests measuring specific cognitive abilities provide little, if any, incremental validity over tests of general mental ability when predicting job performance. In this study, we suggest that the apparent lack of incremental validity may be due to the type of content that has traditionally been assessed. We therefore hypothesised that incremental validity can be obtained using specific cognitive abilities that are less highly correlated with g and are matched to the tasks performed on the job. To test this, we examined a recently developed performance‐based measure that assesses a number of cognitive abilities related to training performance. In a sample of 310 US Navy student pilots, performance‐based scores added sizeable incremental validity to a measure of g: the significant increases in R² ranged from .08 to .10 across criteria. Similar results were obtained after correcting correlations for range restriction, though the magnitude of incremental validity was slightly smaller (ΔR² ranged from .05 to .07).
