Abstract
Cognitive task analysis (CTA) is enjoying growing popularity in both research and practice as a foundational element of instructional design. However, relatively little controlled research has examined its value as a foundation for training. Furthermore, highly individualized approaches to conducting CTA prevent broadly generalizable conclusions from being drawn from the findings of individual studies. Examining the magnitude of observed effects across studies from various domains and CTA practitioners is therefore essential for assessing replicable effects. This study reports the findings of a meta-analysis that examines the overall effectiveness of CTA-based instruction, across practitioners and settings, relative to other means of identifying and representing instructional content. Overall, the effect of CTA-based instruction is large (Hedges’s g = 0.871). However, effect sizes vary substantially by both the CTA method used and the training context. Although the analysis is limited by a relatively small number of studies, the notable effect size indicates that the information elicited through CTA provides a strong basis for highly effective instruction.
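For readers unfamiliar with the statistic, the following is a brief sketch of how Hedges’s g is conventionally defined; the notation and symbols here are illustrative and are not drawn from the study itself. It is the standardized mean difference between treatment and control groups, multiplied by a small-sample correction factor:

\[
g \;=\; J \cdot \frac{\bar{X}_T - \bar{X}_C}{s_p},
\qquad
s_p \;=\; \sqrt{\frac{(n_T - 1)\,s_T^2 + (n_C - 1)\,s_C^2}{n_T + n_C - 2}},
\qquad
J \;\approx\; 1 - \frac{3}{4(n_T + n_C - 2) - 1},
\]

where \(\bar{X}_T, \bar{X}_C\) are the group means, \(s_T^2, s_C^2\) the group variances, and \(n_T, n_C\) the group sample sizes. A value of g = 0.871 therefore indicates that, on average, CTA-based instruction outperformed the comparison conditions by nearly nine-tenths of a pooled standard deviation.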