Abstract

One aspect usually overlooked in studies evaluating the efficacy of educational interventions is the extent to which posttest responses reflect actual changes in targeted latent constructs versus a more superficial shift in participants’ interpretations of assessment items after exposure to the intervention (i.e., response shift). We conducted a Monte Carlo simulation study to examine the impact of response shift on treatment effects. Our results indicate that treatment effects estimated with composite scores and latent variable models may be severely overestimated under certain conditions. We describe the use of a measurement invariance framework across groups and across time to mitigate response-shift bias. We also discuss the need to consider response shift as a first step in the evaluation of interventions.
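As a minimal illustration of the mechanism the abstract describes (not the authors' actual simulation design), the sketch below generates pretest and posttest item responses from a single-factor model in which the loadings and intercepts shift after the intervention while the latent trait itself is held fixed, so the true treatment effect is zero. The nonzero composite-score "gain" it reports is then pure response-shift bias. All parameter values here (six items, loadings shifting from 0.7 to 0.9, an intercept shift of 0.3) are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 1000  # participants in the treatment group
k = 6     # items on the assessment

# Pretest measurement model: how items relate to the latent construct.
loadings_pre = np.full(k, 0.7)
intercepts_pre = np.zeros(k)

# Response shift: after the intervention, participants reinterpret the
# items, modeled here as changed loadings and intercepts at posttest
# while the latent construct itself does NOT change (true effect = 0).
loadings_post = np.full(k, 0.9)
intercepts_post = np.full(k, 0.3)

# Latent trait, identical at both measurement occasions.
theta = rng.normal(0.0, 1.0, n)

def items(theta, loadings, intercepts, error_sd=0.5):
    """Generate item responses from a single-factor model."""
    eps = rng.normal(0.0, error_sd, (len(theta), len(loadings)))
    return intercepts + np.outer(theta, loadings) + eps

pre = items(theta, loadings_pre, intercepts_pre)
post = items(theta, loadings_post, intercepts_post)

# Composite-score "treatment effect": mean posttest minus pretest sum score.
gain = post.sum(axis=1).mean() - pre.sum(axis=1).mean()
print(f"Apparent gain from composite scores: {gain:.2f} (true effect: 0)")
```

Because the intercept shift alone adds roughly 0.3 to each of the six items, the composite score shows an apparent gain near 1.8 despite no change in the latent construct, which is the kind of overestimation the measurement invariance checks described in the abstract are meant to detect.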
