Abstract

The National Board of Medical Examiners is currently developing the Assessment of Professional Behaviors, a multisource feedback (MSF) tool intended for formative use with medical students and residents. This study investigated whether missing responses on this tool can be considered random; evidence that missing values are not random would suggest response bias, a significant threat to score validity. Correlational analyses of pilot data (N = 2,149) investigated whether missing values were systematically related to global evaluations of observees. The percentage of missing items was correlated with global evaluations of observees; observers answered more items for preferred observees compared with nonpreferred observees. Missing responses on this MSF tool seem to be nonrandom and are instead systematically related to global perceptions of observees. Further research is needed to determine whether modifications to the items, the instructions, or other components of the assessment process can reduce this effect.
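As a rough illustration of the kind of correlational analysis described above, the sketch below (Python, with hypothetical file and column names that are not the actual NBME variable names) correlates each rating form's percentage of missing items with the observer's global evaluation of the observee; the actual pilot analysis may have differed in detail.

```python
import pandas as pd
from scipy import stats

# Hypothetical pilot data: one row per observer-observee rating form.
# "pct_missing" and "global_rating" are illustrative column names only.
df = pd.read_csv("apb_pilot_ratings.csv")  # hypothetical file

# Correlate the percentage of unanswered items on each form with the
# observer's global evaluation of the observee.
r, p = stats.pearsonr(df["pct_missing"], df["global_rating"])
print(f"Pearson r = {r:.2f}, p = {p:.3g}")

# A negative r would indicate that observers answer more items
# (i.e., leave fewer missing responses) for observees they rate favorably.
```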
