Abstract

Objective: The purposes of this study were to describe the questionnaire development process for evaluating elements of an evidence-based practice (EBP) curriculum in a chiropractic program and to report initial reliability and validity testing of the EBP knowledge examination component of the questionnaire.

Methods: The EBP knowledge test was evaluated with students enrolled in the doctor of chiropractic program at the University of Western States. The initial version was tested with a sample of 374 students and a revised version with a sample of 196 students. Item performance and reliability were assessed using item difficulty, item discrimination, and internal consistency. An expert panel assessed face and content validity.

Results: The first version of the knowledge examination demonstrated low internal consistency (Kuder-Richardson 20 = 0.55), and a few items had poor item difficulty and discrimination. Consequently, the number of items was expanded from 20 to 40, and the poorly performing items from the initial version were revised. The Kuder-Richardson 20 of the second version was 0.68; 32 items had item difficulties between 0.20 and 0.80, and 26 items had item discrimination values of 0.20 or greater.

Conclusions: A questionnaire for evaluating a revised EBP-integrated curriculum was developed and evaluated. Psychometric testing of the EBP knowledge component provided initial evidence of acceptable reliability and validity.
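The item statistics reported above follow standard classical test theory definitions: item difficulty is the proportion of examinees answering an item correctly, item discrimination can be estimated as the corrected item-total correlation, and Kuder-Richardson 20 (KR-20) is the internal-consistency coefficient for dichotomously scored items. As an illustrative sketch (not the authors' actual analysis code), these can be computed from a 0/1 response matrix as follows:

```python
import numpy as np

def item_difficulty(responses):
    """Proportion correct per item; responses is (n_students, n_items) of 0/1."""
    return np.asarray(responses, dtype=float).mean(axis=0)

def item_discrimination(responses):
    """Corrected item-total correlation: each item vs. total score
    excluding that item (point-biserial)."""
    r = np.asarray(responses, dtype=float)
    total = r.sum(axis=1)
    return np.array([
        np.corrcoef(r[:, i], total - r[:, i])[0, 1]
        for i in range(r.shape[1])
    ])

def kr20(responses):
    """Kuder-Richardson 20: (k/(k-1)) * (1 - sum(p*q) / var(total))."""
    r = np.asarray(responses, dtype=float)
    k = r.shape[1]
    p = r.mean(axis=0)          # item difficulties
    q = 1.0 - p
    var_total = r.sum(axis=1).var(ddof=1)  # sample variance of total scores
    return (k / (k - 1)) * (1.0 - (p * q).sum() / var_total)
```

Items with difficulties between 0.20 and 0.80 and discrimination of 0.20 or greater, the thresholds used in the study, are generally considered acceptable in classical item analysis.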
