Abstract

Five years of longitudinal data from general chemistry student assessments at the University of Georgia have been analyzed using item response theory (IRT). Our analysis indicates that minor changes in the wording of exam questions can produce significant differences in student performance. The analysis encompasses data from over 6100 students, yielding very small statistical uncertainties. IRT provided new insight into student performance on our assessments, insight that is also relevant to the broader chemical education community. In this paper, IRT, used in conjunction with computerized testing, shows how nuances in question wording affect student performance on assessments.
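The abstract does not specify which IRT model the authors fit, but a minimal sketch of the general idea is shown below, assuming the common two-parameter logistic (2PL) form. The item parameters (discrimination a, difficulties b_original and b_reworded) are hypothetical illustrative values, not results from the study; the sketch only shows how a small shift in an item's difficulty parameter, such as one caused by a change in wording, changes the predicted probability of a correct response across student ability levels.

```python
import numpy as np

def p_correct(theta, a, b):
    """2PL item response function: probability that a student with
    ability theta answers correctly, given discrimination a and
    difficulty b."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

# Hypothetical parameters for two wordings of the "same" question.
# (Illustrative values only, not taken from the study.)
a = 1.2                            # discrimination, assumed equal for both wordings
b_original, b_reworded = 0.0, 0.4  # reworded version assumed slightly harder

# Compare predicted performance across a range of ability levels.
for theta in np.linspace(-3, 3, 7):
    p1 = p_correct(theta, a, b_original)
    p2 = p_correct(theta, a, b_reworded)
    print(f"theta={theta:+.1f}  P(original)={p1:.2f}  P(reworded)={p2:.2f}")
```

In practice the item parameters would be estimated from the student response data rather than assumed, but the comparison above illustrates why even a modest difficulty shift between wordings is visible in aggregate performance.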
