Abstract

Short answer questions (SAQs) and similarly structured examination formats are employed as evaluation tools to gauge the competency of medical students at various levels of study. However, item analyses of these questions have rarely been conducted. Evaluating the quality of examination questions through item analysis ensures that stakeholders, especially those learning through the various flexible pathways in medical education, are provided with reliable and relevant assessments, promoting effective learning and competency development. In this study, we performed item analyses on SAQs by extracting the passing index (PI) and discrimination index (DI) for each sub-question. Data were analysed using Microsoft Excel (Microsoft Corporation, United States) and Jeffreys's Amazing Statistics Program (JASP) (University of Amsterdam, The Netherlands). Twenty-seven sub-questions from five SAQs were analysed. The DI of the sub-questions ranged from 0.043 to 0.935 with a mean of 0.449 ± 0.223, while the PI ranged from 0.012 to 0.971 with a mean of 0.597 ± 0.246. In conclusion, the SAQs administered during a professional examination of preclinical medical students exhibited an acceptable range and mix of PI and DI values. However, improvements must be made to the sub-questions that returned poor PI and DI values.
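The abstract does not state the exact formulas used, but the two indices it reports are conventional in item analysis. A minimal sketch, assuming PI is the proportion of available marks earned on a sub-question and DI is the classic upper-versus-lower group contrast (top and bottom 27% of candidates by score), could look like this; the function name and parameters are illustrative, not taken from the study:

```python
def item_indices(scores, max_mark, group_fraction=0.27):
    """Compute (PI, DI) for one SAQ sub-question.

    scores         -- list of marks earned by each student on this sub-question
    max_mark       -- maximum mark available for the sub-question
    group_fraction -- share of candidates in each contrast group (27% is a
                      common convention; an assumption here, not from the paper)
    """
    ranked = sorted(scores, reverse=True)
    n = max(1, round(len(ranked) * group_fraction))
    upper, lower = ranked[:n], ranked[-n:]

    # Passing/difficulty index: fraction of total available marks earned.
    pi = sum(scores) / (len(scores) * max_mark)

    # Discrimination index: how much better the top group scored than the
    # bottom group, normalised to the marks available to one group.
    di = (sum(upper) - sum(lower)) / (n * max_mark)
    return pi, di
```

For example, for a 10-mark sub-question with scores [10, 8, 6, 4, 2, 0], this yields PI = 0.5 and DI = 0.8 with a one-third group fraction: a moderately difficult item that discriminates well.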
