Abstract

The ability to assess students’ content knowledge and make meaningful comparisons of student performance is an important component of instruction. ACS exams have long served as tools for standardized assessment of students’ chemistry knowledge. Because these exams are designed by committees of practitioners to cover a breadth of topics in the curriculum, they may contain items on material an individual instructor does not cover during classroom instruction and therefore chooses not to assess. To make meaningful comparisons between their students and the national norm sample, instructors need norms generated from the subset of items they actually administered. The goal of this project was to investigate the stability of norms when items were removed from ACS General Chemistry Exams. This was achieved by monitoring the average change in students’ percentile rankings as items were removed from the exam and noting when that average change crossed a specified threshold. An exploration of subset norm stability for three commonly used ACS General Chemistry Exams is presented, along with implications for research and instruction.
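The stability check described above can be sketched in code. This is a minimal illustration, not the authors’ actual procedure: the function names, the scoring scheme (dichotomous 0/1 items), the percentile definition (percent of norm-sample totals at or below a score), and the simulated response data are all assumptions made for demonstration; the study itself would use the real ACS norm sample and its published norming conventions.

```python
import numpy as np

def percentile_ranks(total_scores):
    """Percentile rank of each student: percent of norm-sample totals
    at or below that student's total score (an assumed convention)."""
    total_scores = np.asarray(total_scores)
    return np.array([np.mean(total_scores <= s) * 100 for s in total_scores])

def mean_percentile_shift(responses, dropped_items):
    """Average absolute change in percentile rank when the listed items
    are removed and subset norms are recomputed.

    responses: 2-D array (students x items) of dichotomously scored (0/1) answers.
    dropped_items: indices of items the instructor did not assess.
    """
    full = percentile_ranks(responses.sum(axis=1))
    kept = [j for j in range(responses.shape[1]) if j not in set(dropped_items)]
    subset = percentile_ranks(responses[:, kept].sum(axis=1))
    return float(np.mean(np.abs(subset - full)))

# Hypothetical norm sample: 200 students, 70 items, simulated responses.
rng = np.random.default_rng(0)
responses = (rng.random((200, 70)) < 0.6).astype(int)

shift = mean_percentile_shift(responses, dropped_items=[0, 5, 12])
THRESHOLD = 1.0  # an illustrative cutoff, not the study's actual threshold
print(f"average percentile shift: {shift:.2f}; stable: {shift < THRESHOLD}")
```

Repeating this computation while progressively removing more items, and noting when the average shift first exceeds the chosen threshold, mirrors the monitoring procedure the abstract describes.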
