Abstract

As part of an ongoing investigation of students’ learning in first-semester upper-division quantum mechanics, we needed a high-quality conceptual assessment instrument for comparing outcomes of different curricular approaches. The process of developing such a tool started with converting a preliminary version of a 14-item open-ended quantum mechanics assessment tool (QMAT) to a multiple-choice (MC) format. Further question refinement, development of effective distractors, addition of new questions, and robust statistical analysis has led to a 31-item quantum mechanics concept assessment (QMCA) test. The QMCA is used as a post-test only to assess students’ knowledge of five main topics: quantum measurement, the time-independent Schrödinger equation, wave functions and boundary conditions, time evolution, and probability density. During two years of testing and refinement, the QMCA has been given in alpha (N=61) and beta (N=263) versions to students in upper-division quantum mechanics courses at 11 different institutions, with an average post-test score of 54%. By allowing for comparisons of student learning across different populations and institutions, the QMCA provides instructors and researchers a more standard measure of the effectiveness of different curricula or teaching strategies on student conceptual understanding of quantum mechanics. In this paper, we discuss the construction of effective distractors and the use of student interviews and expert feedback to revise and validate both questions and distractors. We include the results of common statistical tests of reliability and validity, which suggest the instrument is presently in a stable, usable, and promising form.

Received 22 September 2014
DOI: https://doi.org/10.1103/PhysRevSTPER.11.010110
Published by the American Physical Society under the terms of the Creative Commons Attribution 3.0 License. Further distribution of this work must maintain attribution to the author(s) and the published article’s title, journal citation, and DOI.

Highlights

  • Investigations of student learning in introductory physics over the last several decades have helped us better assess student conceptual understanding and identify and address common difficulties in learning physics [1]

  • Ongoing development of many curricula and classroom interventions was driven in part by data collected from research-based assessment instruments, such as the Force Concept Inventory (FCI) [2], the Force and Motion Conceptual Evaluation (FMCE) [3], the Conceptual Survey of Electricity and Magnetism (CSEM) [4], and the Brief Electricity and Magnetism Assessment (BEMA) [5]

  • In order to address these scalability and usability issues, we set out to build on the existing quantum mechanics assessment tool (QMAT) and craft a quantum mechanics concept assessment (QMCA), a multiple-choice (MC) tool that could be graded more easily and objectively and that has the potential to be used by a wide range of faculty to provide a meaningful measure of students’ performance on conceptual questions in upper-division quantum mechanics

Summary

INTRODUCTION

The QMAT was intended to assess a subset of learning goals identified for first-semester upper-division undergraduate quantum mechanics courses by faculty who commonly teach these courses [18]. Many questions in this test were motivated by previous research on student difficulties. Although the QMAT incorporates research findings on student difficulties in advanced undergraduate QM and pays attention to the alignment of the assessment tool with course learning objectives and instructional design [1], it suffers from a complicated and unreliable scoring rubric, with correspondingly limited validation studies. In order to address these scalability and usability issues, we set out to build on the existing QMAT and craft a quantum mechanics concept assessment (QMCA), a multiple-choice (MC) tool that could be graded more easily and objectively and that has the potential to be used by a wide range of faculty to provide a meaningful measure of students’ performance on conceptual questions in upper-division quantum mechanics. We conclude with some interpretation of results and potential uses of this instrument.

RESEARCH AND DEVELOPMENT METHODOLOGY
Development of item distractors
ALPHA STUDY
Example 1
Example 2
Example 3
Example 4
BETA STUDY
Data collection and results
Results for five concept framework or subtopics
TEST VALIDITY AND RELIABILITY
Expert validation
Student validation
Statistical analysis
Item difficulty index
Item discrimination
Ferguson’s delta
Kuder-Richardson reliability index
SUMMARY AND DISCUSSION
Uses of QMCA
Findings
Future work
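The quantities listed under "Statistical analysis" above (item difficulty index, item discrimination, Ferguson's delta, and the Kuder-Richardson reliability index) are standard classical test theory statistics. The sketch below shows one common way to compute them from a binary-scored response matrix; it is not taken from the paper, and the top/bottom 27% grouping used for discrimination and the simulated score matrix are illustrative assumptions only.

```python
import numpy as np

def item_difficulty(responses):
    """Difficulty index P for each item: fraction of students answering it correctly.
    `responses` is a (students x items) array of 0/1 scores."""
    return responses.mean(axis=0)

def item_discrimination(responses, frac=0.27):
    """Discrimination index D for each item: difference in correct-answer rate
    between the top and bottom scoring groups (top/bottom 27% by total score here)."""
    totals = responses.sum(axis=1)
    order = np.argsort(totals)
    n = max(1, int(frac * len(totals)))
    low, high = responses[order[:n]], responses[order[-n:]]
    return high.mean(axis=0) - low.mean(axis=0)

def fergusons_delta(responses):
    """Ferguson's delta: whole-test discriminatory power, based on how broadly
    total scores are spread over the possible range 0..K."""
    n, k = responses.shape
    freqs = np.bincount(responses.sum(axis=1).astype(int), minlength=k + 1)
    return (n**2 - np.sum(freqs**2)) / (n**2 - n**2 / (k + 1))

def kr20(responses):
    """Kuder-Richardson 20 reliability index for dichotomously scored items."""
    n, k = responses.shape
    p = responses.mean(axis=0)
    total_var = responses.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - np.sum(p * (1 - p)) / total_var)

# Illustrative only: simulated 0/1 scores for 263 students on 31 items,
# roughly matching the beta-study size and 54% average score reported above.
rng = np.random.default_rng(0)
scores = (rng.random((263, 31)) < 0.54).astype(int)
print(item_difficulty(scores).round(2))
print(item_discrimination(scores).round(2))
print(round(fergusons_delta(scores), 2), round(kr20(scores), 2))
```

In practice the same functions would be applied to an actual response matrix from the QMCA rather than simulated data; the simulated scores here will give reliability values near zero because the items are uncorrelated by construction.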