Welcome to the Summer 2016 issue of Educational Measurement: Issues and Practice. In this issue, you will find some light and not-so-light summer reading: two substantive articles, an instructional module on a hot topic (simulation studies), and a new feature, the Gallery of Winners, which showcases the top 10 graphics and visuals submitted for NCME's 2016 competition. Many of you attending the NCME breakfast this past April in Washington, DC, may remember seeing some of the winning entries.

The first article is titled “Validating English Language Proficiency Assessment Uses for English Learners: Academic Language Proficiency and Content Assessment Performance,” written by Mikyung Kim Wolf and Molly Faulkner-Bond of the Educational Testing Service. As these researchers note, many states use standards-based English language proficiency test scores to support high-stakes decisions for students with limited English proficiency. Using a series of hierarchical linear models, this work unpacks the complex relationships among social language use, academic language abilities, and performance on English language proficiency tests of the type used in states’ accountability systems. This work stresses the importance of looking more deeply at the constructs underlying most English language proficiency tests, particularly when these test scores have implications for policy and practice.

“It was the best of tests, it was the worst of tests, it reflected a policy of wisdom, it reflected a policy of foolishness.” This is the opening line of our second article, “A Tale of Two Tests (and of Two Examinees)” by Amanda L. Clauser and her colleague at the National Board of Medical Examiners, Howard Wainer. This article, I suspect, will be of interest to both educational measurement specialists and policymakers. Clauser and Wainer show how making high-stakes decisions on the basis of multiple measures, a common practice in U.S. schools, can lead to an increased likelihood of falsely rejecting otherwise qualified examinees. To take a stylized example of the arithmetic at work: if a qualified examinee must pass each of five independent tests, and each test carries a 2% chance of a false negative, the chance of at least one false rejection is 1 − 0.98^5, or nearly 10%. They argue persuasively for thinking more about the tradeoff between false positive and false negative decisions or classifications. It is a good read.

To further our commitment to improving the teaching and learning of methods and statistical practices in educational measurement, we have a module in our Instructional Topics in Educational Measurement Series (ITEMS) titled “Conducting Simulation Studies in Psychometrics,” developed and written by colleagues from the National Board of Medical Examiners, Richard A. Feinberg and Jonathan D. Rubright. Simulation studies are increasingly being carried out on a host of psychometric issues, including new methods for differential item functioning (DIF) analysis, equating studies, factor analysis, and subscore validity studies. Using R code that is made available to readers, Feinberg and Rubright illustrate, through a worked example, how to design and carry out a simulation study. This is must reading for quantitative (i.e., psychometric) researchers, because Monte Carlo simulation studies have become a powerful tool for testing model-based assumptions and parameter estimates in something of a controlled environment. My thanks to our two colleagues for taking on the development of this instructional module.
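For readers who want a quick feel for what such a study involves before opening the module, here is a minimal sketch of the generate-estimate-summarize cycle that simulation studies follow. To be clear, this is my own toy illustration in base R, not code from Feinberg and Rubright's module, and the deliberately crude difficulty estimator (the negative logit of each item's proportion correct) is purely for demonstration:

# A minimal Monte Carlo sketch in base R (illustrative only, not the
# module's code): check how well a crude estimator recovers known
# Rasch item difficulties across repeated simulated data sets.
set.seed(2016)

n_reps    <- 100   # simulation replications
n_persons <- 500   # examinees per replication
n_items   <- 20    # items per replication
true_b    <- seq(-2, 2, length.out = n_items)  # known "true" difficulties

estimates <- matrix(NA_real_, nrow = n_reps, ncol = n_items)

for (r in seq_len(n_reps)) {
  theta <- rnorm(n_persons)                   # abilities drawn from N(0, 1)
  p <- plogis(outer(theta, true_b, "-"))      # Rasch: P(X = 1) = logistic(theta - b)
  x <- matrix(rbinom(length(p), 1, p), nrow = n_persons)  # simulated responses
  # Crude difficulty estimate: negative logit of each item's proportion correct
  estimates[r, ] <- -qlogis(colMeans(x))
}

# Summarize parameter recovery across replications
bias <- colMeans(estimates) - true_b
rmse <- sqrt(colMeans(sweep(estimates, 2, true_b)^2))
round(data.frame(true_b, bias, rmse), 3)

Even this toy example shows the method's value: across replications, the crude estimates are systematically attenuated toward zero relative to the true difficulties, which is exactly the kind of bias a well-designed simulation study is built to expose.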
In addition to these two articles and the instructional module, this issue offers summer readers a bit of lighter fare: a close-up look at the winners of NCME's Spring 2016 Visuals competition. Our colleague Katherine Furgol Castellano did a fantastic job organizing the competition, rounding up judges, notifying the winners, and making the entire competition fun and informative for our NCME friends and colleagues. Our heartfelt thanks go to Katherine.

As we go to press, the annual NCME meeting in Washington, DC, has come to a close. This year's theme, Foundations and Frontiers: Advancing Educational Measurement for Research, Policy, and Practice, offered us a particularly timely framework for the meeting. Kudos to Andrew Ho and Matt Johnson for all their hard work as the Annual Meeting Co-Chairs. They made a very challenging task look easy. My colleagues on the editorial team and I will be spending more than a few afternoons this spring and summer combing through the NCME program and the affiliated online paper repository looking for papers to publish. In addition, as always, we welcome your suggestions for commentaries, special issue topics, and instructional modules, and we look forward to hearing from all of you. Enjoy the summer!