This issue's cover features Getting to Responses by Ruth A. Childs and Susan Elgie from the Ontario Institute for Studies in Education, one of the two winning submissions to the 2015 EM:IP Cover Graphic/Data Visualization competition. The figure is a graphical depiction of a student's item response times on a computer-based test. Getting to Responses was developed from a study investigating how fifth-grade students deal with uncertainty as they respond to multiple-choice test items. This example shows the progress of one Grade 5 student through a computer-administered 20-item multiple-choice test based on a short video. Five of the 20 items were unanswerable from information in the video: three addressed content covered in the video but did not include a correct option, and two addressed content not covered at all. Screen-capture software was used to track student cursor movements. The study focused in particular on students' response behavior for the unanswerable items, which were designed to invoke uncertainty. After completing the test, examinees participated in a think-aloud session and brief interview with a researcher. Childs and Elgie developed the figure to facilitate identification of patterns of correct and incorrect responses for each examinee and comparison among examinees. Using a set of these figures, the researchers could identify patterns of nonresponse, item-changing, and item-reviewing behaviors, as well as items on which an examinee took a long time to respond (long horizontal segments) and times when an examinee was scrolling instead of looking at a specific item (slanted segments). The analysis also included excerpts from the think-aloud sessions; using NVivo software, the transcribed texts were aligned with the screen-capture video.
Initial findings from these mixed-methods analyses were presented at the 2015 conferences of the American Educational Research Association (Childs, Elgie, Tang, & Ferguson, 2015a) and the Canadian Society for the Study of Education (Childs, Ferguson, Tang, & Elgie, 2015b). The figure can easily be created using standard software. Specifically, this figure was created in Excel from a spreadsheet containing six columns: the cumulative time to each action (responding or scrolling to another item), the number of the item involved in the action, and the action itself (leaving the item without responding, responding correctly, responding incorrectly, or responding to an unanswerable item). The symbols for each action were overlaid on the chart using five "Marked Scatter" series, each based on the first column in combination with one of the other columns.

Childs and Elgie's Getting to Responses is a relatively simple yet informative graphic about a student's response time by item. Although the type of graph (a step graph relating item ID to elapsed time) is not unique, the figure's incorporation of different symbols and line segments to tell a story makes it original. Indeed, this submission to the EM:IP Cover Graphic competition received the highest scores on the Originality criterion. The figure may also strike viewers as unique in its orientation, or choice of axes. The axes are switched from how we are probably accustomed to seeing them: item ID is typically plotted on the x axis and elapsed time on the y axis. Perhaps Childs and Elgie set up the graph as they did because this computer-based test allowed examinees to return to previously presented items. Thus, if elapsed time were on the y axis, the stepwise function would not always be increasing; it would involve "backwards" steps whenever a student returned to a previous item.
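For readers who prefer a scriptable route, the same data layout can be sketched in a few lines of Python rather than Excel. This is only an illustrative sketch under assumptions: the sample rows, action labels, and function names below are invented for illustration, not taken from the study's data.

```python
# Sketch of the data behind the figure: one row per action, recording the
# cumulative elapsed time, the item acted on, and the action type.
# (Toy data and labels; not from the Childs & Elgie study.)
ACTIONS = ("no_response", "correct", "incorrect", "unanswerable", "scroll")

log = [
    (12, 1, "correct"),
    (30, 2, "incorrect"),
    (41, 3, "no_response"),
    (55, 4, "unanswerable"),
    (70, 3, "correct"),  # returning to a previously presented item
]

def step_path(log):
    """Step-graph coordinates with elapsed time on x and item number on y:
    a horizontal segment while the student stays on an item, then a jump
    to the next item acted on."""
    path = [(0, log[0][1])]            # assume the student starts on the first logged item
    for t, item, _ in log:
        path.append((t, path[-1][1]))  # dwell on the current item until time t
        path.append((t, item))         # then move to the acted-on item
    return path

def action_series(log):
    """One scatter series per action type, analogous to the overlaid
    'Marked Scatter' series in the Excel chart."""
    return {a: [(t, item) for t, item, act in log if act == a] for a in ACTIONS}
```

The coordinate pairs from `step_path` could then be drawn with any charting tool, with the marker series from `action_series` overlaid to flag correct, incorrect, skipped, and unanswerable-item responses.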
However, by putting elapsed time on the x axis, Getting to Responses shows a decreasing trend when students respond to items in the order they are presented. Such a trend seems less natural, even counterintuitive, when considering elapsed time as a variable, and although this feature adds to the uniqueness of the figure, it may detract from its interpretability. The clear labels for the axes and plotting symbols help Getting to Responses stand alone and tell its own story (another criterion in the competition). But the inclusion of "responses to unanswerable items" may need further explanation and context, as we typically strive to exclude unanswerable items from standardized student achievement assessments; I, at least, was unsure on first viewing the figure what was meant by such item types. In addition, a large part of the story we may want to tell with a graphic like the one pictured on the cover is how different types of students compare in their responses and response-time trajectories. A more compelling version of this figure might therefore show trajectories for two different students, such as a high- and a low-performing student, overlaid on the same graph or side by side. What do you think? Please send us your feedback by emailing derek.briggs@colorado.edu.