Nearly a decade ago, we published Pathways to Scientific Teaching (Ebert-May D and Hodder J [Eds]. 2008. Sunderland, MA: Sinauer Associates), a book composed of a series of articles that first appeared in Frontiers, written to help faculty, postdocs, and graduate students develop a vision of what an active learning classroom looks and sounds like, and of how it facilitates student learning. In the book, we provided instructional pathways to improve learning in science and to engage students in scientific practices, so that they could gain a deeper understanding of the core ideas in ecology. Each lesson was based on theories of how people learn and how scientists think about and conduct research. Key to each lesson were the learning objectives for students and the assessments that illustrated the degree to which they had acquired knowledge.

If ecology instructors shift to more engaging pedagogies, then they must clearly articulate the expected outcomes for their students. So what does it mean when instructors say their students are able to solve problems and think critically, yet the preponderance of assessments (eg exams and quizzes) used in so many courses ask students only to recall information? We argue that the key stepping stone to becoming a reflective teacher is to focus on course assessments. If the final grade in a course is based primarily on assessments, then the subject matter of the course's exam questions is what students will learn. In other words, “If you don't assess what's important, what's assessed becomes important” (L Resnick, pers comm). Even the best-designed instructional materials are effective only if the instructor reflects on student outcomes and on what the assessments reveal about student thinking. Analyses, interpretations, and improved data collection build confidence in the reliability and accuracy of ecological research.
Similarly, how can we, as teachers, be confident about the effectiveness of our teaching if we don't self-critique the exams we write, the assignments we develop, or the questions we ask our students? For both instructors and students, assessment is the arbiter of each lesson, providing evidence of the alignment between learning objectives and the successful attainment of those objectives through instruction. Key to writing assessments is determining what we want students to know and be able to do. Do we want students to think critically at some level? To analyze and interpret data? To predict outcomes of a novel scenario? To make conceptual connections among organisms and their environment? If so, how do we know whether our students have attained these goals? Writing assessments that measure “knowing” by “doing” should provide both faculty and students with insight into their learning.

At Michigan State University, we designed a tool to help faculty characterize their current assessments and redesign or write new ones that reliably assess students' learning in the practices and content of science. The tool, the Three-dimensional Learning Assessment Protocol (3D-LAP; Laverty et al. 2016; PLoS ONE 11: e0162333), characterizes scientific practices, core ideas, and crosscutting concepts in the context of an assessment item. Among its many features, the 3D-LAP provides a framework for instructors to reflect upon their assessments and improve the quality of individual items or clusters of items. Teachers at all levels of experience can use this tool to self-evaluate and better articulate the concepts and practices they are assessing. Who will be the pioneering teachers to inform instruction with assessment data? The onus cannot fall solely on the shoulders of early-career instructors or of those who are well established.
Rather, intergenerational teams of graduate students, postdocs, and faculty can form robust learning communities inside and outside the classroom, and can make instructional and curricular decisions based on assessment data. Departmental culture and values will influence the support these intergenerational teams receive as they pursue complex, long-term educational efforts. In science, progress is built along the same lines: ecologists at all career stages work collaboratively, complementing one another's skill sets and resources, and adapting to the requirements and opportunities available at their respective institutions.

We have made progress since Pathways to Scientific Teaching was published. The national call to improve undergraduate STEM education is spreading among instructors at all levels of experience. We contend that at the crux of this transformation is instructors' informed self-critique of their assessments. With current tools, teams of instructors can effectively align learning objectives with assessments that authentically measure student performance. Ideally, this is a win–win for students, ecologists, and the discipline as a whole.

Diane Ebert-May
Department of Plant Biology, Michigan State University, East Lansing, MI

Nate Emery
Department of Plant Biology, Michigan State University, East Lansing, MI