Will the mechanisms that policy makers are using to improve schools of education - more certification tests, higher cut scores, and severe penalties for institutions that fail to meet specific pass rates - actually deliver the increased accountability and better teachers that policy makers have promised? Mr. Fowler's experience in Massachusetts suggests that the answer is no.

During the summer of 1998, magazines, newspapers, and electronic media widely reported the results of the first administration of Massachusetts' new tests for aspiring teachers. The results were notable because 59% of those tested failed an exam that state officials described as a test of eighth- to 10th-grade-level skills.

The story was also notable because of the remarkable series of events that accompanied the announcement of the test scores. The state board of education sparked outrage when it initially set cut scores at one standard deviation below the levels recommended by its own panels of experts. It sparked controversy when it reversed this decision just one week later, in a vote preceded by the unexpected departure of Frank Haydu, then commissioner of education, who resigned with the comment that "the political forces have been unleashed."1

The story also made headlines because of the heated and highly quotable rhetoric that surrounded the affair. Thomas Finneran, speaker of the Massachusetts House of Representatives, declared: "I'll tell you who won't be a teacher. The idiots who took that test and flunked so miserably - and, of course, the idiots who passed them."2

At the heart of this summerlong brouhaha, though, was the oft-repeated statement that nearly 60% of aspiring teachers had failed a test of basic skills. As this new "fact" spread across the country, U.S. Sen. Jeff Bingaman (D-N.M.) declared the results of the Massachusetts tests "an unflattering snapshot of the state of teacher preparation in America."3 This snapshot has had a powerful impact. It launched scores of op-ed articles, helped shape the Higher Education Act of 1998, and pushed teacher quality to the top of the nation's agenda.
This picture had such impact because many perceived in it the principal reason for the purported failure of American education: our students aren't learning enough because our teachers don't know enough. For example, when editorialists in Georgia, writing in the Augusta Chronicle, asked, "Wonder why Johnny can't read, write, or add?" they answered by pointing to the results of the Massachusetts teacher tests.

But how accurate is the snapshot that led policy makers and commentators to draw such sweeping conclusions? According to official data and documents, this now three-year-old picture was not at all accurate. Rather, it presented a grossly distorted view of aspiring teachers and of teacher preparation programs. In fact, much higher percentages of candidates have since passed the Massachusetts teacher tests, now called the Massachusetts Educator Certification Tests (MECT). What's more, passing the MECT reflects a higher level of knowledge and skills than politicians have admitted. Finally, because there are serious questions about the quality of the tests that produced the original snapshot, it is impossible to know exactly what the results of the Massachusetts tests have to say about teacher preparation in Massachusetts - let alone in America.

How Many Have Passed the MECT?

To qualify for certification, aspiring Massachusetts teachers must pass two separate four-hour exams: a subject-matter exam and a two-part literacy exam (composed of a reading test and a writing test). When reporters mentioned the passing rate on the first administration of the MECT, they were referring to candidates who had taken and passed, in one eight-hour sitting, three distinct examinations in reading, writing, and a subject area. And when they referred to the failure rate, they were referring to candidates who had taken all three tests in one day but had not passed them all. …