The 1990s have emerged as an era for re-asking fundamental questions about the productivity and efficiency of American enterprises -- a change in national mood and circumstance that increasingly has come to dominate discussions about American colleges and universities. For the first time since the upheavals of the 1960s and the Vietnam War, American higher education has found itself on the defensive, having to explain that colleges and universities are neither privileged havens of waste nor institutions so out of touch with reality that they are on the verge of losing their relevance. In less than three years, colleges and universities have moved from the ambivalent affluence of the 1980s into an era of resource constraints and nettlesome public scrutiny. Public funding for higher education has declined in absolute terms and, more important for the long-term future of colleges and universities, as a share of public appropriations. Public as well as private institutions have found themselves in the uncomfortable position of having to decrease expenditures per student while simultaneously increasing tuition at a rate that exceeds the cost of living. These actions have made clear what many have long suspected: that students are being asked to pay more for less [1, 6]. The result is that both friends and critics of American higher education are asking increasingly tough questions about the enterprise. How do colleges and universities spend their money? How are priorities determined? Are new revenues the only way to fund new programs? What explains the dramatic increase in administrative costs? And the question that now lodges at the center of public scrutiny: To what extent are faculty attitudes and behaviors responsible for higher education's inability to control costs and establish institutional priorities? This latter question lies at the heart of our research.
Seeking a new understanding of the costs associated with the provision of an undergraduate education, our work is based on two observations about the functioning of American colleges and universities. The first is that there has been an incipient destructuring -- or deconstructing -- of the undergraduate curriculum over the last two decades that has resulted in fewer required courses, less emphasis on taking courses in an ordered sequence, and greater reliance on students to develop their own sense of how the various bits and pieces of knowledge they acquire in the classroom fit together into a coherent picture [9, 10]. We believe that this destructuring derives in part from the faculty's own pursuit of specialized knowledge, in part from the desire of faculty members as individuals to avoid responsibility for either the quality or the scope of one another's teaching, and in part from economic pressures that emphasize filling classes with enrolled students. We have become convinced, moreover, that the destructuring of the curriculum has had important economic consequences in terms of course proliferation and the need to hire a larger faculty. Our second observation concerns the functioning of the academic ratchet, whereby individual faculty members increase their discretionary time (time for pursuing professional and personal goals) largely by loosening their institutional ties and responsibilities [4, 5, 11]. Our proposition is that as faculty place greater value on discretionary time, undergraduate teaching is accorded less importance. Put simply, those hours not used for teaching courses, for grading papers, or for meeting with students become available for research and scholarship, for consulting and other professional activities, and in most research universities, for specialized teaching at the graduate level.
Institutional rhetoric about the importance of teaching notwithstanding, we believe that the reductions in discretionary time associated with more and better teaching usually are not compensated by additional salary or other rewards, whereas success or failure in regard to other obligations carries significant rewards and penalties. …