Every field needs a history to distinguish it from other areas of endeavor, and every field needs to recognize and formally acknowledge those individuals who have contributed significant ideas to its establishment. It is important to understand how basic ideas shape what we do today and affect our thinking about the future. Much of what was embedded in the design of early systems is now accepted as a matter of practice or convenience. History is of value when we view fundamental ideas in the context of new technological advances and use the best of those ideas to design new systems that take advantage of new opportunities.

The Beginnings

The first large-scale computers were designed and built on the campuses of American universities. The first operational computer, MARK 1, was put into use at Harvard in 1944; it used electromechanical components to perform elementary arithmetic. Later, in 1946, the University of Pennsylvania completed the first electronic computer, ENIAC. So if we take ENIAC as the birth of the modern computer, as many do, computers are but 44 years old.

Establishing the precise beginning of computers used for educational purposes is difficult, since many early applications were merely demonstrations of the potential for computers in education. In 1958, IBM demonstrated the teaching of binary arithmetic by computer. About this time, System Development Corp. developed a computer-based teaching project called CLASS. And in 1959, the PLATO (Programmed Logic for Automatic Teaching Operations) computer-assisted instruction project was begun at the University of Illinois. So if we take PLATO as the start of computers used for instruction, educational computing is but 31 years old.

While some federal programs in the late 1950s and early 1960s supported projects on computers in education, most were directed toward the more general goals of scientific research. To assess the value of academic computing, several national commissions were established. In 1966, a panel chaired by J. Barkley Rosser prepared a National Academy of Sciences report, Digital Computer Needs in Universities and Colleges. It made a strong case for university access to computers for research but said little about education. In 1967, a panel of the President's Science Advisory Committee (PSAC), chaired by John R. Pierce of Bell Laboratories, studied computers in higher education. It concluded that an undergraduate college education without adequate computing was as deficient as an undergraduate education without an adequate library. The panel also acknowledged the value of computers for pre-college education.

In response to these reports, President Lyndon Johnson directed the National Science Foundation (NSF) to work with the U.S. Office of Education to establish an experimental program for developing the potential of computers in education. In July 1967, as a result of this presidential directive, the NSF established the Office of Computing Activities to provide federal leadership in the use of computers for research and education. Later the directive was added as a statutory requirement to the NSF charter. So the federal role for computers in education began 23 years ago.

Integrating Into Curriculum

By 1950, there were only 12 computers in the United States.
At that time, commercial investors were uninterested in computers; even those in the know believed the total market for such machines would not exceed a dozen. Selling the potential of computers to the financial and business communities was difficult, but persuading educators that computers had a role in the educational process was equally challenging. While computers were used sparingly in research and occasionally in the classroom, the idea that computing was a vital and necessary part of education was still a novel one. …