Background

The last 20 years or so have seen a dramatic growth in the demand for professionals trained in information and communication technologies. This growth was partly driven by tremendous technological advances, such as the emergence of computer networking (including wireless networking), graphical user interfaces, the Internet and the World-Wide Web, and their applications. But the growth in demand was also driven by a much greater realization on the part of many organizations and individuals of the importance of information and communication technologies for their well-being, and by the more widespread use of these technologies by individuals whose primary area of expertise was not in information technology (Dahlbom & Mathiassen, 1997; Denning, 2001a; Denning, 2001b).

While programs in traditional computing disciplines such as computer engineering and computer science responded to these new trends, albeit often slowly, employers started to demand skills that graduates typically did not acquire in a traditional computer science or computer engineering program. For example, many traditional computer science programs did not equip their graduates with the practical network or system administration skills that organizations needed to expand and maintain their IT infrastructures, or with the web development skills required to take advantage of the many opportunities opened up by the Internet. Moreover, because of the more theoretical emphasis of computer science programs, graduates often did not acquire a sufficient understanding of organizational processes to be able to support IT applications from a user or organizational perspective (Denning, 2004).

Programs in Information Systems, by contrast, offered material that allowed their students to develop a better understanding of organizations and of the ways in which IT applications can support them, but they often did so at the expense of a sufficiently thorough coverage of the technology. This was exacerbated by the fact that a good number of Information Systems programs were housed in business schools and were limited in the number of IT courses they could offer because of the accreditation standards for business schools in place at the time.

It was in this environment that a number of universities started to offer undergraduate programs in IT. While there was, and still is, far less homogeneity among undergraduate programs in IT than among programs in computer science or information systems, all IT programs show a family resemblance to one another. Most cover areas such as networking, web development, and system administration in some detail, while very few pay particular attention to the theoretical foundations in complexity theory that are so prevalent in computer science programs. Until fairly recently, most IT programs arose in isolation from each other, often evolving out of existing programs in computer science, computer engineering or computer engineering technology, or information systems (Lunt et al., 2003a; Reichgelt et al., 2004). However, there was also a growing realization, on the part of those programs that were aware of each other's existence, that a forum was needed in which they could discuss the various features of their programs and improve them based on the experiences of IT programs at other institutions (Lunt et al., 2003b).
Thus, in early 2001, a steering committee composed of representatives from five universities offering, or preparing to offer, IT programs compiled a list of IT programs across the United States with the intention of organizing an invitation-only conference. These efforts came to fruition in December 2001, when representatives from 15 colleges and universities attended the first Conference for IT Curriculum (CITC-1) in Provo, Utah. The primary aims of this conference were to establish a national organization of IT educators and to begin establishing academic standards for this rapidly emerging discipline (Lunt et al. …