Abstract

In the Fall of 2021, the University of Saskatchewan’s College of Engineering implemented a new first year Engineering Design course called GE 142 (Design I). In comparison to similar courses in other Engineering programs, the course was unique in a few respects. First, it ran from mid-October to mid-December, and it included 7 lectures and 4 labs. Second, it was focused almost entirely on problem definition. Third, the assessment system was competency based. Together, these elements made for a distinctive design course, and each will be described in detail.
The course had a number of Learning Outcome goals in the general areas of knowledge, skills, experiences, and attitudes. Knowledge was assessed using an automated adaptive quiz system employing Mobius™ software, linked to the Canvas™ Learning Management System (LMS). Design skills were assessed through a series of six assignments that focused on the ability to characterize design problems, maintain an effective logbook, make a convincing case to undertake a design problem, communicate in a clear manner, and reflect on how to improve design practice. Experiences included various types of design exercises conducted in lab settings. For example, some exercises were open-ended while others were more closed design problems, and students also engaged in the characterization of a design problem with a live client. Assessment of attitudes was carried out at the end of the course using a series of Likert-scale questions that probed students’ perspectives on the value of design, their enjoyment of design, the value of logbooks, their interest in tech innovation, and the importance of group dynamics, project management, and technical communication.
As a quality improvement/program assessment exercise, an analysis of grades and student attitudes was conducted and will be presented (n=306). As well, an initial analysis of student performance on the Mobius questions was carried out. In general, results were quite favourable, both in terms of achievement against different types of Learning Outcomes and in terms of student attitudes towards various perspectives in Design. Student responses for the attitude survey were anonymous, and all grade and quiz analyses employed aggregate data. At the end of the course, instructors reflected on what they felt should be continued, started, and stopped in subsequent iterations of the course. The suitability of the student performance data against the Learning Outcomes will also be discussed in the context of the accreditation criteria of the Canadian Engineering Accreditation Board (CEAB).
