Designing for learning is a complex task and is considered one of the most fundamental activities of teaching practitioners. A well-balanced teaching system ensures that all aspects of teaching, from the intended learning outcomes to the teaching and learning activities used and the assessment tasks, are associated and aligned with one another (Biggs, 1996). This guarantees appropriate and therefore effective student engagement. The design and promotion of constructively aligned teaching practices has been supported to some degree by the development of software tools that attempt to support teaching practitioners in the design process and assist them in making more informed design decisions. Despite their potential, existing tools have several limitations in the support and guidance they provide and cannot adapt according to how a design pattern works in practice. There is therefore a real need for an intelligent metric system that enables design decisions to be made not only theoretically, according to pedagogical theory, but also practically, based on good design practices identified through high student satisfaction scores.

To overcome the limitations of existing design tools, this research explores machine learning techniques, in particular artificial neural networks, as an innovative approach for building an Educational Intelligence Design Tool (EDIT) that supports teaching practitioners in measuring, aligning, and editing their teaching designs based on good design practices and on the pedagogic theory of constructive alignment. Student satisfaction scores are utilized as indicators of good design practice to identify meaningful alignment ranges for the main components of Tepper's (2006) metric. It is suggested that modules designed within those ranges will be well formed and constructively aligned and will potentially yield higher student satisfaction. On this basis, the research developed a substantial module design database of 519 design patterns spanning 476 modules from the STEM disciplines. This is considered the first substantial database of its kind compared with the state-of-the-art Learning Design Support Environment (LDSE) (Laurillard, 2011), which includes 122 design patterns.

To provide a neural-based framework for EDIT, a neural auto-encoder was incorporated to act as an auto-associative memory that learns on the basis of exposure to sets of 'good' design patterns. The 519 design patterns were encoded as input criteria and presented to a neural network with a feed-forward multilayer perceptron architecture, using the hyperbolic tangent activation function and the back-propagation training algorithm to learn the desired task. After successful training (88%), the testing phase followed, in which 102 new patterns (associated with low student satisfaction) were presented to the network; higher pattern errors were produced, indicating that the network had proposed substantial design changes to the input patterns. The findings of the research are significant in showing the degree of change between the test patterns before and after the network's modifications and in evaluating the relationships between the core features of module designs and overall student satisfaction.
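To make the auto-associative approach concrete, the following is a minimal sketch (written in Python with NumPy, and not the tool's actual implementation) of a feed-forward auto-encoder with hyperbolic tangent activations trained by back-propagation on encoded design patterns. The pattern dimensionality, hidden-layer size, hyperparameters, and training data below are illustrative assumptions only.

```python
# Minimal sketch (not the authors' implementation): an auto-associative
# feed-forward network trained with back-propagation and tanh activations.
# Encoded design patterns are reconstructed through a hidden layer; a high
# reconstruction error on a new pattern signals that the network suggests
# changes towards the good-practice patterns it was trained on.
# All dimensions and hyperparameters below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

n_features = 12        # assumed length of an encoded design pattern
n_hidden = 6           # assumed hidden-layer (bottleneck) size
lr = 0.05              # learning rate (illustrative)

# Illustrative stand-in for the 519 encoded 'good' design patterns.
X = rng.uniform(-1.0, 1.0, size=(519, n_features))

# Weights for a single-hidden-layer auto-encoder.
W1 = rng.normal(scale=0.1, size=(n_features, n_hidden))
b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=0.1, size=(n_hidden, n_features))
b2 = np.zeros(n_features)

for epoch in range(1000):
    # Forward pass: tanh hidden layer, tanh output layer.
    H = np.tanh(X @ W1 + b1)
    Y = np.tanh(H @ W2 + b2)

    # Mean squared reconstruction error (the "pattern error").
    err = Y - X
    loss = np.mean(err ** 2)

    # Back-propagation through the tanh nonlinearities.
    dZ2 = (2.0 / err.size) * err * (1 - Y ** 2)
    dW2 = H.T @ dZ2
    db2 = dZ2.sum(axis=0)
    dZ1 = (dZ2 @ W2.T) * (1 - H ** 2)
    dW1 = X.T @ dZ1
    db1 = dZ1.sum(axis=0)

    # Gradient-descent update.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

# At test time, present a new (low-satisfaction) pattern and read off its
# reconstruction: the element-wise differences act as suggested changes.
x_new = rng.uniform(-1.0, 1.0, size=(1, n_features))
x_rec = np.tanh(np.tanh(x_new @ W1 + b1) @ W2 + b2)
suggested_change = x_rec - x_new
```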
T-test analysis shows statistically significant differences between the test set before and after the network's modifications for the alignment score between learning outcomes and learning objectives (V1) and the alignment score between learning objectives and teaching activities (V2), whereas no statistically significant difference is seen for the alignment score between learning outcomes and assessment tasks (V3). The network gives an average improvement of 0.9, 1.5, and 0.5 in the alignment scores of V1, V2, and V3, respectively, which increased the average satisfaction score from 3.3 to 3.8. Accordingly, positive correlations of varying strength between student satisfaction and the alignment scores were suggested as a result of applying the network's proposed changes. EDIT, with its data-orientated and adaptive approach to design, confirms orthodox practices whilst revealing some unexpected incongruities between alignment theory and design practice. For example, as expected, increasing the amount of questioning, interaction and group-based activity leads to higher levels of student satisfaction even when misalignment is present. However, the model is relatively ambivalent towards the alignment of learning outcomes and learning objectives, suggesting there is some confusion among practitioners as to how these are related. This confusion also appears to persist when defining session learning objectives for different types of teaching, learning and assessment tasks, in that the activities themselves appear to sit at a higher cognitive level of Bloom's Taxonomy than the respective learning objectives (resulting in positive misalignment).
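As an illustration of the kind of analysis reported above, the sketch below is hypothetical: it uses SciPy on synthetic placeholder data (not the study's data) to show a paired t-test on an alignment score before and after the network's proposed changes, followed by a Pearson correlation between the alignment score and student satisfaction.

```python
# Illustrative sketch only: paired t-test (before vs. after the network's
# proposed changes) and Pearson correlation with satisfaction scores.
# The arrays below are synthetic placeholders, not the study's data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

n_modules = 102                                   # size of the test set
v1_before = rng.uniform(1, 5, n_modules)          # alignment score V1 before
v1_after = np.clip(v1_before + 0.9 + rng.normal(0, 0.3, n_modules), 1, 5)
satisfaction = np.clip(2 + 0.3 * v1_after + rng.normal(0, 0.5, n_modules), 1, 5)

# Paired t-test: is the before/after difference statistically significant?
t_stat, p_value = stats.ttest_rel(v1_after, v1_before)
print(f"V1 paired t-test: t = {t_stat:.2f}, p = {p_value:.4f}")

# Correlation between the alignment score and student satisfaction.
r, p_corr = stats.pearsonr(v1_after, satisfaction)
print(f"Pearson r(V1, satisfaction) = {r:.2f} (p = {p_corr:.4f})")
```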