Abstract

Maker education takes "hands-on" practice as its core concept and combines various educational theories to redefine the interactions between learners and teachers in a learning environment. Identifying meaningful "hands-on" behaviors is crucial for evaluating students' learning performance, yet an instructor cannot feasibly observe every student. Such observation becomes possible with the aid of artificial intelligence (AI) image processing techniques: an AI learning behavior recognition system can serve as a second pair of eyes for teachers and help account for individual differences. However, previous studies applied learning behavior recognition only to traditional, static classrooms; a recognition system for identifying "hands-on" actions in the maker learning context has not yet been developed. Therefore, this study designed a human posture evaluation system that extracts body joint information from images of the learning environment and built a learning behavior recognition model suited to maker education based on a convolutional neural network (CNN). A learning behavior model was defined, along with a set of student behavior indexes, and the effectiveness of the model and indexes was then verified through practical learning activities. The evaluation results indicated that the proposed model achieved a training accuracy of 0.99 and an overall model accuracy of 0.83. Thus, the model can be applied to dynamic maker-activity learning environments.
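The abstract describes a pipeline in which body joint (articulation node) information is first extracted from classroom images and a CNN then classifies the resulting posture data into learning behaviors. The sketch below illustrates one plausible shape of such a classifier; it is not the authors' implementation. The skeleton size (17 joints), window length, behavior labels, and network layout are all illustrative assumptions, and keypoint extraction by an upstream pose estimator is assumed to have already been performed.

```python
# Hypothetical sketch: classify "hands-on" learning behaviors from windows of
# 2D body keypoints with a small CNN. Input per sample: WINDOW frames, each
# holding (x, y, confidence) for NUM_JOINTS joints, produced upstream by a
# pose estimator (not shown here).
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_JOINTS = 17       # assumed COCO-style skeleton
WINDOW = 30           # assumed ~1 s of frames at 30 fps
NUM_BEHAVIORS = 4     # illustrative labels, e.g. assembling / observing / discussing / idle

def build_model():
    # Treat the (frames x joints x channels) window like a small image so the
    # 2-D convolutions can learn spatio-temporal posture patterns.
    return models.Sequential([
        layers.Input(shape=(WINDOW, NUM_JOINTS, 3)),
        layers.Conv2D(32, (3, 3), padding="same", activation="relu"),
        layers.MaxPooling2D((2, 1)),
        layers.Conv2D(64, (3, 3), padding="same", activation="relu"),
        layers.GlobalAveragePooling2D(),
        layers.Dense(64, activation="relu"),
        layers.Dropout(0.3),
        layers.Dense(NUM_BEHAVIORS, activation="softmax"),
    ])

if __name__ == "__main__":
    model = build_model()
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    # Random stand-in data; real training would use labelled keypoint windows
    # recorded during maker activities.
    x = np.random.rand(64, WINDOW, NUM_JOINTS, 3).astype("float32")
    y = np.random.randint(0, NUM_BEHAVIORS, size=(64,))
    model.fit(x, y, epochs=1, batch_size=16, verbose=1)
```

In practice, the choice of window length and behavior categories would follow from the behavior indexes defined in the study, and reported figures such as the 0.99 training accuracy and 0.83 model accuracy come from the authors' own data, not from this sketch.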
