Abstract

Knowledge tracing (KT) is an active research topic in computer-supported education. KT models the sequence of learning activities a student engages in, so as to trace the student's continuously evolving state of knowledge. One important purpose of knowledge tracing is to evaluate a student's mastery of knowledge concepts (KCs) from their performance in learning activities, so that personalized learning plans can be tailored to each student. In recent years, many knowledge tracing methods have been proposed; they can be divided into methods based on Recurrent Neural Networks (RNNs) and methods based on self-attention. RNN-based methods include Deep Knowledge Tracing (DKT) and the Dynamic Key-Value Memory Network (DKVMN). These methods outperform traditional knowledge tracing methods, but they are ill-suited to sparse data, i.e., situations where students interact with only a few KCs. Self-Attentive Knowledge Tracing (SAKT) uses the self-attention mechanism to identify, among the learning activities a student has participated in, the KCs related to a given KC, and thereby predicts the student's mastery of that KC. SAKT can therefore handle the sparse-data problem that RNN-based methods cannot. LightGBM is a gradient boosting framework that uses tree-based learning algorithms and is one of the mainstream machine learning algorithms today. To make predictions more accurate, we propose the BSLKT model, which fuses the SAKT model and the LightGBM model in a bagging manner. In addition, we apply feature-engineering preprocessing to the experimental dataset. Experiments comparing the accuracy of the SAKT model, the LightGBM model, and the BSLKT model show that BSLKT achieves better prediction performance.
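The abstract does not specify how the bagging-style fusion in BSLKT is implemented; a minimal sketch, assuming the two base models each output a per-interaction probability that the student answers the next exercise correctly and that the fusion is a weighted average of those probabilities (the function name, weights, and sample values below are illustrative assumptions, not the authors' implementation):

```python
import numpy as np

def bagging_fuse(p_sakt, p_lgbm, w_sakt=0.5, w_lgbm=0.5):
    """Fuse the correctness probabilities predicted by two base models
    (e.g., SAKT and LightGBM) by weighted averaging, a simple
    bagging-style ensemble. Weights are assumed, not from the paper."""
    p_sakt = np.asarray(p_sakt, dtype=float)
    p_lgbm = np.asarray(p_lgbm, dtype=float)
    return w_sakt * p_sakt + w_lgbm * p_lgbm

# Hypothetical predicted probabilities for three student interactions:
p_sakt = [0.82, 0.40, 0.65]
p_lgbm = [0.78, 0.52, 0.59]

fused = bagging_fuse(p_sakt, p_lgbm)
# Threshold the fused probabilities to obtain binary predictions
# for accuracy-style evaluation.
pred_labels = (fused >= 0.5).astype(int)
```

With equal weights this reduces to a plain average of the two models' probabilities; in practice the weights could be tuned on a validation set.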
