Abstract

In intelligent education systems, one fundamental task is to predict student performance on new exercises and to estimate students' proficiency on knowledge concepts. Existing prediction methods are mainly built on the classical cognitive diagnosis framework MIRT, in which student performance on an exercise is modeled as the interaction between the exercise's trait vector and the student's knowledge proficiency. How the trait vectors of exercises are learned therefore strongly influences the estimated knowledge proficiency. However, when learning these trait vectors, existing methods cannot exploit the rich cross-modal contents of exercises, which are closely related to exercise traits; as a result, they struggle with the cross-modal exercises that are common in practice. In addition, existing methods overlook the intrinsic complexity of the examined concepts, which in fact affects exercise traits such as exercise difficulty. To address these issues, we propose a deep Cross-Modal Neural Cognitive Diagnosis framework (CMNCD), which has two appealing advantages: (i) by extending MIRT under a deep neural network framework, CMNCD can effectively exploit the fine-grained semantic information in the cross-modal contents of exercises for modeling student performance; and (ii) CMNCD estimates the complexity of examined concepts from the prerequisite relationships among concepts and incorporates it into the learning of exercises' trait vectors. Extensive experiments on several real-world datasets show that CMNCD outperforms state-of-the-art cognitive diagnosis methods.
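For context, the classical MIRT interaction the abstract refers to can be written as a logistic function of the match between a student's proficiency vector and an exercise's trait vector. The sketch below is a minimal illustration of that baseline interaction only, not of CMNCD's learned neural interaction; the variable names (`theta`, `a`, `b`) are conventional MIRT notation rather than symbols taken from the paper.

```python
import numpy as np

def mirt_interaction(theta: np.ndarray, a: np.ndarray, b: float) -> float:
    """Compensatory MIRT-style prediction: probability that a student answers
    an exercise correctly, given the student's proficiency vector `theta`,
    the exercise's trait (discrimination) vector `a`, and its difficulty `b`.
    CMNCD, as described in the abstract, replaces this fixed parametric form
    with a deep neural interaction learned from cross-modal exercise content."""
    logit = float(np.dot(a, theta)) - b   # linear match between traits and proficiency
    return 1.0 / (1.0 + np.exp(-logit))   # logistic link

# Illustrative example with three knowledge concepts
theta = np.array([0.8, -0.2, 0.5])   # student proficiency on each concept
a = np.array([1.2, 0.0, 0.7])        # exercise trait vector (which concepts it examines)
b = 0.4                              # exercise difficulty
print(mirt_interaction(theta, a, b)) # predicted probability of a correct response
```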
