Contrastive Cross-Course Knowledge Tracing via Concept Graph Guided Knowledge Transfer

Abstract

Knowledge tracing (KT) aims to predict learners' future performance based on historical learning interactions. However, existing KT models predominantly focus on data from a single course, limiting their ability to capture a comprehensive understanding of learners' knowledge states. In this paper, we propose TransKT, a contrastive cross-course knowledge tracing method that leverages concept graph guided knowledge transfer to model the relationships between learning behaviors across different courses, thereby enhancing knowledge state estimation. Specifically, TransKT constructs a cross-course concept graph by leveraging zero-shot Large Language Model (LLM) prompts to establish implicit links between related concepts across different courses. This graph serves as the foundation for knowledge transfer, enabling the model to integrate and enhance the semantic features of learners' interactions across courses. Furthermore, TransKT includes an LLM-to-LM pipeline for incorporating summarized semantic features, which significantly improves the performance of Graph Convolutional Networks (GCNs) used for knowledge transfer. Additionally, TransKT employs a contrastive objective that aligns single-course and cross-course knowledge states, thereby refining the model's ability to provide a more robust and accurate representation of learners' overall knowledge states.
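The abstract does not give the form of the contrastive objective. As a minimal sketch, assuming an InfoNCE-style loss (a common choice for this kind of alignment) and hypothetical tensors `h_single` and `h_cross` holding a batch of single-course and cross-course knowledge-state embeddings, the alignment could look like:

```python
import torch
import torch.nn.functional as F

def contrastive_alignment_loss(h_single, h_cross, temperature=0.1):
    """InfoNCE-style sketch: each learner's single-course knowledge state
    is pulled toward their own cross-course state (positive pair) and
    pushed away from other learners' states in the batch (negatives).
    Names and temperature are illustrative, not from the paper."""
    z1 = F.normalize(h_single, dim=-1)
    z2 = F.normalize(h_cross, dim=-1)
    logits = z1 @ z2.T / temperature           # (B, B) cosine similarities
    targets = torch.arange(z1.size(0))         # diagonal = matching learner
    return F.cross_entropy(logits, targets)

# Toy usage: 4 learners, 16-dim knowledge-state embeddings; the
# cross-course view is a slightly perturbed copy of the single-course one.
h_single = torch.randn(4, 16)
h_cross = h_single + 0.05 * torch.randn(4, 16)
loss = contrastive_alignment_loss(h_single, h_cross)
```

Minimizing such a loss encourages the two views of the same learner to agree, which is consistent with the paper's stated goal of aligning single-course and cross-course knowledge states.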

@article{han2025_2505.13489,
  title={Contrastive Cross-Course Knowledge Tracing via Concept Graph Guided Knowledge Transfer},
  author={Wenkang Han and Wang Lin and Liya Hu and Zhenlong Dai and Yiyun Zhou and Mengze Li and Zemin Liu and Chang Yao and Jingyuan Chen},
  journal={arXiv preprint arXiv:2505.13489},
  year={2025}
}