
Denoising Programming Knowledge Tracing with a Code Graph-based Tuning Adaptor

Main: 9 pages
Bibliography: 2 pages
Appendix: 1 page
7 figures
4 tables
Abstract

Programming Knowledge Tracing (PKT) aims to dynamically diagnose learners' mastery of programming knowledge from their coding activities, facilitating more effective and personalized programming education. However, current PKT studies focus primarily on the implicit relationship between code content and knowledge assessment, often overlooking two types of noise signals in long-term programming activities: unwanted signals from unrelated submissions and weak signals from minor modifications. This practical challenge significantly limits model performance and applicability. To address it, we propose Coda, a Code graph-based tuning adaptor designed to enhance existing PKT models by identifying and mitigating the impact of noise. Specifically, Coda first transforms the loose code sequences submitted by each learner into a compact code graph, from which unwanted signals can be identified from a semantic-similarity perspective. We then apply a cluster-aware GCN to the code graph, which improves the discrimination of weak signals and enables them to be clustered for identification. Finally, a lightweight yet effective adaptor is incorporated into the PKT task, optimized with two noise-feature-based constraints and a navigational regularization term, to correct knowledge states affected by noise. Notably, the Coda framework is model-agnostic and can be adapted to most existing PKT solutions. Extensive experiments on four real-world datasets demonstrate that Coda effectively performs the PKT task in the presence of noisy programming records and outperforms typical baselines.
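To make the pipeline described above concrete, the following is a minimal, illustrative PyTorch sketch of a similarity-based code graph feeding a single graph-convolution layer and a lightweight adaptor that corrects a base knowledge state. It is not the authors' implementation: the class name CodeGraphAdaptor, the cosine-similarity threshold sim_threshold, the single-layer GCN, and all dimensions are assumptions made for illustration only; the paper's cluster-aware GCN, noise-feature constraints, and navigational regularization are not reproduced here.

import torch
import torch.nn as nn
import torch.nn.functional as F

class CodeGraphAdaptor(nn.Module):
    # Hypothetical sketch: build a similarity graph over one learner's code
    # submissions, propagate features with one GCN layer, and output a
    # corrected knowledge state. Names and dimensions are assumptions.
    def __init__(self, code_dim: int, state_dim: int, sim_threshold: float = 0.8):
        super().__init__()
        self.sim_threshold = sim_threshold          # edges below this are treated as unrelated submissions
        self.gcn = nn.Linear(code_dim, code_dim)    # single graph-convolution weight matrix
        self.adaptor = nn.Sequential(               # lightweight correction head
            nn.Linear(code_dim + state_dim, state_dim),
            nn.Tanh(),
        )

    def build_graph(self, code_emb: torch.Tensor) -> torch.Tensor:
        # code_emb: (T, code_dim) embeddings of T submissions for one learner
        sim = F.cosine_similarity(code_emb.unsqueeze(1), code_emb.unsqueeze(0), dim=-1)
        adj = (sim >= self.sim_threshold).float()   # keep only semantically related submissions
        deg = adj.sum(-1, keepdim=True).clamp(min=1.0)
        return adj / deg                            # row-normalised adjacency

    def forward(self, code_emb: torch.Tensor, knowledge_state: torch.Tensor) -> torch.Tensor:
        adj = self.build_graph(code_emb)
        node_feat = torch.relu(self.gcn(adj @ code_emb))            # graph convolution over the code graph
        correction = self.adaptor(torch.cat([node_feat, knowledge_state], dim=-1))
        return knowledge_state + correction                          # corrected (denoised) knowledge state

# toy usage: 5 submissions, 32-d code embeddings, 16-d knowledge states
adaptor = CodeGraphAdaptor(code_dim=32, state_dim=16)
codes = torch.randn(5, 32)
states = torch.randn(5, 16)
print(adaptor(codes, states).shape)  # torch.Size([5, 16])

Because the correction head only consumes the base model's knowledge state and the graph features, a module of this shape can in principle be attached to different PKT backbones, which is the model-agnostic property the abstract emphasizes.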

@article{gao2025_2506.11107,
  title={Denoising Programming Knowledge Tracing with a Code Graph-based Tuning Adaptor},
  author={Weibo Gao and Qi Liu and Rui Li and Yuze Zhao and Hao Wang and Linan Yue and Fangzhou Yao and Zheng Zhang},
  journal={arXiv preprint arXiv:2506.11107},
  year={2025}
}