
Education distillation: getting student models to learn in schools

Abstract

This paper introduces a new knowledge distillation method, education distillation (ED), inspired by the structured, progressive nature of human learning. ED mimics the educational stages of primary school, middle school, and university and introduces teaching reference blocks: the student model is split into a main body and multiple teaching reference blocks so that it learns from teachers step by step, enabling efficient knowledge distillation while preserving the student model's architecture. Experimental results on the CIFAR-100, Tiny ImageNet, Caltech, and Food-101 datasets show that the teaching reference blocks effectively avoid the problem of forgetting. Compared with conventional single-teacher and multi-teacher knowledge distillation methods, ED significantly improves the accuracy and generalization ability of the student model. These findings highlight the potential of ED to improve model performance across different architectures and datasets, indicating its value in a range of deep learning scenarios. Code examples can be obtained at: this https URL.
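The abstract gives no implementation details, but the general mechanism it describes, a student split into a main body plus stage-specific teaching reference blocks and distilled from a teacher over successive "school" stages, can be sketched with a standard knowledge-distillation loss. The sketch below is a minimal illustration only: the module shapes, the three-stage schedule, and the kd_loss/StagedStudent names are assumptions, not the authors' released code (see the URL above for that).

# Hedged sketch: a student whose forward pass combines a growing "main body"
# with a stage-specific teaching reference head, trained against a fixed
# teacher using a standard knowledge-distillation loss. All names, shapes,
# and the three-stage schedule are illustrative assumptions.

import torch
import torch.nn as nn
import torch.nn.functional as F


def kd_loss(student_logits, teacher_logits, targets, T=4.0, alpha=0.7):
    """Standard KD loss: softened KL to the teacher plus cross-entropy to labels."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, targets)
    return alpha * soft + (1.0 - alpha) * hard


class StagedStudent(nn.Module):
    """Student split into sequential body stages plus per-stage
    'teaching reference' heads that map intermediate features to logits."""

    def __init__(self, num_classes=100):
        super().__init__()
        self.stages = nn.ModuleList([
            nn.Sequential(nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2)),
            nn.Sequential(nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2)),
            nn.Sequential(nn.Conv2d(64, 128, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1)),
        ])
        # One lightweight reference head per educational stage.
        self.ref_heads = nn.ModuleList([
            nn.Sequential(nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, num_classes)),
            nn.Sequential(nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(64, num_classes)),
            nn.Sequential(nn.Flatten(), nn.Linear(128, num_classes)),
        ])

    def forward(self, x, stage):
        # Run only the body stages unlocked so far, then the current reference head.
        for s in self.stages[: stage + 1]:
            x = s(x)
        return self.ref_heads[stage](x)


if __name__ == "__main__":
    # Stand-in teacher and toy data; three "school" stages distill from the same teacher.
    teacher = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 100))
    student = StagedStudent(num_classes=100)
    opt = torch.optim.SGD(student.parameters(), lr=0.05, momentum=0.9)

    images = torch.randn(8, 3, 32, 32)
    labels = torch.randint(0, 100, (8,))
    for stage in range(3):  # primary school -> middle school -> university
        with torch.no_grad():
            t_logits = teacher(images)
        s_logits = student(images, stage)
        loss = kd_loss(s_logits, t_logits, labels)
        opt.zero_grad()
        loss.backward()
        opt.step()
        print(f"stage {stage}: loss {loss.item():.3f}")

The per-stage heads here stand in for the paper's teaching reference blocks: each stage trains only the body layers unlocked so far, which is one plausible way to read the abstract's "learn from teachers step by step" while keeping the student's final architecture intact.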

@article{feng2025_2311.13811,
  title={Education distillation: getting student models to learn in schools},
  author={Ling Feng and Tianhao Wu and Xiangrong Ren and Zhi Jing and Xuliang Duan},
  journal={arXiv preprint arXiv:2311.13811},
  year={2025}
}