DeepKD: A Deeply Decoupled and Denoised Knowledge Distillation Trainer
arXiv:2505.15133 · 21 May 2025
Haiduo Huang, Jiangcheng Song, Yadong Zhang, Pengju Ren
ArXiv (abs) · PDF · HTML
Papers citing "DeepKD: A Deeply Decoupled and Denoised Knowledge Distillation Trainer" (2 papers)
Multi-Teacher Knowledge Distillation with Reinforcement Learning for Visual Recognition
Chuanguang Yang, Xinqiang Yu, Han Yang, Zhulin An, Chengqing Yu, Libo Huang, Yongjun Xu
22 Feb 2025
Contrastive Representation Distillation
Yonglong Tian, Dilip Krishnan, Phillip Isola
23 Oct 2019