DistPro: Searching A Fast Knowledge Distillation Process via Meta Optimization
XueQing Deng, Dawei Sun, Shawn D. Newsam, Peng Wang
arXiv:2204.05547 · 12 April 2022
Papers citing "DistPro: Searching A Fast Knowledge Distillation Process via Meta Optimization" (6 papers shown)
Contrastive Representation Distillation via Multi-Scale Feature Decoupling
Cuipeng Wang, Tieyuan Chen, Haipeng Wang · 09 Feb 2025

Rethinking Knowledge in Distillation: An In-context Sample Retrieval Perspective
Jinjing Zhu, Songze Li, Lin Wang · 13 Jan 2025

Adaptive Teaching with Shared Classifier for Knowledge Distillation
Jaeyeon Jang, Young-Ik Kim, Jisu Lim, Hyeonseong Lee · 12 Jun 2024

NORM: Knowledge Distillation via N-to-One Representation Matching
Xiaolong Liu, Lujun Li, Chao Li, Anbang Yao · 23 May 2023

PURSUhInT: In Search of Informative Hint Points Based on Layer Clustering for Knowledge Distillation
Reyhan Kevser Keser, Aydin Ayanzadeh, O. A. Aghdam, Çaglar Kilcioglu, B. U. Toreyin, N. K. Üre · 26 Feb 2021

Contrastive Representation Distillation
Yonglong Tian, Dilip Krishnan, Phillip Isola · 23 Oct 2019