Revisiting Label Smoothing and Knowledge Distillation Compatibility: What was Missing?
arXiv: 2206.14532 · 29 June 2022
Keshigeyan Chandrasegaran, Ngoc-Trung Tran, Yunqing Zhao, Ngai-Man Cheung
Papers citing "Revisiting Label Smoothing and Knowledge Distillation Compatibility: What was Missing?" (10 papers shown)
"Rethinking Knowledge in Distillation: An In-context Sample Retrieval Perspective." Jinjing Zhu, Songze Li, Lin Wang. 13 Jan 2025.
"Instance Temperature Knowledge Distillation." Zhengbo Zhang, Yuxi Zhou, Jia Gong, Jun Liu, Zhigang Tu. 27 Jun 2024.
"Is Synthetic Data From Diffusion Models Ready for Knowledge Distillation?" Zheng Li, Yuxuan Li, Penghai Zhao, Renjie Song, Xiang Li, Jian Yang. 22 May 2023.
"Student-friendly Knowledge Distillation." Mengyang Yuan, Bo Lang, Fengnan Quan. 18 May 2023.
"Curriculum Temperature for Knowledge Distillation." Zheng Li, Xiang Li, Lingfeng Yang, Borui Zhao, Renjie Song, Lei Luo, Jun Li, Jian Yang. 29 Nov 2022.
"FS-BAN: Born-Again Networks for Domain Generalization Few-Shot Classification." Yunqing Zhao, Ngai-Man Cheung. 23 Aug 2022. [BDL]
"Parameter-Efficient and Student-Friendly Knowledge Distillation." Jun Rao, Xv Meng, Liang Ding, Shuhan Qi, Dacheng Tao. 28 May 2022.
"AlphaNet: Improved Training of Supernets with Alpha-Divergence." Dilin Wang, Chengyue Gong, Meng Li, Qiang Liu, Vikas Chandra. 16 Feb 2021.
"SEED: Self-supervised Distillation For Visual Representation." Zhiyuan Fang, Jianfeng Wang, Lijuan Wang, Lei Zhang, Yezhou Yang, Zicheng Liu. 12 Jan 2021. [SSL]
"Bag of Tricks for Image Classification with Convolutional Neural Networks." Tong He, Zhi Zhang, Hang Zhang, Zhongyue Zhang, Junyuan Xie, Mu Li. 04 Dec 2018.