Adaptive Regularization of Labels
arXiv:1908.05474 · 15 August 2019 · ODL
Qianggang Ding, Sifan Wu, Hao Sun, Jiadong Guo, Shutao Xia
Papers citing "Adaptive Regularization of Labels" (11 / 11 papers shown)
| Title | Authors | Tags | Citations | Date |
|---|---|---|---|---|
| Function-Consistent Feature Distillation | Dongyang Liu, Meina Kan, Shiguang Shan, Xilin Chen | | 18 | 24 Apr 2023 |
| Smooth and Stepwise Self-Distillation for Object Detection | Jieren Deng, Xiaoxia Zhou, Hao Tian, Zhihong Pan, Derek Aguiar | ObjD | 0 | 09 Mar 2023 |
| Knowledge Distillation on Graphs: A Survey | Yijun Tian, Shichao Pei, Xiangliang Zhang, Chuxu Zhang, Nitesh Chawla | | 28 | 01 Feb 2023 |
| Knowledge Distillation of Transformer-based Language Models Revisited | Chengqiang Lu, Jianwei Zhang, Yunfei Chu, Zhengyu Chen, Jingren Zhou, Fei Wu, Haiqing Chen, Hongxia Yang | VLM | 10 | 29 Jun 2022 |
| Robust Cross-Modal Representation Learning with Progressive Self-Distillation | A. Andonian, Shixing Chen, Raffay Hamid | VLM | 54 | 10 Apr 2022 |
| Controlling the Quality of Distillation in Response-Based Network Compression | Vibhas Kumar Vats, David J. Crandall | | 1 | 19 Dec 2021 |
| Isotonic Data Augmentation for Knowledge Distillation | Wanyun Cui, Sen Yan | | 7 | 03 Jul 2021 |
| Knowledge Distillation: A Survey | Jianping Gou, B. Yu, Stephen J. Maybank, Dacheng Tao | VLM | 2,857 | 09 Jun 2020 |
| On the Inference Calibration of Neural Machine Translation | Shuo Wang, Zhaopeng Tu, Shuming Shi, Yang Liu | | 80 | 03 May 2020 |
| Preparing Lessons: Improve Knowledge Distillation with Better Supervision | Tiancheng Wen, Shenqi Lai, Xueming Qian | | 68 | 18 Nov 2019 |
| Knowledge Distillation by On-the-Fly Native Ensemble | Xu Lan, Xiatian Zhu, S. Gong | | 474 | 12 Jun 2018 |