Revisiting Knowledge Distillation via Label Smoothing Regularization (arXiv 1909.11723)
25 September 2019
Li-xin Yuan, Francis E. H. Tay, Guilin Li, Tao Wang, Jiashi Feng

Papers citing "Revisiting Knowledge Distillation via Label Smoothing Regularization" (16 of 16 papers shown)

Towards Comparable Knowledge Distillation in Semantic Image Segmentation
Onno Niemann, Christopher Vox, Thorben Werner
VLM · 25 / 1 / 0 · 07 Sep 2023

SATA: Source Anchoring and Target Alignment Network for Continual Test Time Adaptation
Goirik Chakrabarty, Manogna Sreenivas, Soma Biswas
TTA · 41 / 5 / 0 · 20 Apr 2023

In-context Learning Distillation: Transferring Few-shot Learning Ability of Pre-trained Language Models
Yukun Huang, Yanda Chen, Zhou Yu, Kathleen McKeown
27 / 30 / 0 · 20 Dec 2022

Co-training 2^L Submodels for Visual Recognition
Hugo Touvron, Matthieu Cord, Maxime Oquab, Piotr Bojanowski, Jakob Verbeek, Hervé Jégou
VLM · 35 / 9 / 0 · 09 Dec 2022

Boosting Graph Neural Networks via Adaptive Knowledge Distillation
Zhichun Guo, Chunhui Zhang, Yujie Fan, Yijun Tian, Chuxu Zhang, Nitesh V. Chawla
21 / 32 / 0 · 12 Oct 2022

Using Knowledge Distillation to improve interpretable models in a retail banking context
Maxime Biehler, Mohamed Guermazi, Célim Starck
62 / 2 / 0 · 30 Sep 2022

Rich Feature Distillation with Feature Affinity Module for Efficient Image Dehazing
S. J., Anushri Suresh, Nisha J.S., V. Gopi
VLM · 31 / 6 / 0 · 13 Jul 2022

PrUE: Distilling Knowledge from Sparse Teacher Networks
Shaopu Wang, Xiaojun Chen, Mengzhen Kou, Jinqiao Shi
19 / 2 / 0 · 03 Jul 2022

Towards Accurate Human Pose Estimation in Videos of Crowded Scenes
Li Yuan, Shuning Chang, Xuecheng Nie, Ziyuan Huang, Yichen Zhou, Yupeng Chen, Jiashi Feng, Shuicheng Yan
29 / 15 / 0 · 16 Oct 2020

Densely Guided Knowledge Distillation using Multiple Teacher Assistants
Wonchul Son, Jaemin Na, Junyong Choi, Wonjun Hwang
25 / 111 / 0 · 18 Sep 2020

Prime-Aware Adaptive Distillation
Youcai Zhang, Zhonghao Lan, Yuchen Dai, Fangao Zeng, Yan Bai, Jie Chang, Yichen Wei
18 / 40 / 0 · 04 Aug 2020

Knowledge Distillation: A Survey
Jianping Gou, B. Yu, Stephen J. Maybank, Dacheng Tao
VLM · 19 / 2,843 / 0 · 09 Jun 2020

Self-Distillation as Instance-Specific Label Smoothing
Zhilu Zhang, M. Sabuncu
20 / 116 / 0 · 09 Jun 2020

Understanding and Improving Knowledge Distillation
Jiaxi Tang, Rakesh Shivanna, Zhe Zhao, Dong Lin, Anima Singh, Ed H. Chi, Sagar Jain
27 / 129 / 0 · 10 Feb 2020

Preparing Lessons: Improve Knowledge Distillation with Better Supervision
Tiancheng Wen, Shenqi Lai, Xueming Qian
25 / 67 / 0 · 18 Nov 2019

Large scale distributed neural network training through online distillation
Rohan Anil, Gabriel Pereyra, Alexandre Passos, Róbert Ormándi, George E. Dahl, Geoffrey E. Hinton
FedML · 278 / 404 / 0 · 09 Apr 2018