Knowledge Distillation in Generations: More Tolerant Teachers Educate Better Students
Chenglin Yang, Lingxi Xie, Siyuan Qiao, Alan Yuille
arXiv:1805.05551, 15 May 2018
Papers citing "Knowledge Distillation in Generations: More Tolerant Teachers Educate Better Students" (26 papers)
Distilled Transformers with Locally Enhanced Global Representations for Face Forgery Detection. Yaning Zhang, Qiufu Li, Zitong Yu, Linlin Shen. [ViT] 31 Dec 2024.
Robust Preference Optimization through Reward Model Distillation. Adam Fisch, Jacob Eisenstein, Vicky Zayats, Alekh Agarwal, Ahmad Beirami, Chirag Nagpal, Peter Shaw, Jonathan Berant. 29 May 2024.
EnSiam: Self-Supervised Learning With Ensemble Representations. Kai Han, Minsik Lee. [SSL] 22 May 2023.
Smooth and Stepwise Self-Distillation for Object Detection. Jieren Deng, Xiaoxia Zhou, Hao Tian, Zhihong Pan, Derek Aguiar. [ObjD] 09 Mar 2023.
A Scalable and Efficient Iterative Method for Copying Machine Learning Classifiers. N. Statuto, Irene Unceta, Jordi Nin, O. Pujol. 06 Feb 2023.
Decentralized Learning with Multi-Headed Distillation. A. Zhmoginov, Mark Sandler, Nolan Miller, Gus Kristiansen, Max Vladymyrov. [FedML] 28 Nov 2022.
Distilling Representations from GAN Generator via Squeeze and Span. Yu Yang, Xiaotian Cheng, Chang-rui Liu, Hakan Bilen, Xiang Ji. [GAN] 06 Nov 2022.
Disentangle and Remerge: Interventional Knowledge Distillation for Few-Shot Object Detection from A Conditional Causal Perspective. Jiangmeng Li, Yanan Zhang, Jingyao Wang, Hui Xiong, Chengbo Jiao, Xiaohui Hu, Changwen Zheng, Gang Hua. [CML] 26 Aug 2022.
FS-BAN: Born-Again Networks for Domain Generalization Few-Shot Classification. Yunqing Zhao, Ngai-man Cheung. [BDL] 23 Aug 2022.
Boosting the Adversarial Transferability of Surrogate Models with Dark Knowledge. Dingcheng Yang, Zihao Xiao, Wenjian Yu. [AAML] 16 Jun 2022.
Crowd Localization from Gaussian Mixture Scoped Knowledge and Scoped Teacher. Juncheng Wang, Junyuan Gao, Yuan Yuan, Qi Wang. 12 Jun 2022.
MISSU: 3D Medical Image Segmentation via Self-distilling TransUNet. Nan Wang, Shaohui Lin, Xiaoxiao Li, Ke Li, Yunhang Shen, Yue Gao, Lizhuang Ma. [ViT, MedIm] 02 Jun 2022.
Reducing Flipping Errors in Deep Neural Networks. Xiang Deng, Yun Xiao, Bo Long, Zhongfei Zhang. [AAML] 16 Mar 2022.
Class-Incremental Continual Learning into the eXtended DER-verse. Matteo Boschini, Lorenzo Bonicelli, Pietro Buzzega, Angelo Porrello, Simone Calderara. [CLL, BDL] 03 Jan 2022.
Language Modelling via Learning to Rank. A. Frydenlund, Gagandeep Singh, Frank Rudzicz. 13 Oct 2021.
One Timestep is All You Need: Training Spiking Neural Networks with Ultra Low Latency. Sayeed Shafayet Chowdhury, Nitin Rathi, Kaushik Roy. 01 Oct 2021.
Linking Common Vulnerabilities and Exposures to the MITRE ATT&CK Framework: A Self-Distillation Approach. Benjamin Ampel, Sagar Samtani, Steven Ullman, Hsinchun Chen. 03 Aug 2021.
Isotonic Data Augmentation for Knowledge Distillation. Wanyun Cui, Sen Yan. 03 Jul 2021.
Scalable Transformers for Neural Machine Translation. Peng Gao, Shijie Geng, Ping Luo, Xiaogang Wang, Jifeng Dai, Hongsheng Li. 04 Jun 2021.
Towards Practical Lipreading with Distilled and Efficient Models. Pingchuan Ma, Brais Martínez, Stavros Petridis, Maja Pantic. 13 Jul 2020.
Robust Re-Identification by Multiple Views Knowledge Distillation. Angelo Porrello, Luca Bergamini, Simone Calderara. 08 Jul 2020.
Knowledge Distillation: A Survey. Jianping Gou, B. Yu, Stephen J. Maybank, Dacheng Tao. [VLM] 09 Jun 2020.
Self-Distillation as Instance-Specific Label Smoothing. Zhilu Zhang, M. Sabuncu. 09 Jun 2020.
Circumventing Outliers of AutoAugment with Knowledge Distillation. Longhui Wei, Anxiang Xiao, Lingxi Xie, Xin Chen, Xiaopeng Zhang, Qi Tian. 25 Mar 2020.
Self-Distillation Amplifies Regularization in Hilbert Space. H. Mobahi, Mehrdad Farajtabar, Peter L. Bartlett. 13 Feb 2020.
Neural Architecture Search with Reinforcement Learning. Barret Zoph, Quoc V. Le. 05 Nov 2016.