Online Knowledge Distillation via Multi-branch Diversity Enhancement
arXiv:2010.00795 · 2 October 2020
Zheng Li, Ying Huang, Defang Chen, Tianren Luo, Ning Cai, Zhigeng Pan
Papers citing "Online Knowledge Distillation via Multi-branch Diversity Enhancement" (11 papers shown):
Classroom-Inspired Multi-Mentor Distillation with Adaptive Learning Strategies
Shalini Sarode, Muhammad Saif Ullah Khan, Tahira Shehzadi, Didier Stricker, Muhammad Zeshan Afzal
30 Sep 2024
PromptKD: Unsupervised Prompt Distillation for Vision-Language Models
Zheng Li, Xiang Li, Xinyi Fu, Xing Zhang, Weiqiang Wang, Shuo Chen, Jian Yang
Tags: VLM
05 Mar 2024
Decoupled Knowledge with Ensemble Learning for Online Distillation
Baitan Shao, Ying Chen
18 Dec 2023
Is Synthetic Data From Diffusion Models Ready for Knowledge Distillation?
Zheng Li, Yuxuan Li, Penghai Zhao, Renjie Song, Xiang Li, Jian Yang
22 May 2023
Curriculum Temperature for Knowledge Distillation
Zheng Li, Xiang Li, Lingfeng Yang, Borui Zhao, Renjie Song, Lei Luo, Jun Yu Li, Jian Yang
29 Nov 2022
Learn From the Past: Experience Ensemble Knowledge Distillation
Chaofei Wang, Shaowei Zhang, S. Song, Gao Huang
25 Feb 2022
Improved Knowledge Distillation via Adversarial Collaboration
Zhiqiang Liu, Chengkai Huang, Yanxia Liu
29 Nov 2021
Online Knowledge Distillation for Efficient Pose Estimation
Zheng Li, Jingwen Ye, Xiuming Zhang, Ying Huang, Zhigeng Pan
04 Aug 2021
Knowledge Distillation by On-the-Fly Native Ensemble
Xu Lan, Xiatian Zhu, S. Gong
12 Jun 2018
Large scale distributed neural network training through online distillation
Rohan Anil, Gabriel Pereyra, Alexandre Passos, Róbert Ormándi, George E. Dahl, Geoffrey E. Hinton
Tags: FedML
09 Apr 2018
Aggregated Residual Transformations for Deep Neural Networks
Saining Xie, Ross B. Girshick, Piotr Dollár, Zhuowen Tu, Kaiming He
16 Nov 2016