Online Knowledge Distillation via Multi-branch Diversity Enhancement

2 October 2020
Zheng Li, Ying Huang, Defang Chen, Tianren Luo, Ning Cai, Zhigeng Pan
arXiv: 2010.00795

Papers citing "Online Knowledge Distillation via Multi-branch Diversity Enhancement" (11 papers)

Classroom-Inspired Multi-Mentor Distillation with Adaptive Learning Strategies
Shalini Sarode, Muhammad Saif Ullah Khan, Tahira Shehzadi, Didier Stricker, Muhammad Zeshan Afzal (30 Sep 2024)

PromptKD: Unsupervised Prompt Distillation for Vision-Language Models [VLM]
Zheng Li, Xiang Li, Xinyi Fu, Xing Zhang, Weiqiang Wang, Shuo Chen, Jian Yang (05 Mar 2024)

Decoupled Knowledge with Ensemble Learning for Online Distillation
Baitan Shao, Ying Chen (18 Dec 2023)

Is Synthetic Data From Diffusion Models Ready for Knowledge Distillation?
Zheng Li, Yuxuan Li, Penghai Zhao, Renjie Song, Xiang Li, Jian Yang (22 May 2023)

Curriculum Temperature for Knowledge Distillation
Zheng Li, Xiang Li, Lingfeng Yang, Borui Zhao, Renjie Song, Lei Luo, Jun Yu Li, Jian Yang (29 Nov 2022)

Learn From the Past: Experience Ensemble Knowledge Distillation
Chaofei Wang, Shaowei Zhang, S. Song, Gao Huang (25 Feb 2022)

Improved Knowledge Distillation via Adversarial Collaboration
Zhiqiang Liu, Chengkai Huang, Yanxia Liu (29 Nov 2021)

Online Knowledge Distillation for Efficient Pose Estimation
Zheng Li, Jingwen Ye, Xiuming Zhang, Ying Huang, Zhigeng Pan (04 Aug 2021)

Knowledge Distillation by On-the-Fly Native Ensemble
Xu Lan, Xiatian Zhu, S. Gong (12 Jun 2018)

Large scale distributed neural network training through online distillation [FedML]
Rohan Anil, Gabriel Pereyra, Alexandre Passos, Róbert Ormándi, George E. Dahl, Geoffrey E. Hinton (09 Apr 2018)

Aggregated Residual Transformations for Deep Neural Networks
Saining Xie, Ross B. Girshick, Piotr Dollár, Zhuowen Tu, Kaiming He (16 Nov 2016)