Online Knowledge Distillation with Diverse Peers
Defang Chen, Jian-Ping Mei, Can Wang, Yan Feng, Chun-Yen Chen · FedML
arXiv:1912.00350 · 1 December 2019

Papers citing "Online Knowledge Distillation with Diverse Peers"

50 of 51 citing papers shown
Knowledge Distillation with Adapted Weight
  Sirong Wu, Xi Luo, Junjie Liu, Yuhui Deng · 06 Jan 2025
Classroom-Inspired Multi-Mentor Distillation with Adaptive Learning Strategies
  Shalini Sarode, Muhammad Saif Ullah Khan, Tahira Shehzadi, Didier Stricker, Muhammad Zeshan Afzal · 30 Sep 2024
Online Multi-level Contrastive Representation Distillation for Cross-Subject fNIRS Emotion Recognition
  Zhili Lai, Chunmei Qing, Junpeng Tan, Wanxiang Luo, Xiangmin Xu · 24 Sep 2024
Enhancing Accuracy and Parameter-Efficiency of Neural Representations for Network Parameterization
  Hongjun Choi, Jayaraman J. Thiagarajan, Ruben Glatt, Shusen Liu · 29 Jun 2024
DistilDoc: Knowledge Distillation for Visually-Rich Document Applications
  Jordy Van Landeghem, Subhajit Maity, Ayan Banerjee, Matthew Blaschko, Marie-Francine Moens, Josep Lladós, Sanket Biswas · 12 Jun 2024
Decoupled Knowledge with Ensemble Learning for Online Distillation
  Baitan Shao, Ying Chen · 18 Dec 2023
Teacher-Student Architecture for Knowledge Distillation: A Survey
  Chengming Hu, Xuan Li, Danyang Liu, Haolun Wu, Xi Chen, Ju Wang, Xue Liu · 08 Aug 2023
Frameless Graph Knowledge Distillation
  Dai Shi, Zhiqi Shao, Yi Guo, Junbin Gao · 13 Jul 2023
Is Synthetic Data From Diffusion Models Ready for Knowledge Distillation?
  Zheng Li, Yuxuan Li, Penghai Zhao, Renjie Song, Xiang Li, Jian Yang · 22 May 2023
Generalization Matters: Loss Minima Flattening via Parameter Hybridization for Efficient Online Knowledge Distillation
  Tianli Zhang, Mengqi Xue, Jiangtao Zhang, Haofei Zhang, Yu Wang, Lechao Cheng, Mingli Song · 26 Mar 2023
LightTS: Lightweight Time Series Classification with Adaptive Ensemble Distillation -- Extended Version
  David Campos, Miao Zhang, B. Yang, Tung Kieu, Chenjuan Guo, Christian S. Jensen · AI4TS · 24 Feb 2023
Supervision Complexity and its Role in Knowledge Distillation
  Hrayr Harutyunyan, A. S. Rawat, A. Menon, Seungyeon Kim, Surinder Kumar · 28 Jan 2023
BD-KD: Balancing the Divergences for Online Knowledge Distillation
  Ibtihel Amara, N. Sepahvand, B. Meyer, W. Gross, J. Clark · 25 Dec 2022
Curriculum Temperature for Knowledge Distillation
  Zheng Li, Xiang Li, Lingfeng Yang, Borui Zhao, Renjie Song, Lei Luo, Jun Yu Li, Jian Yang · 29 Nov 2022
Accelerating Diffusion Sampling with Classifier-based Feature Distillation
  Wujie Sun, Defang Chen, Can Wang, Deshi Ye, Yan Feng, Chun-Yen Chen · 22 Nov 2022
Teacher-Student Architecture for Knowledge Learning: A Survey
  Chengming Hu, Xuan Li, Dan Liu, Xi Chen, Ju Wang, Xue Liu · 28 Oct 2022
Online Cross-Layer Knowledge Distillation on Graph Neural Networks with Deep Supervision
  Jiongyu Guo, Defang Chen, Can Wang · 25 Oct 2022
Online Knowledge Distillation via Mutual Contrastive Learning for Visual Recognition
  Chuanguang Yang, Zhulin An, Helong Zhou, Fuzhen Zhuang, Yongjun Xu, Qian Zhang · 23 Jul 2022
Large-scale Knowledge Distillation with Elastic Heterogeneous Computing Resources
  Ji Liu, Daxiang Dong, Xi Wang, An Qin, Xingjian Li, P. Valduriez, Dejing Dou, Dianhai Yu · 14 Jul 2022
Multi scale Feature Extraction and Fusion for Online Knowledge Distillation
  Panpan Zou, Yinglei Teng, Tao Niu · 16 Jun 2022
Parameter-Efficient and Student-Friendly Knowledge Distillation
  Jun Rao, Xv Meng, Liang Ding, Shuhan Qi, Dacheng Tao · 28 May 2022
Overcoming Catastrophic Forgetting in Incremental Object Detection via Elastic Response Distillation
  Tao Feng, Mang Wang, Hangjie Yuan · ObjD, CLL · 05 Apr 2022
Self-Distillation from the Last Mini-Batch for Consistency Regularization
  Yiqing Shen, Liwu Xu, Yuzhe Yang, Yaqian Li, Yandong Guo · 30 Mar 2022
Knowledge Distillation with the Reused Teacher Classifier
  Defang Chen, Jianhan Mei, Hailin Zhang, C. Wang, Yan Feng, Chun-Yen Chen · 26 Mar 2022
Channel Self-Supervision for Online Knowledge Distillation
  Shixi Fan, Xuan Cheng, Xiaomin Wang, Chun Yang, Pan Deng, Minghui Liu, Jiali Deng, Meilin Liu · 22 Mar 2022
Learn From the Past: Experience Ensemble Knowledge Distillation
  Chaofei Wang, Shaowei Zhang, S. Song, Gao Huang · 25 Feb 2022
Collaborative Training of Heterogeneous Reinforcement Learning Agents in Environments with Sparse Rewards: What and When to Share?
  Alain Andres, Esther Villar-Rodriguez, Javier Del Ser · 24 Feb 2022
Knowledge Distillation with Deep Supervision
  Shiya Luo, Defang Chen, Can Wang · 16 Feb 2022
Data-Free Knowledge Transfer: A Survey
  Yuang Liu, Wei Zhang, Jun Wang, Jianyong Wang · 31 Dec 2021
Safe Distillation Box
  Jingwen Ye, Yining Mao, Mingli Song, Xinchao Wang, Cheng Jin, Xiuming Zhang · AAML · 05 Dec 2021
Improved Knowledge Distillation via Adversarial Collaboration
  Zhiqiang Liu, Chengkai Huang, Yanxia Liu · 29 Nov 2021
EvDistill: Asynchronous Events to End-task Learning via Bidirectional Reconstruction-guided Cross-modal Knowledge Distillation
  Lin Wang, Yujeong Chae, Sung-Hoon Yoon, Tae-Kyun Kim, Kuk-Jin Yoon · 24 Nov 2021
Meta-Teacher For Face Anti-Spoofing
  Yunxiao Qin, Zitong Yu, Longbin Yan, Zezheng Wang, Chenxu Zhao, Zhen Lei · CVBM · 12 Nov 2021
LGD: Label-guided Self-distillation for Object Detection
  Peizhen Zhang, Zijian Kang, Tong Yang, Xinming Zhang, N. Zheng, Jian Sun · ObjD · 23 Sep 2021
Knowledge Distillation Using Hierarchical Self-Supervision Augmented Distribution
  Chuanguang Yang, Zhulin An, Linhang Cai, Yongjun Xu · 07 Sep 2021
Dual Transfer Learning for Event-based End-task Prediction via Pluggable Event to Image Translation
  Lin Wang, Yujeong Chae, Kuk-Jin Yoon · 04 Sep 2021
Online Knowledge Distillation for Efficient Pose Estimation
  Zheng Li, Jingwen Ye, Xiuming Zhang, Ying Huang, Zhigeng Pan · 04 Aug 2021
Mutual Contrastive Learning for Visual Representation Learning
  Chuanguang Yang, Zhulin An, Linhang Cai, Yongjun Xu · VLM, SSL · 26 Apr 2021
Dive into Ambiguity: Latent Distribution Mining and Pairwise Uncertainty Estimation for Facial Expression Recognition
  Jiahui She, Yibo Hu, Hailin Shi, Jun Wang, Qiu Shen, Tao Mei · 01 Apr 2021
Distilling a Powerful Student Model via Online Knowledge Distillation
  Shaojie Li, Mingbao Lin, Yan Wang, Yongjian Wu, Yonghong Tian, Ling Shao, Rongrong Ji · FedML · 26 Mar 2021
Student Network Learning via Evolutionary Knowledge Distillation
  Kangkai Zhang, Chunhui Zhang, Shikun Li, Dan Zeng, Shiming Ge · 23 Mar 2021
DICE: Diversity in Deep Ensembles via Conditional Redundancy Adversarial Estimation
  Alexandre Ramé, Matthieu Cord · FedML · 14 Jan 2021
Cross-Layer Distillation with Semantic Calibration
  Defang Chen, Jian-Ping Mei, Yuan Zhang, Can Wang, Yan Feng, Chun-Yen Chen · FedML · 06 Dec 2020
Stochastic Precision Ensemble: Self-Knowledge Distillation for Quantized Deep Neural Networks
  Yoonho Boo, Sungho Shin, Jungwook Choi, Wonyong Sung · MQ · 30 Sep 2020
Densely Guided Knowledge Distillation using Multiple Teacher Assistants
  Wonchul Son, Jaemin Na, Junyong Choi, Wonjun Hwang · 18 Sep 2020
Group Knowledge Transfer: Federated Learning of Large CNNs at the Edge
  Chaoyang He, M. Annavaram, A. Avestimehr · FedML · 28 Jul 2020
Knowledge Distillation: A Survey
  Jianping Gou, B. Yu, Stephen J. Maybank, Dacheng Tao · VLM · 09 Jun 2020
Multi-view Contrastive Learning for Online Knowledge Distillation
  Chuanguang Yang, Zhulin An, Yongjun Xu · 07 Jun 2020
Knowledge Distillation by On-the-Fly Native Ensemble
  Xu Lan, Xiatian Zhu, S. Gong · 12 Jun 2018
Large scale distributed neural network training through online distillation
  Rohan Anil, Gabriel Pereyra, Alexandre Passos, Róbert Ormándi, George E. Dahl, Geoffrey E. Hinton · FedML · 09 Apr 2018