Explaining Knowledge Distillation by Quantifying the Knowledge

7 March 2020
Xu Cheng, Zhefan Rao, Yilan Chen, Quanshi Zhang

Papers citing "Explaining Knowledge Distillation by Quantifying the Knowledge"

23 papers

CKD: Contrastive Knowledge Distillation from A Sample-wise Perspective
Wencheng Zhu, Xin Zhou, Pengfei Zhu, Yu Wang, Qinghua Hu
22 Apr 2024 · VLM

Iterative Data Smoothing: Mitigating Reward Overfitting and Overoptimization in RLHF
Banghua Zhu, Michael I. Jordan, Jiantao Jiao
29 Jan 2024

Zone Evaluation: Revealing Spatial Bias in Object Detection
Zhaohui Zheng, Yuming Chen, Qibin Hou, Xiang Li, Ping Wang, Ming-Ming Cheng
20 Oct 2023 · ObjD

Interpretation on Multi-modal Visual Fusion
Hao Chen, Hao Zhou, Yongjian Deng
19 Aug 2023

Teacher-Student Architecture for Knowledge Distillation: A Survey
Chengming Hu, Xuan Li, Danyang Liu, Haolun Wu, Xi Chen, Ju Wang, Xue Liu
08 Aug 2023

Frameless Graph Knowledge Distillation
Dai Shi, Zhiqi Shao, Yi Guo, Junbin Gao
13 Jul 2023

Exploring the Lottery Ticket Hypothesis with Explainability Methods: Insights into Sparse Network Performance
Shantanu Ghosh, Kayhan Batmanghelich
07 Jul 2023

Towards a Smaller Student: Capacity Dynamic Distillation for Efficient Image Retrieval
Yi Xie, Huaidong Zhang, Xuemiao Xu, Jianqing Zhu, Shengfeng He
16 Mar 2023 · VLM

Graph-based Knowledge Distillation: A survey and experimental evaluation
Jing Liu, Tongya Zheng, Guanzheng Zhang, Qinfen Hao
27 Feb 2023

Practical Knowledge Distillation: Using DNNs to Beat DNNs
Chungman Lee, Pavlos Anastasios Apostolopulos, Igor L. Markov
23 Feb 2023 · FedML

Teacher-Student Architecture for Knowledge Learning: A Survey
Chengming Hu, Xuan Li, Dan Liu, Xi Chen, Ju Wang, Xue Liu
28 Oct 2022

A General Multiple Data Augmentation Based Framework for Training Deep Neural Networks
Bin Hu, Yu Sun, A. K. Qin
29 May 2022 · AI4CE

Class-Incremental Continual Learning into the eXtended DER-verse
Matteo Boschini, Lorenzo Bonicelli, Pietro Buzzega, Angelo Porrello, Simone Calderara
03 Jan 2022 · CLL, BDL

Boosting Unsupervised Domain Adaptation with Soft Pseudo-label and Curriculum Learning
Shengjia Zhang, Tiancheng Lin, Yi Tian Xu
03 Dec 2021

Visualizing the Emergence of Intermediate Visual Patterns in DNNs
Mingjie Li, Shaobo Wang, Quanshi Zhang
05 Nov 2021

Instance-Conditional Knowledge Distillation for Object Detection
Zijian Kang, Peizhen Zhang, Xinming Zhang, Jian Sun, N. Zheng
25 Oct 2021

Visualizing the embedding space to explain the effect of knowledge distillation
Hyun Seung Lee, C. Wallraven
09 Oct 2021

Partial to Whole Knowledge Distillation: Progressive Distilling Decomposed Knowledge Boosts Student Better
Xuanyang Zhang, Xinming Zhang, Jian Sun
26 Sep 2021

Dynamic Knowledge Distillation With Noise Elimination for RGB-D Salient Object Detection
Guangyu Ren, Yinxiao Yu, Hengyan Liu, Tania Stathaki
17 Jun 2021

Knowledge Distillation as Semiparametric Inference
Tri Dao, G. Kamath, Vasilis Syrgkanis, Lester W. Mackey
20 Apr 2021

Student Network Learning via Evolutionary Knowledge Distillation
Kangkai Zhang, Chunhui Zhang, Shikun Li, Dan Zeng, Shiming Ge
23 Mar 2021

Interpreting and Disentangling Feature Components of Various Complexity from DNNs
Jie Ren, Mingjie Li, Zexu Liu, Quanshi Zhang
29 Jun 2020 · CoGe

Knowledge Distillation: A Survey
Jianping Gou, B. Yu, Stephen J. Maybank, Dacheng Tao
09 Jun 2020 · VLM