Distilling Knowledge via Knowledge Review (arXiv:2104.09044)

19 April 2021
Pengguang Chen, Shu Liu, Hengshuang Zhao, Jiaya Jia
ArXiv (abs) · PDF · HTML · GitHub (272★)

Papers citing "Distilling Knowledge via Knowledge Review"

50 / 215 papers shown
Leveraging Foundation Models via Knowledge Distillation in Multi-Object Tracking: Distilling DINOv2 Features to FairMOT
Niels G. Faber
Seyed Sahand Mohamadi Ziabari
Fatemeh Karimi Nejadasl
95
3
0
25 Jul 2024
How to Train the Teacher Model for Effective Knowledge Distillation
Shayan Mohajer Hamidi
Xizhen Deng
Renhao Tan
Linfeng Ye
Ahmed H. Salamah
97
5
0
25 Jul 2024
Continual Distillation Learning: Knowledge Distillation in Prompt-based Continual Learning
Qifan Zhang
Yunhui Guo
Yu Xiang
CLL, VLM
171
0
0
18 Jul 2024
Relational Representation Distillation
Nikolaos Giakoumoglou
Tania Stathaki
125
0
0
16 Jul 2024
HLQ: Fast and Efficient Backpropagation via Hadamard Low-rank Quantization
Seonggon Kim
Eunhyeok Park
90
2
0
21 Jun 2024
Adaptive Teaching with Shared Classifier for Knowledge Distillation
Jaeyeon Jang
Young-Ik Kim
Jisu Lim
Hyeonseong Lee
43
0
0
12 Jun 2024
DistilDoc: Knowledge Distillation for Visually-Rich Document Applications
Jordy Van Landeghem
Subhajit Maity
Ayan Banerjee
Matthew Blaschko
Marie-Francine Moens
Josep Lladós
Sanket Biswas
136
2
0
12 Jun 2024
ReDistill: Residual Encoded Distillation for Peak Memory Reduction of CNNs
Fang Chen
Gourav Datta
Mujahid Al Rafi
Hyeran Jeon
Meng Tang
216
1
0
06 Jun 2024
Estimating Human Poses Across Datasets: A Unified Skeleton and Multi-Teacher Distillation Approach
Muhammad Gul Zain Ali Khan
Dhavalkumar Limbachiya
Didier Stricker
Muhammad Zeshan Afzal
3DH
96
0
0
30 May 2024
Aligning in a Compact Space: Contrastive Knowledge Distillation between Heterogeneous Architectures
Hongjun Wu
Li Xiao
Xingkuo Zhang
Yining Miao
103
1
0
28 May 2024
Exploring Graph-based Knowledge: Multi-Level Feature Distillation via Channels Relational Graph
Zhiwei Wang
Jun Huang
Longhua Ma
Chengyu Wu
Hongyu Ma
93
0
0
14 May 2024
CKD: Contrastive Knowledge Distillation from A Sample-wise Perspective
Wencheng Zhu
Xin Zhou
Pengfei Zhu
Yu Wang
Qinghua Hu
VLM
141
1
0
22 Apr 2024
Dynamic Temperature Knowledge Distillation
Yukang Wei
Yu Bai
87
5
0
19 Apr 2024
MTKD: Multi-Teacher Knowledge Distillation for Image Super-Resolution
Yuxuan Jiang
Chen Feng
Fan Zhang
David Bull
SupR
125
14
0
15 Apr 2024
Lightweight Deep Learning for Resource-Constrained Environments: A Survey
Hou-I Liu
Marco Galindo
Hongxia Xie
Lai-Kuan Wong
Hong-Han Shuai
Yung-Hui Li
Wen-Huang Cheng
130
66
0
08 Apr 2024
On the Surprising Efficacy of Distillation as an Alternative to Pre-Training Small Models
Sean Farhat
Deming Chen
118
0
0
04 Apr 2024
Federated Distillation: A Survey
Lin Li
Jianping Gou
Baosheng Yu
Lan Du
Zhang Yi
Dacheng Tao
DD, FedML
118
8
0
02 Apr 2024
Learning to Project for Cross-Task Knowledge Distillation
Dylan Auty
Roy Miles
Benedikt Kolbeinsson
K. Mikolajczyk
85
0
0
21 Mar 2024
Scale Decoupled Distillation
Shicai Wei
111
6
0
20 Mar 2024
Distill2Explain: Differentiable decision trees for explainable reinforcement learning in energy application controllers
Gargya Gokhale
Seyed Soroush Karimi Madahi
Bert Claessens
Chris Develder
OffRL
59
3
0
18 Mar 2024
LIX: Implicitly Infusing Spatial Geometric Prior Knowledge into Visual Semantic Segmentation for Autonomous Driving
Sicen Guo
Zhiyuan Wu
Qijun Chen
Ioannis Pitas
Rui Fan
106
1
0
13 Mar 2024
$V_kD:$ Improving Knowledge Distillation using Orthogonal Projections
Roy Miles
Ismail Elezi
Jiankang Deng
112
10
0
10 Mar 2024
Frequency Attention for Knowledge Distillation
Cuong Pham
Van-Anh Nguyen
Trung Le
Dinh Q. Phung
Gustavo Carneiro
Thanh-Toan Do
73
18
0
09 Mar 2024
Attention-guided Feature Distillation for Semantic Segmentation
Amir M. Mansourian
Arya Jalali
Rozhan Ahmadi
S. Kasaei
220
0
0
08 Mar 2024
Learning to Maximize Mutual Information for Chain-of-Thought Distillation
Xin Chen
Hanxian Huang
Yanjun Gao
Yi Wang
Jishen Zhao
Ke Ding
100
15
0
05 Mar 2024
Logit Standardization in Knowledge Distillation
Shangquan Sun
Wenqi Ren
Jingzhi Li
Rui Wang
Xiaochun Cao
126
75
0
03 Mar 2024
On the Road to Portability: Compressing End-to-End Motion Planner for Autonomous Driving
Kaituo Feng
Changsheng Li
Dongchun Ren
Ye Yuan
Guoren Wang
106
8
0
02 Mar 2024
Towards Robust and Efficient Cloud-Edge Elastic Model Adaptation via Selective Entropy Distillation
Yaofo Chen
Shuaicheng Niu
Yaowei Wang
Shoukai Xu
Hengjie Song
Mingkui Tan
93
8
0
27 Feb 2024
SKILL: Similarity-aware Knowledge distILLation for Speech Self-Supervised Learning
Luca Zampierin
G. B. Hacene
Bac Nguyen
Mirco Ravanelli
76
3
0
26 Feb 2024
GraphKD: Exploring Knowledge Distillation Towards Document Object Detection with Structured Graph Creation
Ayan Banerjee
Sanket Biswas
Josep Lladós
Umapada Pal
117
2
0
17 Feb 2024
On Good Practices for Task-Specific Distillation of Large Pretrained Visual Models
Juliette Marrie
Michael Arbel
Julien Mairal
Diane Larlus
VLM, MQ
90
1
0
17 Feb 2024
Knowledge Distillation Based on Transformed Teacher Matching
Kaixiang Zheng
En-Hui Yang
109
21
0
17 Feb 2024
Transferring Ultrahigh-Field Representations for Intensity-Guided Brain Segmentation of Low-Field Magnetic Resonance Imaging
Kwanseok Oh
Jieun Lee
Da-Woon Heo
Dinggang Shen
Heung-Il Suk
53
0
0
13 Feb 2024
SepRep-Net: Multi-source Free Domain Adaptation via Model Separation And Reparameterization
Ying Jin
Jiaqi Wang
Dahua Lin
80
2
0
13 Feb 2024
Vision Superalignment: Weak-to-Strong Generalization for Vision Foundation Models
Jianyuan Guo
Hanting Chen
Chengcheng Wang
Kai Han
Chang Xu
Yunhe Wang
VLM
69
22
0
06 Feb 2024
Good Teachers Explain: Explanation-Enhanced Knowledge Distillation
Amin Parchami-Araghi
Moritz Bohle
Sukrut Rao
Bernt Schiele
FAtt
59
4
0
05 Feb 2024
Learning from Teaching Regularization: Generalizable Correlations Should be Easy to Imitate
Can Jin
Tong Che
Hongwu Peng
Yiyuan Li
Dimitris N. Metaxas
Marco Pavone
135
47
0
05 Feb 2024
EPSD: Early Pruning with Self-Distillation for Efficient Model Compression
Dong Chen
Ning Liu
Yichen Zhu
Zhengping Che
Rui Ma
Fachao Zhang
Xiaofeng Mou
Yi Chang
Jian Tang
62
4
0
31 Jan 2024
Rethinking Centered Kernel Alignment in Knowledge Distillation
Zikai Zhou
Yunhang Shen
Shitong Shao
Linrui Gong
Shaohui Lin
101
4
0
22 Jan 2024
Bayes Conditional Distribution Estimation for Knowledge Distillation Based on Conditional Mutual Information
Linfeng Ye
Shayan Mohajer Hamidi
Renhao Tan
En-Hui Yang
VLM
78
15
0
16 Jan 2024
Generative Denoise Distillation: Simple Stochastic Noises Induce Efficient Knowledge Transfer for Dense Prediction
Zhaoge Liu
Xiaohao Xu
Yunkang Cao
Nong Sang
VLM
79
0
0
16 Jan 2024
Graph Relation Distillation for Efficient Biomedical Instance Segmentation
Xiaoyu Liu
Yueyi Zhang
Zhiwei Xiong
Wei-Ping Huang
Bo Hu
Xiaoyan Sun
Feng Wu
113
0
0
12 Jan 2024
Direct Distillation between Different Domains
Jialiang Tang
Shuo Chen
Gang Niu
Hongyuan Zhu
Qiufeng Wang
Chen Gong
Masashi Sugiyama
128
3
0
12 Jan 2024
Compressing Deep Image Super-resolution Models
Yuxuan Jiang
Jakub Nawala
Fan Zhang
David Bull
96
7
0
31 Dec 2023
Cloud-Device Collaborative Learning for Multimodal Large Language Models
Guanqun Wang
Jiaming Liu
Chenxuan Li
Junpeng Ma
Yuan Zhang
...
Kevin Zhang
Maurice Chong
Ray Zhang
Yijiang Liu
Shanghang Zhang
109
8
0
26 Dec 2023
Revisiting Knowledge Distillation under Distribution Shift
Songming Zhang
Ziyu Lyu
Xiaofeng Chen
72
1
0
25 Dec 2023
StableKD: Breaking Inter-block Optimization Entanglement for Stable Knowledge Distillation
Shiu-hong Kao
Jierun Chen
S.-H. Gary Chan
73
0
0
20 Dec 2023
RdimKD: Generic Distillation Paradigm by Dimensionality Reduction
Yi Guo
Yiqian He
Xiaoyang Li
Haotong Qin
Van Tung Pham
Yang Zhang
Shouda Liu
103
1
0
14 Dec 2023
Spatial-wise Dynamic Distillation for MLP-like Efficient Visual Fault Detection of Freight Trains
Yang Zhang
Huilin Pan
Mingying Li
An-Chi Wang
Yang Zhou
Hongliang Ren
76
1
0
10 Dec 2023
Topology-Preserving Adversarial Training
Xiaoyue Mi
Fan Tang
Yepeng Weng
Danding Wang
Juan Cao
Sheng Tang
Peng Li
Yang Liu
102
1
0
29 Nov 2023