arXiv: 2104.09044
Distilling Knowledge via Knowledge Review
19 April 2021
Pengguang Chen
Shu Liu
Hengshuang Zhao
Jiaya Jia
ArXiv (abs)
PDF
HTML
GitHub (272★)
Papers citing "Distilling Knowledge via Knowledge Review"
50 / 215 papers shown
GenRecal: Generation after Recalibration from Large to Small Vision-Language Models
Byung-Kwan Lee
Ryo Hachiuma
Yong Man Ro
Yu-Chun Wang
Yueh-Hua Wu
VLM
38
0
0
18 Jun 2025
I²S-TFCKD: Intra-Inter Set Knowledge Distillation with Time-Frequency Calibration for Speech Enhancement
Jiaming Cheng
Ruiyu Liang
Chao Xu
Ye Ni
Wei Zhou
Björn W. Schuller
Xiaoshuai Hao
16
0
0
16 Jun 2025
VLCD: Vision-Language Contrastive Distillation for Accurate and Efficient Automatic Placenta Analysis
Manas Mehta
Yimu Pan
Kelly Gallagher
Alison D. Gernand
Jeffery A. Goldstein
Delia Mwinyelle
Leena Mithal
J. Z. Wang
21
0
0
02 Jun 2025
SCOUT: Teaching Pre-trained Language Models to Enhance Reasoning via Flow Chain-of-Thought
Guanghao Li
Wenhao Jiang
Mingfeng Chen
Yan Li
Hao Yu
Shuting Dong
Tao Ren
Ming Tang
Chun Yuan
ReLM
LRM
28
0
0
30 May 2025
InfoSAM: Fine-Tuning the Segment Anything Model from An Information-Theoretic Perspective
Yuanhong Zhang
Muyao Yuan
Weizhan Zhang
Tieliang Gong
Wen Wen
Jiangyong Ying
Weijie Shi
VLM
34
0
0
28 May 2025
DOGe: Defensive Output Generation for LLM Protection Against Knowledge Distillation
Pingzhi Li
Zhen Tan
Huaizhi Qu
Huan Liu
Tianlong Chen
AAML
42
0
0
26 May 2025
DeepKD: A Deeply Decoupled and Denoised Knowledge Distillation Trainer
Haiduo Huang
Jiangcheng Song
Yadong Zhang
Pengju Ren
69
0
0
21 May 2025
Selective Structured State Space for Multispectral-fused Small Target Detection
Qianqian Zhang
WeiJun Wang
Yunxing Liu
Li Zhou
Hao Zhao
Junshe An
Zihan Wang
Mamba
248
0
0
20 May 2025
FiGKD: Fine-Grained Knowledge Distillation via High-Frequency Detail Transfer
Seonghak Kim
121
0
0
17 May 2025
MoKD: Multi-Task Optimization for Knowledge Distillation
Zeeshan Hayder
A. Cheraghian
Lars Petersson
Mehrtash Harandi
VLM
144
0
0
13 May 2025
Head-Tail-Aware KL Divergence in Knowledge Distillation for Spiking Neural Networks
Tianqing Zhang
Zixin Zhu
Kairong Yu
Hongwei Wang
468
0
0
29 Apr 2025
Swapped Logit Distillation via Bi-level Teacher Alignment
Stephen Ekaputra Limantoro
Jhe-Hao Lin
Chih-Yu Wang
Yi-Lung Tsai
Hong-Han Shuai
Ching-Chun Huang
Wen-Huang Cheng
164
0
0
27 Apr 2025
Weather-Aware Object Detection Transformer for Domain Adaptation
Soheil Gharatappeh
Salimeh Yasaei Sekeh
Vikas Dhiman
ViT
73
0
0
15 Apr 2025
HDC: Hierarchical Distillation for Multi-level Noisy Consistency in Semi-Supervised Fetal Ultrasound Segmentation
Tran Quoc Khanh Le
Nguyen Lan Vi Vu
Ha-Hieu Pham
Xuan-Loc Huynh
T. Nguyen
Minh Huu Nhat Le
Quan Nguyen
Hien Nguyen
94
0
0
14 Apr 2025
Random Conditioning with Distillation for Data-Efficient Diffusion Model Compression
Dohyun Kim
S. Park
Geonhee Han
Seung Wook Kim
Paul Hongsuck Seo
DiffM
106
0
0
02 Apr 2025
Delving Deep into Semantic Relation Distillation
Zhaoyi Yan
Kangjun Liu
Qixiang Ye
86
0
0
27 Mar 2025
Distilling Stereo Networks for Performant and Efficient Leaner Networks
Rafia Rahim
Samuel Woerz
A. Zell
180
0
0
24 Mar 2025
Adaptive Temperature Based on Logits Correlation in Knowledge Distillation
Kazuhiro Matsuyama
Usman Anjum
Satoko Matsuyama
Tetsuo Shoda
J. Zhan
141
0
0
12 Mar 2025
Asymmetric Decision-Making in Online Knowledge Distillation: Unifying Consensus and Divergence
Zhaowei Chen
Borui Zhao
Yuchen Ge
Yuhao Chen
Renjie Song
Jiajun Liang
77
0
0
09 Mar 2025
VRM: Knowledge Distillation via Virtual Relation Matching
W. Zhang
Fei Xie
Weidong Cai
Chao Ma
207
0
0
28 Feb 2025
Optimal Brain Apoptosis
Mingyuan Sun
Zheng Fang
Jiaxu Wang
Junjie Jiang
Delei Kong
Chenming Hu
Yuetong Fang
Renjing Xu
AAML
103
0
0
25 Feb 2025
FedMHO: Heterogeneous One-Shot Federated Learning Towards Resource-Constrained Edge Devices
Dezhong Yao
Yuexin Shi
Tongtong Liu
Zhiqiang Xu
115
1
0
12 Feb 2025
Multi-Level Decoupled Relational Distillation for Heterogeneous Architectures
Yaoxin Yang
Peng Ye
Weihao Lin
Kangcong Li
Yan Wen
Jia Hao
Tao Chen
94
0
0
10 Feb 2025
Contrastive Representation Distillation via Multi-Scale Feature Decoupling
Cuipeng Wang
Tieyuan Chen
Haipeng Wang
103
0
0
09 Feb 2025
Rethinking Knowledge in Distillation: An In-context Sample Retrieval Perspective
Jinjing Zhu
Songze Li
Lin Wang
101
0
0
13 Jan 2025
Wasserstein Distance Rivals Kullback-Leibler Divergence for Knowledge Distillation
Jiaming Lv
Haoyuan Yang
P. Li
162
2
0
11 Dec 2024
Dual-Head Knowledge Distillation: Enhancing Logits Utilization with an Auxiliary Head
Penghui Yang
Chen-Chen Zong
Sheng-Jun Huang
Lei Feng
Bo An
141
1
0
13 Nov 2024
Over-parameterized Student Model via Tensor Decomposition Boosted Knowledge Distillation
Yu-Liang Zhan
Zhong-Yi Lu
Hao Sun
Ze-Feng Gao
75
0
0
10 Nov 2024
Decoupling Dark Knowledge via Block-wise Logit Distillation for Feature-level Alignment
Chengting Yu
Fengzhao Zhang
Ruizhe Chen
Zuozhu Liu
Shurun Tan
Er-ping Li
Aili Wang
88
2
0
03 Nov 2024
Multi-Level Feature Distillation of Joint Teachers Trained on Distinct Image Datasets
Adrian Iordache
B. Alexe
Radu Tudor Ionescu
140
1
0
29 Oct 2024
Preview-based Category Contrastive Learning for Knowledge Distillation
Muhe Ding
Jianlong Wu
Xue Dong
Xiaojie Li
Pengda Qin
Tian Gan
Liqiang Nie
VLM
91
0
0
18 Oct 2024
CAKD: A Correlation-Aware Knowledge Distillation Framework Based on Decoupling Kullback-Leibler Divergence
Zao Zhang
Huaming Chen
Pei Ning
Nan Yang
Dong Yuan
61
1
0
17 Oct 2024
TAS: Distilling Arbitrary Teacher and Student via a Hybrid Assistant
Guopeng Li
Qiang Wang
K. Yan
Shouhong Ding
Yuan Gao
Gui-Song Xia
86
0
0
16 Oct 2024
Convex Distillation: Efficient Compression of Deep Networks via Convex Optimization
Prateek Varshney
Mert Pilanci
165
0
0
09 Oct 2024
Efficient and Robust Knowledge Distillation from A Stronger Teacher Based on Correlation Matching
Wenqi Niu
Yingchao Wang
Guohui Cai
Hanpo Hou
54
1
0
09 Oct 2024
On Efficient Variants of Segment Anything Model: A Survey
Xiaorui Sun
Jing Liu
Jikang Cheng
Xiaofeng Zhu
Ping Hu
VLM
143
7
0
07 Oct 2024
ReTok: Replacing Tokenizer to Enhance Representation Efficiency in Large Language Model
Shuhao Gu
Mengdi Zhao
Bowen Zhang
Liangdong Wang
Jijie Li
Guang Liu
66
3
0
06 Oct 2024
Gap Preserving Distillation by Building Bidirectional Mappings with A Dynamic Teacher
Yong Guo
Shulian Zhang
Haolin Pan
Jing Liu
Yulun Zhang
Jian Chen
84
0
0
05 Oct 2024
Student-Oriented Teacher Knowledge Refinement for Knowledge Distillation
Yaxin Peng
Yaomin Huang
Haokun Zhu
Jinsong Fan
Guixu Zhang
81
1
0
27 Sep 2024
Harmonizing knowledge Transfer in Neural Network with Unified Distillation
Yaomin Huang
Zaomin Yan
Yaxin Peng
Faming Fang
Guixu Zhang
91
0
0
27 Sep 2024
Enhancing Logits Distillation with Plug&Play Kendall's τ Ranking Loss
Yuchen Guan
Runxi Cheng
Kang Liu
Chun Yuan
86
0
0
26 Sep 2024
Applications of Knowledge Distillation in Remote Sensing: A Survey
Yassine Himeur
N. Aburaed
O. Elharrouss
Iraklis Varlamis
Shadi Atalla
W. Mansoor
Hussain Al Ahmad
92
4
0
18 Sep 2024
EFCM: Efficient Fine-tuning on Compressed Models for deployment of large models in medical image analysis
Shaojie Li
Zhaoshuo Diao
61
0
0
18 Sep 2024
Learn from Balance: Rectifying Knowledge Transfer for Long-Tailed Scenarios
Xinlei Huang
Jialiang Tang
Xubin Zheng
Jinjia Zhou
Wenxin Yu
Ning Jiang
85
0
0
12 Sep 2024
LoCa: Logit Calibration for Knowledge Distillation
Runming Yang
Taiqiang Wu
Yujiu Yang
81
1
0
07 Sep 2024
Adaptive Explicit Knowledge Transfer for Knowledge Distillation
H. Park
Jong-seok Lee
53
1
0
03 Sep 2024
MMDRFuse: Distilled Mini-Model with Dynamic Refresh for Multi-Modality Image Fusion
Yanglin Deng
Tianyang Xu
Chunyang Cheng
Xiao-Jun Wu
Josef Kittler
65
3
0
28 Aug 2024
Attend-Fusion: Efficient Audio-Visual Fusion for Video Classification
Mahrukh Awan
Asmar Nadeem
Muhammad Junaid Awan
Armin Mustafa
Syed Sameed Husain
65
1
0
26 Aug 2024
Knowledge Distillation with Refined Logits
Wujie Sun
Defang Chen
Siwei Lyu
Genlang Chen
Chun-Yen Chen
Can Wang
105
3
0
14 Aug 2024
Deep Companion Learning: Enhancing Generalization Through Historical Consistency
Ruizhao Zhu
Venkatesh Saligrama
FedML
87
0
0
26 Jul 2024