Cross-Layer Distillation with Semantic Calibration (arXiv 2012.03236)
6 December 2020
Defang Chen, Jian-Ping Mei, Yuan Zhang, Can Wang, Yan Feng, Chun Chen
Papers citing "Cross-Layer Distillation with Semantic Calibration" (50 of 118 papers shown)

JointDistill: Adaptive Multi-Task Distillation for Joint Depth Estimation and Scene Segmentation
Tiancong Cheng, Ying Zhang, Y. Liang, R. Zimmermann, Zhiwen Yu, Bin Guo · VLM · 15 May 2025

DNAD: Differentiable Neural Architecture Distillation
Xuan Rao, Bo Zhao, Derong Liu · 25 Apr 2025

Distilling Knowledge from Heterogeneous Architectures for Semantic Segmentation
Y. Huang, Kai Hu, Y. Zhang, Z. Chen, Xieping Gao · 10 Apr 2025

VRM: Knowledge Distillation via Virtual Relation Matching
W. Zhang, Fei Xie, Weidong Cai, Chao Ma · 28 Feb 2025

Rethinking Knowledge in Distillation: An In-context Sample Retrieval Perspective
Jinjing Zhu, Songze Li, Lin Wang · 13 Jan 2025

Cross-View Consistency Regularisation for Knowledge Distillation
W. Zhang, Dongnan Liu, Weidong Cai, Chao Ma · 21 Dec 2024

Neural Collapse Inspired Knowledge Distillation
Shuoxi Zhang, Zijian Song, Kun He · 16 Dec 2024

SWITCH: Studying with Teacher for Knowledge Distillation of Large Language Models
Jahyun Koo, Yerin Hwang, Yongil Kim, Taegwan Kang, Hyunkyung Bae, Kyomin Jung · 25 Oct 2024

Breaking Modality Gap in RGBT Tracking: Coupled Knowledge Distillation
Andong Lu, Jiacong Zhao, Chenglong Li, Yun Xiao, B. Luo · 15 Oct 2024

LOBG: Less Overfitting for Better Generalization in Vision-Language Model
Chenhao Ding, Xinyuan Gao, Songlin Dong, Yuhang He, Qiang Wang, Alex C. Kot, Yihong Gong · VLM · 14 Oct 2024

Distilling Invariant Representations with Dual Augmentation
Nikolaos Giakoumoglou, Tania Stathaki · 12 Oct 2024

Conditional Image Synthesis with Diffusion Models: A Survey
Zheyuan Zhan, Defang Chen, Jian-Ping Mei, Zhenghe Zhao, Jiawei Chen, Chun Chen, Siwei Lyu, Can Wang · VLM · 28 Sep 2024

Applications of Knowledge Distillation in Remote Sensing: A Survey
Yassine Himeur, N. Aburaed, O. Elharrouss, Iraklis Varlamis, Shadi Atalla, W. Mansoor, Hussain Al Ahmad · 18 Sep 2024

Frequency-Guided Masking for Enhanced Vision Self-Supervised Learning
Amin Karimi Monsefi, Mengxi Zhou, Nastaran Karimi Monsefi, Ser-Nam Lim, Wei-Lun Chao, R. Ramnath · 16 Sep 2024

Low-Resolution Object Recognition with Cross-Resolution Relational Contrastive Distillation
Kangkai Zhang, Shiming Ge, Ruixin Shi, Dan Zeng · 04 Sep 2024

Low-Resolution Face Recognition via Adaptable Instance-Relation Distillation
Ruixin Shi, Weijia Guo, Shiming Ge · CVBM · 03 Sep 2024

Make a Strong Teacher with Label Assistance: A Novel Knowledge Distillation Approach for Semantic Segmentation
Shoumeng Qiu, Jie Chen, Xinrun Li, Ru Wan, Xiangyang Xue, Jian Pu · VLM · 18 Jul 2024

Relational Representation Distillation
Nikolaos Giakoumoglou, Tania Stathaki · 16 Jul 2024

A Survey on Symbolic Knowledge Distillation of Large Language Models
Kamal Acharya, Alvaro Velasquez, H. Song · SyDa · 12 Jul 2024

3M-Health: Multimodal Multi-Teacher Knowledge Distillation for Mental Health Detection
R. Cabral, Siwen Luo, Josiah Poon, S. Han · 12 Jul 2024

Reprogramming Distillation for Medical Foundation Models
Yuhang Zhou, Siyuan Du, Haolin Li, Jiangchao Yao, Ya Zhang, Yanfeng Wang · 09 Jul 2024

AMD: Automatic Multi-step Distillation of Large-scale Vision Models
Cheng Han, Qifan Wang, S. Dianat, Majid Rabbani, Raghuveer M. Rao, Yi Fang, Qiang Guan, Lifu Huang, Dongfang Liu · VLM · 05 Jul 2024

SelfReg-UNet: Self-Regularized UNet for Medical Image Segmentation
Wenhui Zhu, Xiwen Chen, Peijie Qiu, Mohammad Farazi, Aristeidis Sotiras, Abolfazl Razi, Yalin Wang · SSeg, SSL · 21 Jun 2024

Lightweight Model Pre-training via Language Guided Knowledge Distillation
Mingsheng Li, Lin Zhang, Mingzhen Zhu, Zilong Huang, Gang Yu, Jiayuan Fan, Tao Chen · 17 Jun 2024

Adaptive Teaching with Shared Classifier for Knowledge Distillation
Jaeyeon Jang, Young-Ik Kim, Jisu Lim, Hyeonseong Lee · 12 Jun 2024

DistilDoc: Knowledge Distillation for Visually-Rich Document Applications
Jordy Van Landeghem, Subhajit Maity, Ayan Banerjee, Matthew Blaschko, Marie-Francine Moens, Josep Lladós, Sanket Biswas · 12 Jun 2024

Lightweight Deep Learning for Resource-Constrained Environments: A Survey
Hou-I Liu, Marco Galindo, Hongxia Xie, Lai-Kuan Wong, Hong-Han Shuai, Yung-Hui Li, Wen-Huang Cheng · 08 Apr 2024

On the Surprising Efficacy of Distillation as an Alternative to Pre-Training Small Models
Sean Farhat, Deming Chen · 04 Apr 2024

Improve Knowledge Distillation via Label Revision and Data Selection
Weichao Lan, Yiu-ming Cheung, Qing Xu, Buhua Liu, Zhikai Hu, Mengke Li, Zhenghua Chen · 03 Apr 2024

Scale Decoupled Distillation
Shicai Wei · 20 Mar 2024

TIE-KD: Teacher-Independent and Explainable Knowledge Distillation for Monocular Depth Estimation
Sangwon Choi, Daejune Choi, Duksu Kim · 22 Feb 2024

Data Distribution Distilled Generative Model for Generalized Zero-Shot Recognition
Yijie Wang, Mingjian Hong, Luwen Huangfu, Shengyue Huang · SyDa · 18 Feb 2024

GraphKD: Exploring Knowledge Distillation Towards Document Object Detection with Structured Graph Creation
Ayan Banerjee, Sanket Biswas, Josep Lladós, Umapada Pal · 17 Feb 2024

On Good Practices for Task-Specific Distillation of Large Pretrained Visual Models
Juliette Marrie, Michael Arbel, Julien Mairal, Diane Larlus · VLM, MQ · 17 Feb 2024

FedD2S: Personalized Data-Free Federated Knowledge Distillation
Kawa Atapour, S. J. Seyedmohammadi, J. Abouei, Arash Mohammadi, Konstantinos N. Plataniotis · FedML · 16 Feb 2024

NutePrune: Efficient Progressive Pruning with Numerous Teachers for Large Language Models
Shengrui Li, Junzhe Chen, Xueting Han, Jing Bai · 15 Feb 2024

Cooperative Knowledge Distillation: A Learner Agnostic Approach
Michael J. Livanos, Ian Davidson, Stephen Wong · 02 Feb 2024

A Deep Hierarchical Feature Sparse Framework for Occluded Person Re-Identification
Yihu Song, Shuaishi Liu · 15 Jan 2024

Direct Distillation between Different Domains
Jialiang Tang, Shuo Chen, Gang Niu, Hongyuan Zhu, Joey Tianyi Zhou, Chen Gong, Masashi Sugiyama · 12 Jan 2024

Knowledge Translation: A New Pathway for Model Compression
Wujie Sun, Defang Chen, Jiawei Chen, Yan Feng, Chun Chen, Can Wang · 11 Jan 2024

Temporal Knowledge Distillation for Time-Sensitive Financial Services Applications
Hongda Shen, Eren Kurshan · AAML · 28 Dec 2023

StableKD: Breaking Inter-block Optimization Entanglement for Stable Knowledge Distillation
Shiu-hong Kao, Jierun Chen, S.-H. Gary Chan · 20 Dec 2023

RdimKD: Generic Distillation Paradigm by Dimensionality Reduction
Yi Guo, Yiqian He, Xiaoyang Li, Haotong Qin, Van Tung Pham, Yang Zhang, Shouda Liu · 14 Dec 2023

SpliceMix: A Cross-scale and Semantic Blending Augmentation Strategy for Multi-label Image Classification
Lei Wang, Yibing Zhan, Leilei Ma, Dapeng Tao, Liang Ding, Chen Gong · 26 Nov 2023

Education distillation: getting student models to learn in schools
Ling Feng, Danyang Li, Tianhao Wu, Xuliang Duan · FedML · 23 Nov 2023

Understanding the Effects of Projectors in Knowledge Distillation
Yudong Chen, Sen Wang, Jiajun Liu, Xuwei Xu, Frank de Hoog, Brano Kusy, Zi Huang · 26 Oct 2023

Leveraging Vision-Language Models for Improving Domain Generalization in Image Classification
Sravanti Addepalli, Ashish Ramayee Asokan, Lakshay Sharma, R. V. Babu · VLM · 12 Oct 2023

Atom-Motif Contrastive Transformer for Molecular Property Prediction
Wentao Yu, Shuo Chen, Chen Gong, Gang Niu, Masashi Sugiyama · ViT · 11 Oct 2023

Bidirectional Knowledge Reconfiguration for Lightweight Point Cloud Analysis
Peipei Li, Xing Cui, Yibo Hu, Man Zhang, Ting Yao, Tao Mei · 08 Oct 2023

LumiNet: The Bright Side of Perceptual Knowledge Distillation
Md. Ismail Hossain, M. M. L. Elahi, Sameera Ramasinghe, A. Cheraghian, Fuad Rahman, Nabeel Mohammed, Shafin Rahman · 05 Oct 2023