1904.01802
Correlation Congruence for Knowledge Distillation
3 April 2019
Baoyun Peng
Xiao Jin
Jiaheng Liu
Shunfeng Zhou
Yichao Wu
Yu Liu
Dongsheng Li
Zhaoning Zhang
ArXiv · PDF · HTML
Papers citing
"Correlation Congruence for Knowledge Distillation"
50 / 274 papers shown
Title
Self-Supervised Quantization-Aware Knowledge Distillation
Kaiqi Zhao
Ming Zhao
MQ
38
2
0
17 Mar 2024
LIX: Implicitly Infusing Spatial Geometric Prior Knowledge into Visual Semantic Segmentation for Autonomous Driving
Sicen Guo
Zhiyuan Wu
Qijun Chen
Ioannis Pitas
Rui Fan
37
1
0
13 Mar 2024
V_kD: Improving Knowledge Distillation using Orthogonal Projections
Roy Miles
Ismail Elezi
Jiankang Deng
52
10
0
10 Mar 2024
Adversarial Sparse Teacher: Defense Against Distillation-Based Model Stealing Attacks Using Adversarial Examples
Eda Yilmaz
H. Keles
AAML
16
2
0
08 Mar 2024
Attention-guided Feature Distillation for Semantic Segmentation
Amir M. Mansourian
Arya Jalali
Rozhan Ahmadi
S. Kasaei
30
0
0
08 Mar 2024
A Study of Dropout-Induced Modality Bias on Robustness to Missing Video Frames for Audio-Visual Speech Recognition
Yusheng Dai
Hang Chen
Jun Du
Ruoyu Wang
Shihao Chen
Jie Ma
Haotian Wang
Chin-Hui Lee
45
4
0
07 Mar 2024
On the Effectiveness of Distillation in Mitigating Backdoors in Pre-trained Encoder
Tingxu Han
Shenghan Huang
Ziqi Ding
Dongrui Liu
Yebo Feng
...
Hanwei Qian
Cong Wu
Quanjun Zhang
Yang Liu
Zhenyu Chen
28
8
0
06 Mar 2024
Logit Standardization in Knowledge Distillation
Shangquan Sun
Wenqi Ren
Jingzhi Li
Rui Wang
Xiaochun Cao
37
56
0
03 Mar 2024
Data-efficient Large Vision Models through Sequential Autoregression
Jianyuan Guo
Zhiwei Hao
Chengcheng Wang
Yehui Tang
Han Wu
Han Hu
Kai Han
Chang Xu
VLM
38
10
0
07 Feb 2024
Progressive Multi-task Anti-Noise Learning and Distilling Frameworks for Fine-grained Vehicle Recognition
Dichao Liu
21
0
0
25 Jan 2024
Rethinking Centered Kernel Alignment in Knowledge Distillation
Zikai Zhou
Yunhang Shen
Shitong Shao
Linrui Gong
Shaohui Lin
24
1
0
22 Jan 2024
Bayes Conditional Distribution Estimation for Knowledge Distillation Based on Conditional Mutual Information
Linfeng Ye
Shayan Mohajer Hamidi
Renhao Tan
En-Hui Yang
VLM
37
12
0
16 Jan 2024
Direct Distillation between Different Domains
Jialiang Tang
Shuo Chen
Gang Niu
Hongyuan Zhu
Qiufeng Wang
Chen Gong
Masashi Sugiyama
55
3
0
12 Jan 2024
Revisiting Knowledge Distillation under Distribution Shift
Songming Zhang
Ziyu Lyu
Xiaofeng Chen
29
1
0
25 Dec 2023
TinySAM: Pushing the Envelope for Efficient Segment Anything Model
Han Shu
Wenshuo Li
Yehui Tang
Yiman Zhang
Yihao Chen
Houqiang Li
Yunhe Wang
Xinghao Chen
VLM
44
18
0
21 Dec 2023
StableKD: Breaking Inter-block Optimization Entanglement for Stable Knowledge Distillation
Shiu-hong Kao
Jierun Chen
S.-H. Gary Chan
27
0
0
20 Dec 2023
STaR: Distilling Speech Temporal Relation for Lightweight Speech Self-Supervised Learning Models
Kangwook Jang
Sungnyun Kim
Hoi-Rim Kim
36
1
0
14 Dec 2023
RdimKD: Generic Distillation Paradigm by Dimensionality Reduction
Yi Guo
Yiqian He
Xiaoyang Li
Haotong Qin
Van Tung Pham
Yang Zhang
Shouda Liu
43
1
0
14 Dec 2023
Cosine Similarity Knowledge Distillation for Individual Class Information Transfer
Gyeongdo Ham
Seonghak Kim
Suin Lee
Jae-Hyeok Lee
Daeshik Kim
32
5
0
24 Nov 2023
Comparative Knowledge Distillation
Alex Wilf
Alex Tianyi Xu
Paul Pu Liang
A. Obolenskiy
Daniel Fried
Louis-Philippe Morency
VLM
18
1
0
03 Nov 2023
One-for-All: Bridge the Gap Between Heterogeneous Architectures in Knowledge Distillation
Zhiwei Hao
Jianyuan Guo
Kai Han
Yehui Tang
Han Hu
Yunhe Wang
Chang Xu
41
58
0
30 Oct 2023
Understanding the Effects of Projectors in Knowledge Distillation
Yudong Chen
Sen Wang
Jiajun Liu
Xuwei Xu
Frank de Hoog
Brano Kusy
Zi Huang
26
0
0
26 Oct 2023
I²MD: 3D Action Representation Learning with Inter- and Intra-modal Mutual Distillation
Yunyao Mao
Jiajun Deng
Wen-gang Zhou
Zhenbo Lu
Wanli Ouyang
Houqiang Li
VLM
30
1
0
24 Oct 2023
Learning Unified Representations for Multi-Resolution Face Recognition
Hulingxiao He
Wu Yuan
Yidian Huang
Shilong Zhao
Wen Yuan
Hanqin Li
CVBM
15
0
0
14 Oct 2023
Noise-Tolerant Unsupervised Adapter for Vision-Language Models
Eman Ali
Dayan Guan
Muhammad Haris Khan
Abdulmotaleb Elsaddik
VLM
24
0
0
26 Sep 2023
A Sentence Speaks a Thousand Images: Domain Generalization through Distilling CLIP with Language Guidance
Zeyi Huang
Andy Zhou
Zijian Lin
Mu Cai
Haohan Wang
Yong Jae Lee
VLM
OOD
32
28
0
21 Sep 2023
Unified Contrastive Fusion Transformer for Multimodal Human Action Recognition
Kyoung Ok Yang
Junho Koh
Jun-Won Choi
28
0
0
10 Sep 2023
Knowledge Distillation Layer that Lets the Student Decide
Ada Gorgun
Y. Z. Gürbüz
Aydin Alatan
23
0
0
06 Sep 2023
MoMA: Momentum Contrastive Learning with Multi-head Attention-based Knowledge Distillation for Histopathology Image Analysis
T. Vuong
J. T. Kwak
41
6
0
31 Aug 2023
Computation-efficient Deep Learning for Computer Vision: A Survey
Yulin Wang
Yizeng Han
Chaofei Wang
Shiji Song
Qi Tian
Gao Huang
VLM
34
20
0
27 Aug 2023
MDCS: More Diverse Experts with Consistency Self-distillation for Long-tailed Recognition
QiHao Zhao
Chen Jiang
Wei Hu
Fangying Zhang
Jun Liu
35
11
0
19 Aug 2023
SRMAE: Masked Image Modeling for Scale-Invariant Deep Representations
Zhiming Wang
Lin Gu
Feng Lu
28
0
0
17 Aug 2023
MixBCT: Towards Self-Adapting Backward-Compatible Training
Yuefeng Liang
Yufeng Zhang
Shiliang Zhang
Yaowei Wang
Shengze Xiao
KenLi Li
Xiaoyu Wang
24
1
0
14 Aug 2023
Teacher-Student Architecture for Knowledge Distillation: A Survey
Chengming Hu
Xuan Li
Danyang Liu
Haolun Wu
Xi Chen
Ju Wang
Xue Liu
21
16
0
08 Aug 2023
AICSD: Adaptive Inter-Class Similarity Distillation for Semantic Segmentation
Amir M. Mansourian
Rozhan Ahmadi
S. Kasaei
41
2
0
08 Aug 2023
Distribution Shift Matters for Knowledge Distillation with Webly Collected Images
Jialiang Tang
Shuo Chen
Gang Niu
Masashi Sugiyama
Chenggui Gong
23
13
0
21 Jul 2023
Review helps learn better: Temporal Supervised Knowledge Distillation
Dongwei Wang
Zhi Han
Yanmei Wang
Xi’ai Chen
Baichen Liu
Yandong Tang
60
1
0
03 Jul 2023
Data-Free Quantization via Mixed-Precision Compensation without Fine-Tuning
Jun Chen
Shipeng Bai
Tianxin Huang
Mengmeng Wang
Guanzhong Tian
Y. Liu
MQ
36
18
0
02 Jul 2023
Audio Embeddings as Teachers for Music Classification
Yiwei Ding
Alexander Lerch
30
5
0
30 Jun 2023
A Dimensional Structure based Knowledge Distillation Method for Cross-Modal Learning
Hui Xiong
Hongwei Dong
Jingyao Wang
J. Yu
Wen-jie Zhai
Changwen Zheng
Fanjiang Xu
Gang Hua
24
1
0
28 Jun 2023
Categories of Response-Based, Feature-Based, and Relation-Based Knowledge Distillation
Chuanguang Yang
Xinqiang Yu
Zhulin An
Yongjun Xu
VLM
OffRL
86
22
0
19 Jun 2023
VanillaKD: Revisit the Power of Vanilla Knowledge Distillation from Small Scale to Large Scale
Zhiwei Hao
Jianyuan Guo
Kai Han
Han Hu
Chang Xu
Yunhe Wang
35
16
0
25 May 2023
Deakin RF-Sensing: Experiments on Correlated Knowledge Distillation for Monitoring Human Postures with Radios
Shiva Raj Pokhrel
Jonathan Kua
Deol Satish
Phil Williams
A. Zaslavsky
S. W. Loke
Jinho D. Choi
32
4
0
24 May 2023
Decoupled Kullback-Leibler Divergence Loss
Jiequan Cui
Zhuotao Tian
Zhisheng Zhong
Xiaojuan Qi
Bei Yu
Hanwang Zhang
39
38
0
23 May 2023
NORM: Knowledge Distillation via N-to-One Representation Matching
Xiaolong Liu
Lujun Li
Chao Li
Anbang Yao
55
68
0
23 May 2023
Student-friendly Knowledge Distillation
Mengyang Yuan
Bo Lang
Fengnan Quan
20
17
0
18 May 2023
Visual Tuning
Bruce X. B. Yu
Jianlong Chang
Haixin Wang
Lin Liu
Shijie Wang
...
Lingxi Xie
Haojie Li
Zhouchen Lin
Qi Tian
Chang Wen Chen
VLM
49
38
0
10 May 2023
DynamicKD: An Effective Knowledge Distillation via Dynamic Entropy Correction-Based Distillation for Gap Optimizing
Songling Zhu
Ronghua Shang
Bo Yuan
Weitong Zhang
Yangyang Li
Licheng Jiao
33
7
0
09 May 2023
CORSD: Class-Oriented Relational Self Distillation
Muzhou Yu
S. Tan
Kailu Wu
Runpei Dong
Linfeng Zhang
Kaisheng Ma
24
0
0
28 Apr 2023
Class Attention Transfer Based Knowledge Distillation
Ziyao Guo
Haonan Yan
Hui Li
Xiao-La Lin
16
61
0
25 Apr 2023