Correlation Congruence for Knowledge Distillation (arXiv:1904.01802)
3 April 2019
Baoyun Peng, Xiao Jin, Jiaheng Liu, Shunfeng Zhou, Yichao Wu, Yu Liu, Dongsheng Li, Zhaoning Zhang
Papers citing "Correlation Congruence for Knowledge Distillation" (50 of 274 shown)
Refining CLIP's Spatial Awareness: A Visual-Centric Perspective
Congpei Qiu, Yanhao Wu, Wei Ke, Xiuxiu Bai, Tong Zhang [VLM] (03 Apr 2025)
Delving Deep into Semantic Relation Distillation
Zhaoyi Yan, Kangjun Liu, Qixiang Ye (27 Mar 2025)
CustomKD: Customizing Large Vision Foundation for Edge Model Improvement via Knowledge Distillation
Jungsoo Lee, Debasmit Das, Munawar Hayat, Sungha Choi, Kyuwoong Hwang, Fatih Porikli [VLM] (23 Mar 2025)
Adaptive Temperature Based on Logits Correlation in Knowledge Distillation
Kazuhiro Matsuyama, Usman Anjum, Satoko Matsuyama, Tetsuo Shoda, J. Zhan (12 Mar 2025)
VRM: Knowledge Distillation via Virtual Relation Matching
W. Zhang, Fei Xie, Weidong Cai, Chao Ma (28 Feb 2025)
Multi-Level Decoupled Relational Distillation for Heterogeneous Architectures
Yaoxin Yang, Peng Ye, Weihao Lin, Kangcong Li, Yan Wen, Jia Hao, Tao Chen (10 Feb 2025)
MimicGait: A Model Agnostic approach for Occluded Gait Recognition using Correlational Knowledge Distillation
Ayush Gupta, Rama Chellappa [CVBM] (28 Jan 2025)
Variational Bayesian Adaptive Learning of Deep Latent Variables for Acoustic Knowledge Transfer
Hu Hu, Sabato Marco Siniscalchi, Chao-Han Huck Yang, Chin-Hui Lee (28 Jan 2025)
Rethinking Knowledge in Distillation: An In-context Sample Retrieval Perspective
Jinjing Zhu, Songze Li, Lin Wang (13 Jan 2025)
Knowledge Distillation with Adapted Weight
Sirong Wu, Xi Luo, Junjie Liu, Yuhui Deng (06 Jan 2025)
Sample Correlation for Fingerprinting Deep Face Recognition
Jiyang Guan, Jian Liang, Yanbo Wang, Ran He [AAML] (31 Dec 2024)
Cross-View Consistency Regularisation for Knowledge Distillation
W. Zhang, Dongnan Liu, Weidong Cai, Chao Ma (21 Dec 2024)
Neural Collapse Inspired Knowledge Distillation
Shuoxi Zhang, Zijian Song, Kun He (16 Dec 2024)
Wasserstein Distance Rivals Kullback-Leibler Divergence for Knowledge Distillation
Jiaming Lv, Haoyuan Yang, P. Li (11 Dec 2024)
Toward Robust Incomplete Multimodal Sentiment Analysis via Hierarchical Representation Learning
M. Li, Dingkang Yang, Y. Liu, Shunli Wang, Jiawei Chen, ..., Xiaolu Hou, Mingyang Sun, Ziyun Qian, Dongliang Kou, L. Zhang (05 Nov 2024)
Decoupling Dark Knowledge via Block-wise Logit Distillation for Feature-level Alignment
Chengting Yu, Fengzhao Zhang, Ruizhe Chen, Zuozhu Liu, Shurun Tan, Er-ping Li, Aili Wang (03 Nov 2024)
Preview-based Category Contrastive Learning for Knowledge Distillation
Muhe Ding, Jianlong Wu, Xue Dong, Xiaojie Li, Pengda Qin, Tian Gan, Liqiang Nie [VLM] (18 Oct 2024)
Towards Satellite Non-IID Imagery: A Spectral Clustering-Assisted Federated Learning Approach
Luyao Zou, Yu Min Park, Chu Myaet Thwal, Y. Tun, Zhu Han, Choong Seon Hong (17 Oct 2024)
Cyber Attacks Prevention Towards Prosumer-based EV Charging Stations: An Edge-assisted Federated Prototype Knowledge Distillation Approach
Luyao Zou, Quang Hieu Vo, Kitae Kim, Huy Q. Le, Chu Myaet Thwal, Chaoning Zhang, Choong Seon Hong (17 Oct 2024)
TAS: Distilling Arbitrary Teacher and Student via a Hybrid Assistant
Guopeng Li, Qiang Wang, K. Yan, Shouhong Ding, Yuan Gao, Gui-Song Xia (16 Oct 2024)
HASN: Hybrid Attention Separable Network for Efficient Image Super-resolution
Weifeng Cao, Xiaoyan Lei, Jun Shi, Wanyong Liang, Jie Liu, Zongfei Bai [SupR] (13 Oct 2024)
Large Model for Small Data: Foundation Model for Cross-Modal RF Human Activity Recognition
Yuxuan Weng, Guoquan Wu, Tianyue Zheng, Yanbing Yang, Jun-Jie Luo (13 Oct 2024)
Distilling Invariant Representations with Dual Augmentation
Nikolaos Giakoumoglou, Tania Stathaki (12 Oct 2024)
Gap Preserving Distillation by Building Bidirectional Mappings with A Dynamic Teacher
Yong Guo, Shulian Zhang, Haolin Pan, Jing Liu, Yulun Zhang, Jian Chen (05 Oct 2024)
Simple Unsupervised Knowledge Distillation With Space Similarity
Aditya Singh, Haohan Wang (20 Sep 2024)
Unleashing the Power of Generic Segmentation Models: A Simple Baseline for Infrared Small Target Detection
Mingjin Zhang, Chi Zhang, Qiming Zhang, Yunsong Li, Xinbo Gao, Jing Zhang [VLM] (07 Sep 2024)
Data-free Distillation with Degradation-prompt Diffusion for Multi-weather Image Restoration
Pei Wang, Xiaotong Luo, Yuan Xie, Yanyun Qu [DiffM] (05 Sep 2024)
Low-Resolution Object Recognition with Cross-Resolution Relational Contrastive Distillation
Kangkai Zhang, Shiming Ge, Ruixin Shi, Dan Zeng (04 Sep 2024)
Low-Resolution Face Recognition via Adaptable Instance-Relation Distillation
Ruixin Shi, Weijia Guo, Shiming Ge [CVBM] (03 Sep 2024)
PRG: Prompt-Based Distillation Without Annotation via Proxy Relational Graph
Yijin Xu, Jialun Liu, Hualiang Wei, Wenhui Li (22 Aug 2024)
Knowledge Distillation with Refined Logits
Wujie Sun, Defang Chen, Siwei Lyu, Genlang Chen, Chun-Yen Chen, Can Wang (14 Aug 2024)
Relational Representation Distillation
Nikolaos Giakoumoglou, Tania Stathaki (16 Jul 2024)
AMD: Automatic Multi-step Distillation of Large-scale Vision Models
Cheng Han, Qifan Wang, S. Dianat, Majid Rabbani, Raghuveer M. Rao, Yi Fang, Qiang Guan, Lifu Huang, Dongfang Liu [VLM] (05 Jul 2024)
Relative Difficulty Distillation for Semantic Segmentation
Dong Liang, Yue Sun, Yun Du, Songcan Chen, Sheng-Jun Huang (04 Jul 2024)
AdaDistill: Adaptive Knowledge Distillation for Deep Face Recognition
Fadi Boutros, Vitomir Štruc, Naser Damer (01 Jul 2024)
Highly Constrained Coded Aperture Imaging Systems Design Via a Knowledge Distillation Approach
Leon Suarez-Rodriguez, Roman Jacome, Henry Arguello (25 Jun 2024)
Self-Supervised Representation Learning with Spatial-Temporal Consistency for Sign Language Recognition
Weichao Zhao, Wengang Zhou, Hezhen Hu, Min Wang, Houqiang Li [SLR] (15 Jun 2024)
Robust Knowledge Distillation Based on Feature Variance Against Backdoored Teacher Model
Jinyin Chen, Xiaoming Zhao, Haibin Zheng, Xiao Li, Sheng Xiang, Haifeng Guo [AAML] (01 Jun 2024)
Estimating Human Poses Across Datasets: A Unified Skeleton and Multi-Teacher Distillation Approach
Muhammad Gul Zain Ali Khan, Dhavalkumar Limbachiya, Didier Stricker, Muhammad Zeshan Afzal [3DH] (30 May 2024)
Aligning in a Compact Space: Contrastive Knowledge Distillation between Heterogeneous Architectures
Hongjun Wu, Li Xiao, Xingkuo Zhang, Yining Miao (28 May 2024)
Exploring Dark Knowledge under Various Teacher Capacities and Addressing Capacity Mismatch
Xin-Chun Li, Wen-Shu Fan, Bowen Tao, Le Gan, De-Chuan Zhan (21 May 2024)
Stereo-Knowledge Distillation from dpMV to Dual Pixels for Light Field Video Reconstruction
Aryan Garg, Raghav Mallampali, Akshat Joshi, Shrisudhan Govindarajan, Kaushik Mitra (20 May 2024)
Open-Vocabulary Object Detection via Neighboring Region Attention Alignment
Sunyuan Qiang, Xianfei Li, Yanyan Liang, Wenlong Liao, Tao He, Pai Peng [ObjD] (14 May 2024)
From Algorithm to Hardware: A Survey on Efficient and Safe Deployment of Deep Neural Networks
Xue Geng, Zhe Wang, Chunyun Chen, Qing Xu, Kaixin Xu, ..., Zhenghua Chen, M. Aly, Jie Lin, Min-man Wu, Xiaoli Li (09 May 2024)
DVMSR: Distillated Vision Mamba for Efficient Super-Resolution
Xiaoyan Lei, Wenlong Zhang, Weifeng Cao (05 May 2024)
CKD: Contrastive Knowledge Distillation from A Sample-wise Perspective
Wencheng Zhu, Xin Zhou, Pengfei Zhu, Yu Wang, Qinghua Hu [VLM] (22 Apr 2024)
On the Surprising Efficacy of Distillation as an Alternative to Pre-Training Small Models
Sean Farhat, Deming Chen (04 Apr 2024)
Task Integration Distillation for Object Detectors
Hai Su, ZhenWen Jian, Songsen Yu (02 Apr 2024)
Federated Distillation: A Survey
Lin Li, Jianping Gou, Baosheng Yu, Lan Du, Zhang Yi, Dacheng Tao [DD, FedML] (02 Apr 2024)
The Need for Speed: Pruning Transformers with One Recipe
Samir Khaki, Konstantinos N. Plataniotis (26 Mar 2024)