Correlation Congruence for Knowledge Distillation
arXiv:1904.01802 · 3 April 2019
Baoyun Peng, Xiao Jin, Jiaheng Liu, Shunfeng Zhou, Yichao Wu, Yu Liu, Dongsheng Li, Zhaoning Zhang

Papers citing "Correlation Congruence for Knowledge Distillation" (showing 50 of 274)

ACT-Net: Asymmetric Co-Teacher Network for Semi-supervised Memory-efficient Medical Image Segmentation
Ziyuan Zhao, An Zhu, Zeng Zeng, B. Veeravalli, Cuntai Guan · 05 Jul 2022

Representative Teacher Keys for Knowledge Distillation Model Compression Based on Attention Mechanism for Image Classification
Jun-Teng Yang, Sheng-Che Kao, S. Huang · 26 Jun 2022

Toward Student-Oriented Teacher Network Training For Knowledge Distillation
Chengyu Dong, Liyuan Liu, Jingbo Shang · 14 Jun 2022

The Modality Focusing Hypothesis: Towards Understanding Crossmodal Knowledge Distillation
Zihui Xue, Zhengqi Gao, Sucheng Ren, Hang Zhao · 13 Jun 2022

SERE: Exploring Feature Self-relation for Self-supervised Transformer
Zhong-Yu Li, Shanghua Gao, Ming-Ming Cheng · 10 Jun 2022 · ViT, MDE

Evaluation-oriented Knowledge Distillation for Deep Face Recognition
Y. Huang, Jiaxiang Wu, Xingkun Xu, Shouhong Ding · 06 Jun 2022 · CVBM

ORC: Network Group-based Knowledge Distillation using Online Role Change
Jun-woo Choi, Hyeon Cho, Seockhwa Jeong, Wonjun Hwang · 01 Jun 2022

What Knowledge Gets Distilled in Knowledge Distillation?
Utkarsh Ojha, Yuheng Li, Anirudh Sundara Rajan, Yingyu Liang, Yong Jae Lee · 31 May 2022 · FedML

Parameter-Efficient and Student-Friendly Knowledge Distillation
Jun Rao, Xv Meng, Liang Ding, Shuhan Qi, Dacheng Tao · 28 May 2022

Improving the Latent Space of Image Style Transfer
Yun-Hao Bai, Cairong Wang, C. Yuan, Yanbo Fan, Jue Wang · 24 May 2022 · DRL

PointDistiller: Structured Knowledge Distillation Towards Efficient and Compact 3D Detection
Linfeng Zhang, Runpei Dong, Hung-Shuo Tai, Kaisheng Ma · 23 May 2022 · 3DPC

Knowledge Distillation via the Target-aware Transformer
Sihao Lin, Hongwei Xie, Bing Wang, Kaicheng Yu, Xiaojun Chang, Xiaodan Liang, G. Wang · 22 May 2022 · ViT

Knowledge Distillation from A Stronger Teacher
Tao Huang, Shan You, Fei Wang, Chao Qian, Chang Xu · 21 May 2022

Learning Monocular Depth Estimation via Selective Distillation of Stereo Knowledge
Kyeongseob Song, Kuk-Jin Yoon · 18 May 2022 · MDE

Knowledge Distillation Meets Open-Set Semi-Supervised Learning
Jing Yang, Xiatian Zhu, Adrian Bulat, Brais Martínez, Georgios Tzimiropoulos · 13 May 2022

Spot-adaptive Knowledge Distillation
Mingli Song, Ying Chen, Jingwen Ye · 05 May 2022

Attention-based Knowledge Distillation in Multi-attention Tasks: The Impact of a DCT-driven Loss
Alejandro López-Cifuentes, Marcos Escudero-Viñolo, Jesús Bescós, Juan C. Sanmiguel · 04 May 2022

Generalized Knowledge Distillation via Relationship Matching
Han-Jia Ye, Su Lu, De-Chuan Zhan · 04 May 2022 · FedML

Proto2Proto: Can you recognize the car, the way I do?
Monish Keswani, Sriranjani Ramakrishnan, Nishant Reddy, V. Balasubramanian · 25 Apr 2022

Exploring the Distributed Knowledge Congruence in Proxy-data-free Federated Distillation
Zhiyuan Wu, Sheng Sun, Yuwei Wang, Min Liu, Quyang Pan, Junbo Zhang, Zeju Li, Qing Liu · 14 Apr 2022 · FedML

Cross-Image Relational Knowledge Distillation for Semantic Segmentation
Chuanguang Yang, Helong Zhou, Zhulin An, Xue Jiang, Yong Xu, Qian Zhang · 14 Apr 2022

CoupleFace: Relation Matters for Face Recognition Distillation
Jiaheng Liu, Haoyu Qin, Yichao Wu, Jinyang Guo, Ding Liang, Ke Xu · 12 Apr 2022 · CVBM

Non-Local Latent Relation Distillation for Self-Adaptive 3D Human Pose Estimation
Jogendra Nath Kundu, Siddharth Seth, Anirudh Gururaj Jamkhandi, Pradyumna, Varun Jampani, Anirban Chakraborty, R. Venkatesh Babu · 05 Apr 2022 · 3DH

Feature Structure Distillation with Centered Kernel Alignment in BERT Transferring
Heeseung Jung, Doyeon Kim, Seung-Hoon Na, Kangil Kim · 01 Apr 2022

R2L: Distilling Neural Radiance Field to Neural Light Field for Efficient Novel View Synthesis
Huan Wang, Jian Ren, Zeng Huang, Kyle Olszewski, Menglei Chai, Yun Fu, Sergey Tulyakov · 31 Mar 2022

PCA-Based Knowledge Distillation Towards Lightweight and Content-Style Balanced Photorealistic Style Transfer Models
Tai-Yin Chiu, Danna Gurari · 25 Mar 2022

SSD-KD: A Self-supervised Diverse Knowledge Distillation Method for Lightweight Skin Lesion Classification Using Dermoscopic Images
Yongwei Wang, Yuheng Wang, Tim K. Lee, C. Miao, Z. J. Wang · 22 Mar 2022

Cross-Modal Perceptionist: Can Face Geometry be Gleaned from Voices?
Cho-Ying Wu, Chin-Cheng Hsu, Ulrich Neumann · 18 Mar 2022 · CVBM

Decoupled Knowledge Distillation
Borui Zhao, Quan Cui, Renjie Song, Yiyu Qiu, Jiajun Liang · 16 Mar 2022

Graph Flow: Cross-layer Graph Flow Distillation for Dual Efficient Medical Image Segmentation
Wen Zou, Muyi Sun · 16 Mar 2022

Knowledge Distillation as Efficient Pre-training: Faster Convergence, Higher Data-efficiency, and Better Transferability
Ruifei He, Shuyang Sun, Jihan Yang, Song Bai, Xiaojuan Qi · 10 Mar 2022

Exploring Patch-wise Semantic Relation for Contrastive Learning in Image-to-Image Translation Tasks
Chanyong Jung, Gihyun Kwon, Jong Chul Ye · 03 Mar 2022

Learn From the Past: Experience Ensemble Knowledge Distillation
Chaofei Wang, Shaowei Zhang, S. Song, Gao Huang · 25 Feb 2022

Exploring Inter-Channel Correlation for Diversity-preserved Knowledge Distillation
Li Liu, Qingle Huang, Sihao Lin, Hongwei Xie, Bing Wang, Xiaojun Chang, Xiao-Xue Liang · 08 Feb 2022

Iterative Self Knowledge Distillation -- From Pothole Classification to Fine-Grained and COVID Recognition
Kuan-Chuan Peng · 04 Feb 2022

Contrastive Neighborhood Alignment
Pengkai Zhu, Zhaowei Cai, Yuanjun Xiong, Z. Tu, Luis Goncalves, Vijay Mahadevan, Stefano Soatto · 06 Jan 2022

Auxiliary Learning for Self-Supervised Video Representation via Similarity-based Knowledge Distillation
Amirhossein Dadashzadeh, Alan Whone, Majid Mirmehdi · 07 Dec 2021 · SSL

Information Theoretic Representation Distillation
Roy Miles, Adrian Lopez-Rodriguez, K. Mikolajczyk · 01 Dec 2021 · MQ

Improved Knowledge Distillation via Adversarial Collaboration
Zhiqiang Liu, Chengkai Huang, Yanxia Liu · 29 Nov 2021

One to Transfer All: A Universal Transfer Framework for Vision Foundation Model with Few Data
Yujie Wang, Junqin Huang, Mengya Gao, Yichao Wu, Zhen-fei Yin, Ding Liang, Junjie Yan · 24 Nov 2021

Semi-Online Knowledge Distillation
Zhiqiang Liu, Yanxia Liu, Chengkai Huang · 23 Nov 2021

A Survey on Green Deep Learning
Jingjing Xu, Wangchunshu Zhou, Zhiyi Fu, Hao Zhou, Lei Li · 08 Nov 2021 · VLM

Estimating and Maximizing Mutual Information for Knowledge Distillation
A. Shrivastava, Yanjun Qi, Vicente Ordonez · 29 Oct 2021

A Variational Bayesian Approach to Learning Latent Variables for Acoustic Knowledge Transfer
Hu Hu, Sabato Marco Siniscalchi, Chao-Han Huck Yang, Chin-Hui Lee · 16 Oct 2021 · BDL

Student Helping Teacher: Teacher Evolution via Self-Knowledge Distillation
Zheng Li, Xiang Li, Lingfeng Yang, Jian Yang, Zhigeng Pan · 01 Oct 2021

Partial to Whole Knowledge Distillation: Progressive Distilling Decomposed Knowledge Boosts Student Better
Xuanyang Zhang, Xinming Zhang, Jian Sun · 26 Sep 2021

Excavating the Potential Capacity of Self-Supervised Monocular Depth Estimation
Rui Peng, Ronggang Wang, Yawen Lai, Luyang Tang, Yangang Cai · 26 Sep 2021 · MDE

Weakly-Supervised Monocular Depth Estimation with Resolution-Mismatched Data
Jialei Xu, Yuanchao Bai, Xianming Liu, Junjun Jiang, Xiangyang Ji · 23 Sep 2021 · MDE

On the Efficiency of Subclass Knowledge Distillation in Classification Tasks
A. Sajedi, Konstantinos N. Plataniotis · 12 Sep 2021

Knowledge Distillation Using Hierarchical Self-Supervision Augmented Distribution
Chuanguang Yang, Zhulin An, Linhang Cai, Yongjun Xu · 07 Sep 2021