Correlation Congruence for Knowledge Distillation

3 April 2019
Baoyun Peng
Xiao Jin
Jiaheng Liu
Shunfeng Zhou
Yichao Wu
Yu Liu
Dongsheng Li
Zhaoning Zhang
arXiv: 1904.01802

Papers citing "Correlation Congruence for Knowledge Distillation"

Showing 50 of 274 citing papers.
CAM-loss: Towards Learning Spatially Discriminative Feature Representations
Chaofei Wang
J. Xiao
Yizeng Han
Qisen Yang
S. Song
Gao Huang
03 Sep 2021
Full-Cycle Energy Consumption Benchmark for Low-Carbon Computer Vision
Bo-wen Li
Xinyang Jiang
Donglin Bai
Yuge Zhang
Ningxin Zheng
Xuanyi Dong
Lu Liu
Yuqing Yang
Dongsheng Li
30 Aug 2021
CoCo DistillNet: a Cross-layer Correlation Distillation Network for Pathological Gastric Cancer Segmentation
Wenxuan Zou
Muyi Sun
40
9
0
27 Aug 2021
Multi-granularity for knowledge distillation
Multi-granularity for knowledge distillation
Baitan Shao
Ying Chen
15 Aug 2021
Distilling Holistic Knowledge with Graph Neural Networks
Sheng Zhou
Yucheng Wang
Defang Chen
Jiawei Chen
Xin Wang
Can Wang
Jiajun Bu
12 Aug 2021
Learning Compatible Embeddings
Qiang Meng
Chixiang Zhang
Xiaoqiang Xu
F. Zhou
VLM
04 Aug 2021
Hierarchical Self-supervised Augmented Knowledge Distillation
Chuanguang Yang
Zhulin An
Linhang Cai
Yongjun Xu
SSL
29 Jul 2021
Pseudo-LiDAR Based Road Detection
Libo Sun
Haokui Zhang
Wei Yin
28 Jul 2021
Double Similarity Distillation for Semantic Image Segmentation
Yingchao Feng
Xian Sun
Wenhui Diao
Jihao Li
Xin Gao
19 Jul 2021
WeClick: Weakly-Supervised Video Semantic Segmentation with Click Annotations
Peidong Liu
Zibin He
Xiyu Yan
Yong-jia Jiang
Shutao Xia
Feng Zheng
Maowei Hu
07 Jul 2021
Learning Efficient Vision Transformers via Fine-Grained Manifold Distillation
Zhiwei Hao
Jianyuan Guo
Ding Jia
Kai Han
Yehui Tang
Chao Zhang
Dacheng Tao
Yunhe Wang
ViT
03 Jul 2021
Revisiting Knowledge Distillation: An Inheritance and Exploration Framework
Zhen Huang
Xu Shen
Jun Xing
Tongliang Liu
Xinmei Tian
Houqiang Li
Bing Deng
Jianqiang Huang
Xiansheng Hua
01 Jul 2021
DnS: Distill-and-Select for Efficient and Accurate Video Indexing and Retrieval
Giorgos Kordopatis-Zilos
Christos Tzelepis
Symeon Papadopoulos
I. Kompatsiaris
Ioannis Patras
24 Jun 2021
BERT Learns to Teach: Knowledge Distillation with Meta Learning
Wangchunshu Zhou
Canwen Xu
Julian McAuley
08 Jun 2021
Privileged Graph Distillation for Cold Start Recommendation
Shuai Wang
Anton van den Hengel
Le Wu
Haiping Ma
Richang Hong
Meng Wang
31 May 2021
Semantic Relation Preserving Knowledge Distillation for Image-to-Image Translation
Zeqi Li
R. Jiang
P. Aarabi
GAN
VLM
30 Apr 2021
Spirit Distillation: A Model Compression Method with Multi-domain Knowledge Transfer
Zhiyuan Wu
Yu-Gang Jiang
Minghao Zhao
Chupeng Cui
Zongmin Yang
Xinhui Xue
Hong Qi
VLM
29 Apr 2021
Mutual Contrastive Learning for Visual Representation Learning
Chuanguang Yang
Zhulin An
Linhang Cai
Yongjun Xu
VLM
SSL
26 Apr 2021
Distilling Audio-Visual Knowledge by Compositional Contrastive Learning
Yanbei Chen
Yongqin Xian
A. Sophia Koepke
Ying Shan
Zeynep Akata
22 Apr 2021
Voice2Mesh: Cross-Modal 3D Face Model Generation from Voices
Cho-Ying Wu
Ke Xu
Chin-Cheng Hsu
Ulrich Neumann
CVBM
3DH
21 Apr 2021
Learning from 2D: Contrastive Pixel-to-Point Knowledge Transfer for 3D Pretraining
Yueh-Cheng Liu
Yu-Kai Huang
HungYueh Chiang
Hung-Ting Su
Zhe-Yu Liu
Chin-Tang Chen
Ching-Yu Tseng
Winston H. Hsu
3DPC
10 Apr 2021
Complementary Relation Contrastive Distillation
Jinguo Zhu
Shixiang Tang
Dapeng Chen
Shijie Yu
Yakun Liu
A. Yang
M. Rong
Xiaohua Wang
29 Mar 2021
Deep Ensemble Collaborative Learning by using Knowledge-transfer Graph for Fine-grained Object Classification
Naoki Okamoto
Soma Minami
Tsubasa Hirakawa
Takayoshi Yamashita
H. Fujiyoshi
FedML
27 Mar 2021
Student Network Learning via Evolutionary Knowledge Distillation
Kangkai Zhang
Chunhui Zhang
Shikun Li
Dan Zeng
Shiming Ge
23 Mar 2021
Compacting Deep Neural Networks for Internet of Things: Methods and Applications
Ke Zhang
Hanbo Ying
Hongning Dai
Lin Li
Yuangyuang Peng
Keyi Guo
Hongfang Yu
20 Mar 2021
Refine Myself by Teaching Myself: Feature Refinement via Self-Knowledge Distillation
Mingi Ji
Seungjae Shin
Seunghyun Hwang
Gibeom Park
Il-Chul Moon
15 Mar 2021
A New Training Framework for Deep Neural Network
Zhenyan Hou
Wenxuan Fan
FedML
12 Mar 2021
Inter-class Discrepancy Alignment for Face Recognition
Jiaheng Liu
Yudong Wu
Yichao Wu
Zhenmao Li
Ken Chen
Ding Liang
Junjie Yan
CVBM
02 Mar 2021
PURSUhInT: In Search of Informative Hint Points Based on Layer Clustering for Knowledge Distillation
Reyhan Kevser Keser
Aydin Ayanzadeh
O. A. Aghdam
Çaglar Kilcioglu
B. U. Toreyin
N. K. Üre
26 Feb 2021
Semantically-Conditioned Negative Samples for Efficient Contrastive Learning
J. Ó. Neill
Danushka Bollegala
12 Feb 2021
Show, Attend and Distill: Knowledge Distillation via Attention-based Feature Matching
Mingi Ji
Byeongho Heo
Sungrae Park
05 Feb 2021
Rethinking Soft Labels for Knowledge Distillation: A Bias-Variance Tradeoff Perspective
Helong Zhou
Liangchen Song
Jiajie Chen
Ye Zhou
Guoli Wang
Junsong Yuan
Qian Zhang
01 Feb 2021
Collaborative Teacher-Student Learning via Multiple Knowledge Transfer
Liyuan Sun
Jianping Gou
Baosheng Yu
Lan Du
Dacheng Tao
21 Jan 2021
CM-NAS: Cross-Modality Neural Architecture Search for Visible-Infrared Person Re-Identification
Chaoyou Fu
Yibo Hu
Xiang Wu
Hailin Shi
Tao Mei
Ran He
21 Jan 2021
ISD: Self-Supervised Learning by Iterative Similarity Distillation
Ajinkya Tejankar
Soroush Abbasi Koohpayegani
Vipin Pillai
Paolo Favaro
Hamed Pirsiavash
SSL
16 Dec 2020
Wasserstein Contrastive Representation Distillation
Liqun Chen
Dong Wang
Zhe Gan
Jingjing Liu
Ricardo Henao
Lawrence Carin
15 Dec 2020
Model Compression Using Optimal Transport
Suhas Lohit
Michael J. Jones
07 Dec 2020
Cross-Layer Distillation with Semantic Calibration
Defang Chen
Jian-Ping Mei
Yuan Zhang
Can Wang
Yan Feng
Chun-Yen Chen
FedML
06 Dec 2020
Multi-head Knowledge Distillation for Model Compression
Haiquan Wang
Suhas Lohit
Michael J. Jones
Y. Fu
05 Dec 2020
torchdistill: A Modular, Configuration-Driven Framework for Knowledge Distillation
Yoshitomo Matsubara
25 Nov 2020
Distilling Knowledge by Mimicking Features
G. Wang
Yifan Ge
Jianxin Wu
03 Nov 2020
ProxylessKD: Direct Knowledge Distillation with Inherited Classifier for Face Recognition
W. Shi
Guanghui Ren
Yunpeng Chen
Shuicheng Yan
CVBM
31 Oct 2020
CompRess: Self-Supervised Learning by Compressing Representations
Soroush Abbasi Koohpayegani
Ajinkya Tejankar
Hamed Pirsiavash
SSL
28 Oct 2020
Reducing the Teacher-Student Gap via Spherical Knowledge Disitllation
Jia Guo
Minghao Chen
Yao Hu
Chen Zhu
Xiaofei He
Deng Cai
15 Oct 2020
Locally Linear Region Knowledge Distillation
Xiang Deng
Zhongfei Zhang
09 Oct 2020
Online Knowledge Distillation via Multi-branch Diversity Enhancement
Zheng Li
Ying Huang
Defang Chen
Tianren Luo
Ning Cai
Zhigeng Pan
02 Oct 2020
Improved Knowledge Distillation via Full Kernel Matrix Transfer
Qi Qian
Hao Li
Juhua Hu
30 Sep 2020
Densely Guided Knowledge Distillation using Multiple Teacher Assistants
Wonchul Son
Jaemin Na
Junyong Choi
Wonjun Hwang
18 Sep 2020
Collaborative Distillation in the Parameter and Spectrum Domains for Video Action Recognition
Haisheng Su
Jing Su
Dongliang Wang
Weihao Gan
Wei Wu
Mengmeng Wang
Junjie Yan
Yu Qiao
15 Sep 2020
Semantics-aware Adaptive Knowledge Distillation for Sensor-to-Vision Action Recognition
Yang Liu
Keze Wang
Guanbin Li
Liang Lin
01 Sep 2020