Cited By
Complementary Relation Contrastive Distillation
29 March 2021
Jinguo Zhu, Shixiang Tang, Dapeng Chen, Shijie Yu, Yakun Liu, A. Yang, M. Rong, Xiaohua Wang
Papers citing "Complementary Relation Contrastive Distillation" (35 of 35 papers shown)
Rethinking Knowledge in Distillation: An In-context Sample Retrieval Perspective
Jinjing Zhu, Songze Li, Lin Wang · 13 Jan 2025 · 0 citations

Preview-based Category Contrastive Learning for Knowledge Distillation
Muhe Ding, Jianlong Wu, Xue Dong, Xiaojie Li, Pengda Qin, Tian Gan, Liqiang Nie · VLM · 18 Oct 2024 · 0 citations

Breaking Modality Gap in RGBT Tracking: Coupled Knowledge Distillation
Andong Lu, Jiacong Zhao, Chenglong Li, Yun Xiao, B. Luo · 15 Oct 2024 · 3 citations

Low-Resolution Object Recognition with Cross-Resolution Relational Contrastive Distillation
Kangkai Zhang, Shiming Ge, Ruixin Shi, Dan Zeng · 04 Sep 2024 · 13 citations

Low-Resolution Face Recognition via Adaptable Instance-Relation Distillation
Ruixin Shi, Weijia Guo, Shiming Ge · CVBM · 03 Sep 2024 · 0 citations

PRG: Prompt-Based Distillation Without Annotation via Proxy Relational Graph
Yijin Xu, Jialun Liu, Hualiang Wei, Wenhui Li · 22 Aug 2024 · 0 citations

From Algorithm to Hardware: A Survey on Efficient and Safe Deployment of Deep Neural Networks
Xue Geng, Zhe Wang, Chunyun Chen, Qing Xu, Kaixin Xu, ..., Zhenghua Chen, M. Aly, Jie Lin, Min-man Wu, Xiaoli Li · 09 May 2024 · 1 citation
V_kD: Improving Knowledge Distillation using Orthogonal Projections
Roy Miles, Ismail Elezi, Jiankang Deng · 10 Mar 2024 · 10 citations
Bayes Conditional Distribution Estimation for Knowledge Distillation Based on Conditional Mutual Information
Linfeng Ye, Shayan Mohajer Hamidi, Renhao Tan, En-Hui Yang · VLM · 16 Jan 2024 · 12 citations

Augmentation-Free Dense Contrastive Knowledge Distillation for Efficient Semantic Segmentation
Jiawei Fan, Chao Li, Xiaolong Liu, Meina Song, Anbang Yao · 07 Dec 2023 · 5 citations

Topology-Preserving Adversarial Training
Xiaoyue Mi, Fan Tang, Yepeng Weng, Danding Wang, Juan Cao, Sheng Tang, Peng Li, Yang Liu · 29 Nov 2023 · 1 citation

Teacher-Student Architecture for Knowledge Distillation: A Survey
Chengming Hu, Xuan Li, Danyang Liu, Haolun Wu, Xi Chen, Ju Wang, Xue Liu · 08 Aug 2023 · 16 citations

Distribution Shift Matters for Knowledge Distillation with Webly Collected Images
Jialiang Tang, Shuo Chen, Gang Niu, Masashi Sugiyama, Chenggui Gong · 21 Jul 2023 · 13 citations

Categories of Response-Based, Feature-Based, and Relation-Based Knowledge Distillation
Chuanguang Yang, Xinqiang Yu, Zhulin An, Yongjun Xu · VLM, OffRL · 19 Jun 2023 · 22 citations

Enhanced Multimodal Representation Learning with Cross-modal KD
Mengxi Chen, Linyu Xing, Yu Wang, Ya-Qin Zhang · 13 Jun 2023 · 11 citations

Multi-Mode Online Knowledge Distillation for Self-Supervised Visual Representation Learning
Kaiyou Song, Jin Xie, Shanyi Zhang, Zimeng Luo · 13 Apr 2023 · 29 citations

Prototype-Sample Relation Distillation: Towards Replay-Free Continual Learning
Nader Asadi, Mohammad Davar, Sudhir Mudur, Rahaf Aljundi, Eugene Belilovsky · CLL · 26 Mar 2023 · 35 citations

Hint-dynamic Knowledge Distillation
Yiyang Liu, Chenxin Li, Xiaotong Tu, Xinghao Ding, Yue Huang · 30 Nov 2022 · 1 citation

Feature-domain Adaptive Contrastive Distillation for Efficient Single Image Super-Resolution
Hye-Min Moon, Jinwoo Jeong, Sungjei Kim · 29 Nov 2022 · 2 citations

Towards a Unified View of Affinity-Based Knowledge Distillation
Vladimir Li, A. Maki · 30 Sep 2022 · 0 citations

Switchable Online Knowledge Distillation
Biao Qian, Yang Wang, Hongzhi Yin, Richang Hong, Meng Wang · 12 Sep 2022 · 38 citations

Seeing your sleep stage: cross-modal distillation from EEG to infrared video
Jianan Han, Shenmin Zhang, Aidong Men, Yang Liu, Z. Yao, Yan-Tao Yan, Qingchao Chen · 11 Aug 2022 · 4 citations

Distilling Knowledge from Object Classification to Aesthetics Assessment
Jingwen Hou, Henghui Ding, Weisi Lin, Weide Liu, Yuming Fang · 02 Jun 2022 · 35 citations

Region-aware Knowledge Distillation for Efficient Image-to-Image Translation
Linfeng Zhang, Xin Chen, Runpei Dong, Kaisheng Ma · VLM · 25 May 2022 · 10 citations

Domain Invariant Masked Autoencoders for Self-supervised Learning from Multi-domains
Haiyang Yang, Meilin Chen, Yizhou Wang, Shixiang Tang, Feng Zhu, Lei Bai, Rui Zhao, Wanli Ouyang · 10 May 2022 · 16 citations

Generalized Knowledge Distillation via Relationship Matching
Han-Jia Ye, Su Lu, De-Chuan Zhan · FedML · 04 May 2022 · 20 citations

R-DFCIL: Relation-Guided Representation Learning for Data-Free Class Incremental Learning
Qiankun Gao, Chen Zhao, Guohao Li, Jian Zhang · CLL · 24 Mar 2022 · 61 citations

Exploring Patch-wise Semantic Relation for Contrastive Learning in Image-to-Image Translation Tasks
Chanyong Jung, Gihyun Kwon, Jong Chul Ye · 03 Mar 2022 · 84 citations

Anomaly Detection via Reverse Distillation from One-Class Embedding
Hanqiu Deng, Xingyu Li · UQCV · 26 Jan 2022 · 448 citations

Pixel Distillation: A New Knowledge Distillation Scheme for Low-Resolution Image Recognition
Guangyu Guo, Dingwen Zhang, Longfei Han, Nian Liu, Ming-Ming Cheng, Junwei Han · 17 Dec 2021 · 2 citations

Information Theoretic Representation Distillation
Roy Miles, Adrian Lopez-Rodriguez, K. Mikolajczyk · MQ · 01 Dec 2021 · 21 citations

Optimizing for In-memory Deep Learning with Emerging Memory Technology
Zhehui Wang, Tao Luo, Rick Siow Mong Goh, Wei Zhang, Weng-Fai Wong · 01 Dec 2021 · 1 citation

Layerwise Optimization by Gradient Decomposition for Continual Learning
Shixiang Tang, Dapeng Chen, Jinguo Zhu, Shijie Yu, Wanli Ouyang · CLL · 17 May 2021 · 63 citations

Teacher-Class Network: A Neural Network Compression Mechanism
Shaiq Munir Malik, Muhammad Umair Haider, Fnu Mohbat, Musab Rasheed, M. Taj · 07 Apr 2020 · 5 citations

MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications
Andrew G. Howard, Menglong Zhu, Bo Chen, Dmitry Kalenichenko, Weijun Wang, Tobias Weyand, M. Andreetto, Hartwig Adam · 3DH · 17 Apr 2017 · 20,567 citations