arXiv:2006.04093
Multi-view Contrastive Learning for Online Knowledge Distillation
7 June 2020
Chuanguang Yang, Zhulin An, Yongjun Xu
Papers citing "Multi-view Contrastive Learning for Online Knowledge Distillation" (8 of 8 papers shown)

Teacher-Student Architecture for Knowledge Distillation: A Survey
Chengming Hu, Xuan Li, Danyang Liu, Haolun Wu, Xi Chen, Ju Wang, Xue Liu
08 Aug 2023

Teacher-Student Architecture for Knowledge Learning: A Survey
Chengming Hu, Xuan Li, Dan Liu, Xi Chen, Ju Wang, Xue Liu
28 Oct 2022

A General Multiple Data Augmentation Based Framework for Training Deep Neural Networks
Bin Hu, Yu Sun, A. K. Qin
29 May 2022 (AI4CE)

Cross-Image Relational Knowledge Distillation for Semantic Segmentation
Chuanguang Yang, Helong Zhou, Zhulin An, Xue Jiang, Yong Xu, Qian Zhang
14 Apr 2022

Knowledge Distillation Using Hierarchical Self-Supervision Augmented Distribution
Chuanguang Yang, Zhulin An, Linhang Cai, Yongjun Xu
07 Sep 2021

Hierarchical Self-supervised Augmented Knowledge Distillation
Chuanguang Yang, Zhulin An, Linhang Cai, Yongjun Xu
29 Jul 2021 (SSL)

Mutual Contrastive Learning for Visual Representation Learning
Chuanguang Yang, Zhulin An, Linhang Cai, Yongjun Xu
26 Apr 2021 (VLM, SSL)

Knowledge Distillation by On-the-Fly Native Ensemble
Xu Lan, Xiatian Zhu, S. Gong
12 Jun 2018