Boosting Contrastive Learning with Relation Knowledge Distillation
Kai Zheng, Yuanjiang Wang, Ye Yuan
8 December 2021 · arXiv: 2112.04174 (https://arxiv.org/abs/2112.04174)
Tags: SSL
Papers citing "Boosting Contrastive Learning with Relation Knowledge Distillation" (5 of 5 shown)
1. Semi-supervised ViT knowledge distillation network with style transfer normalization for colorectal liver metastases survival prediction
   Mohamed El Amine Elforaici, E. Montagnon, Francisco Perdigon Romero, W. Le, F. Azzi, Dominique Trudel, Bich Nguyen, Simon Turcotte, An Tang, Samuel Kadoury
   Tags: MedIm · Citations: 2 · 17 Nov 2023
2. Pixel-Wise Contrastive Distillation
   Junqiang Huang, Zichao Guo
   Citations: 4 · 01 Nov 2022
3. SEED: Self-supervised Distillation For Visual Representation
   Zhiyuan Fang, Jianfeng Wang, Lijuan Wang, Lei Zhang, Yezhou Yang, Zicheng Liu
   Tags: SSL · Citations: 190 · 12 Jan 2021
4. Improved Baselines with Momentum Contrastive Learning
   Xinlei Chen, Haoqi Fan, Ross B. Girshick, Kaiming He
   Tags: SSL · Citations: 3,378 · 09 Mar 2020
5. Boosting Self-Supervised Learning via Knowledge Transfer
   M. Noroozi, Ananth Vinjimoor, Paolo Favaro, Hamed Pirsiavash
   Tags: SSL · Citations: 292 · 01 May 2018