Multi-Mode Online Knowledge Distillation for Self-Supervised Visual Representation Learning
arXiv:2304.06461 · 13 April 2023
Kaiyou Song, Jin Xie, Shanyi Zhang, Zimeng Luo
Papers citing "Multi-Mode Online Knowledge Distillation for Self-Supervised Visual Representation Learning" (33 / 33 papers shown):

Visual Prompt Tuning (23 Mar 2022). Menglin Jia, Luming Tang, Bor-Chun Chen, Claire Cardie, Serge Belongie, Bharath Hariharan, Ser-Nam Lim. Tags: VLM, VPVLM.
UniVIP: A Unified Framework for Self-Supervised Visual Pre-training (14 Mar 2022). Zhaowen Li, Yousong Zhu, Fan Yang, Wei Li, Chaoyang Zhao, ..., Jiahao Xie, Liwei Wu, Rui Zhao, Ming Tang, Jinqiao Wang.
HCSC: Hierarchical Contrastive Selective Coding (01 Feb 2022). Yuanfan Guo, Minghao Xu, Jiawen Li, Bingbing Ni, Xuanyu Zhu, Zhenbang Sun, Yi Tian Xu.
SimReg: Regression as a Simple Yet Effective Tool for Self-supervised Knowledge Distillation (13 Jan 2022). K. Navaneet, Soroush Abbasi Koohpayegani, Ajinkya Tejankar, Hamed Pirsiavash.
Boosting Contrastive Learning with Relation Knowledge Distillation (08 Dec 2021). Kai Zheng, Yuanjiang Wang, Ye Yuan. Tags: SSL.
Decoupled Contrastive Learning (13 Oct 2021). Chun-Hsiao Yeh, Cheng-Yao Hong, Yen-Chi Hsu, Tyng-Luh Liu, Yubei Chen, Yann LeCun.
Mean Shift for Self-Supervised Learning (15 May 2021). Soroush Abbasi Koohpayegani, Ajinkya Tejankar, Hamed Pirsiavash. Tags: SSL.
VICReg: Variance-Invariance-Covariance Regularization for Self-Supervised Learning (11 May 2021). Adrien Bardes, Jean Ponce, Yann LeCun. Tags: SSL, DML.
With a Little Help from My Friends: Nearest-Neighbor Contrastive Learning of Visual Representations (29 Apr 2021). Debidatta Dwibedi, Y. Aytar, Jonathan Tompson, P. Sermanet, Andrew Zisserman. Tags: SSL.
Distill on the Go: Online knowledge distillation in self-supervised learning (20 Apr 2021). Prashant Shivaram Bhat, Elahe Arani, Bahram Zonooz. Tags: SSL.
An Empirical Study of Training Self-Supervised Vision Transformers (05 Apr 2021). Xinlei Chen, Saining Xie, Kaiming He. Tags: ViT.
Complementary Relation Contrastive Distillation (29 Mar 2021). Jinguo Zhu, Shixiang Tang, Dapeng Chen, Shijie Yu, Yakun Liu, A. Yang, M. Rong, Xiaohua Wang.
Barlow Twins: Self-Supervised Learning via Redundancy Reduction (04 Mar 2021). Jure Zbontar, Li Jing, Ishan Misra, Yann LeCun, Stéphane Deny. Tags: SSL.
Momentum^2 Teacher: Momentum Teacher with Momentum Statistics for Self-Supervised Learning (19 Jan 2021). Zeming Li, Songtao Liu, Jian Sun.
SEED: Self-supervised Distillation For Visual Representation (12 Jan 2021). Zhiyuan Fang, Jianfeng Wang, Lijuan Wang, Lei Zhang, Yezhou Yang, Zicheng Liu. Tags: SSL.
OBoW: Online Bag-of-Visual-Words Generation for Self-Supervised Learning (21 Dec 2020). Spyros Gidaris, Andrei Bursuc, Gilles Puy, N. Komodakis, Matthieu Cord, P. Pérez. Tags: SSL.
Cross-Layer Distillation with Semantic Calibration (06 Dec 2020). Defang Chen, Jian-Ping Mei, Yuan Zhang, Can Wang, Yan Feng, Chun-Yen Chen. Tags: FedML.
Exploring Simple Siamese Representation Learning (20 Nov 2020). Xinlei Chen, Kaiming He. Tags: SSL.
AdCo: Adversarial Contrast for Efficient Learning of Unsupervised Representations from Self-Trained Negative Adversaries (17 Nov 2020). Q. Hu, Tianlin Li, Wei Hu, Guo-Jun Qi. Tags: SSL.
CompRess: Self-Supervised Learning by Compressing Representations (28 Oct 2020). Soroush Abbasi Koohpayegani, Ajinkya Tejankar, Hamed Pirsiavash. Tags: SSL.
Unsupervised Learning of Visual Features by Contrasting Cluster Assignments (17 Jun 2020). Mathilde Caron, Ishan Misra, Julien Mairal, Priya Goyal, Piotr Bojanowski, Armand Joulin. Tags: OCL, SSL.
Bootstrap your own latent: A new approach to self-supervised Learning (13 Jun 2020). Jean-Bastien Grill, Florian Strub, Florent Altché, Corentin Tallec, Pierre Harvey Richemond, ..., M. G. Azar, Bilal Piot, Koray Kavukcuoglu, Rémi Munos, Michal Valko. Tags: SSL.
Knowledge Distillation Meets Self-Supervision (12 Jun 2020). Guodong Xu, Ziwei Liu, Xiaoxiao Li, Chen Change Loy. Tags: FedML.
Improved Baselines with Momentum Contrastive Learning (09 Mar 2020). Xinlei Chen, Haoqi Fan, Ross B. Girshick, Kaiming He. Tags: SSL.
Momentum Contrast for Unsupervised Visual Representation Learning (13 Nov 2019). Kaiming He, Haoqi Fan, Yuxin Wu, Saining Xie, Ross B. Girshick. Tags: SSL.
Contrastive Representation Distillation (23 Oct 2019). Yonglong Tian, Dilip Krishnan, Phillip Isola.
EfficientNet: Rethinking Model Scaling for Convolutional Neural Networks (28 May 2019). Mingxing Tan, Quoc V. Le. Tags: 3DV, MedIm.
Searching for MobileNetV3 (06 May 2019). Andrew G. Howard, Mark Sandler, Grace Chu, Liang-Chieh Chen, Bo Chen, ..., Yukun Zhu, Ruoming Pang, Vijay Vasudevan, Quoc V. Le, Hartwig Adam.
Knowledge Distillation by On-the-Fly Native Ensemble (12 Jun 2018). Xu Lan, Xiatian Zhu, S. Gong.
Deep Mutual Learning (01 Jun 2017). Ying Zhang, Tao Xiang, Timothy M. Hospedales, Huchuan Lu. Tags: FedML.
SGDR: Stochastic Gradient Descent with Warm Restarts (13 Aug 2016). I. Loshchilov, Frank Hutter. Tags: ODL.
FitNets: Hints for Thin Deep Nets (19 Dec 2014). Adriana Romero, Nicolas Ballas, Samira Ebrahimi Kahou, Antoine Chassang, C. Gatta, Yoshua Bengio. Tags: FedML.
ImageNet Large Scale Visual Recognition Challenge (01 Sep 2014). Olga Russakovsky, Jia Deng, Hao Su, J. Krause, S. Satheesh, ..., A. Karpathy, A. Khosla, Michael S. Bernstein, Alexander C. Berg, Li Fei-Fei. Tags: VLM, ObjD.