Simple Distillation Baselines for Improving Small Self-supervised Models
arXiv:2106.11304 · 21 June 2021
Authors: Jindong Gu, Wei Liu, Yonglong Tian
Papers citing "Simple Distillation Baselines for Improving Small Self-supervised Models" (6 of 6 shown):
- Relational Self-supervised Distillation with Compact Descriptors for Image Copy Detection. Juntae Kim, Sungwon Woo, Jongho Nang. 28 May 2024.
- Explainability and Robustness of Deep Visual Classification Models. Jindong Gu. 03 Jan 2023.
- Improving Label-Deficient Keyword Spotting Through Self-Supervised Pretraining. H. S. Bovbjerg, Zheng-Hua Tan. 04 Oct 2022.
- Slimmable Networks for Contrastive Self-supervised Learning. Shuai Zhao, Xiaohan Wang, Linchao Zhu, Yi Yang. 30 Sep 2022.
- On the Efficacy of Small Self-Supervised Contrastive Models without Distillation Signals. Haizhou Shi, Youcai Zhang, Siliang Tang, Wenjie Zhu, Yaqian Li, Yandong Guo, Yueting Zhuang. 30 Jul 2021.
- SEED: Self-supervised Distillation For Visual Representation. Zhiyuan Fang, Jianfeng Wang, Lijuan Wang, Lei Zhang, Yezhou Yang, Zicheng Liu. 12 Jan 2021.