ResearchTrend.AI

Simple Distillation Baselines for Improving Small Self-supervised Models
arXiv:2106.11304

21 June 2021
Jindong Gu, Wei Liu, Yonglong Tian

Papers citing "Simple Distillation Baselines for Improving Small Self-supervised Models"

6 papers
Relational Self-supervised Distillation with Compact Descriptors for Image Copy Detection
Juntae Kim, Sungwon Woo, Jongho Nang
28 May 2024
Explainability and Robustness of Deep Visual Classification Models
Jindong Gu
AAML
03 Jan 2023
Improving Label-Deficient Keyword Spotting Through Self-Supervised Pretraining
H. S. Bovbjerg, Zheng-Hua Tan
VLM
04 Oct 2022
Slimmable Networks for Contrastive Self-supervised Learning
Shuai Zhao, Xiaohan Wang, Linchao Zhu, Yi Yang
30 Sep 2022
On the Efficacy of Small Self-Supervised Contrastive Models without Distillation Signals
Haizhou Shi, Youcai Zhang, Siliang Tang, Wenjie Zhu, Yaqian Li, Yandong Guo, Yueting Zhuang
SyDa
30 Jul 2021
SEED: Self-supervised Distillation For Visual Representation
Zhiyuan Fang, Jianfeng Wang, Lijuan Wang, Lei Zhang, Yezhou Yang, Zicheng Liu
SSL
12 Jan 2021