A Generalization Theory of Cross-Modality Distillation with Contrastive Learning
Hangyu Lin, Chen Liu, Chengming Xu, Zhengqi Gao, Yanwei Fu, Yuan Yao
arXiv 2405.03355, 6 May 2024
Tags: VLM
Papers citing "A Generalization Theory of Cross-Modality Distillation with Contrastive Learning" (5 of 5 shown):

1. ONE-PEACE: Exploring One General Representation Model Toward Unlimited Modalities
   Peng Wang, Shijie Wang, Junyang Lin, Shuai Bai, Xiaohuan Zhou, Jingren Zhou, Xinggang Wang, Chang Zhou
   18 May 2023 · Tags: VLM, MLLM, ObjD

2. Toward Understanding the Feature Learning Process of Self-supervised Contrastive Learning
   Zixin Wen, Yuanzhi Li
   31 May 2021 · Tags: SSL, MLT

3. SEED: Self-supervised Distillation For Visual Representation
   Zhiyuan Fang, Jianfeng Wang, Lijuan Wang, Lei Zhang, Yezhou Yang, Zicheng Liu
   12 Jan 2021 · Tags: SSL

4. Bootstrap your own latent: A new approach to self-supervised Learning
   Jean-Bastien Grill, Florian Strub, Florent Altché, Corentin Tallec, Pierre Harvey Richemond, ..., M. G. Azar, Bilal Piot, Koray Kavukcuoglu, Rémi Munos, Michal Valko
   13 Jun 2020 · Tags: SSL

5. Momentum Contrast for Unsupervised Visual Representation Learning
   Kaiming He, Haoqi Fan, Yuxin Wu, Saining Xie, Ross B. Girshick
   13 Nov 2019 · Tags: SSL