DisCoM-KD: Cross-Modal Knowledge Distillation via Disentanglement Representation and Adversarial Learning
Dino Ienco, C. Dantas
5 August 2024 · arXiv 2408.07080

Papers citing "DisCoM-KD: Cross-Modal Knowledge Distillation via Disentanglement Representation and Adversarial Learning"
9 / 9 papers shown

Logit Standardization in Knowledge Distillation
Shangquan Sun, Wenqi Ren, Jingzhi Li, Rui Wang, Xiaochun Cao
03 Mar 2024

Class Attention Transfer Based Knowledge Distillation
Ziyao Guo, Haonan Yan, Hui Li, Xiao-La Lin
25 Apr 2023

Curriculum Temperature for Knowledge Distillation
Zheng Li, Xiang Li, Lingfeng Yang, Borui Zhao, Renjie Song, Lei Luo, Jun Yu Li, Jian Yang
29 Nov 2022

XKD: Cross-modal Knowledge Distillation with Domain Alignment for Video Representation Learning
Pritam Sarkar, Ali Etemad
25 Nov 2022

Towards Counterfactual Image Manipulation via CLIP
Yingchen Yu, Fangneng Zhan, Rongliang Wu, Jiahui Zhang, Shijian Lu, Miaomiao Cui, Xuansong Xie, Xiansheng Hua, Chunyan Miao
06 Jul 2022

Disentangling visual and written concepts in CLIP
Joanna Materzyńska, Antonio Torralba, David Bau
15 Jun 2022

Decoupled Knowledge Distillation
Borui Zhao, Quan Cui, Renjie Song, Yiyu Qiu, Jiajun Liang
16 Mar 2022

Predict, Prevent, and Evaluate: Disentangled Text-Driven Image Manipulation Empowered by Pre-Trained Vision-Language Model
Zipeng Xu, Tianwei Lin, Hao Tang, Fu Li, Dongliang He, N. Sebe, Radu Timofte, Luc Van Gool, Errui Ding
26 Nov 2021

Domain-Adversarial Training of Neural Networks
Yaroslav Ganin, E. Ustinova, Hana Ajakan, Pascal Germain, Hugo Larochelle, François Laviolette, M. Marchand, Victor Lempitsky
28 May 2015