Cross-Architecture Knowledge Distillation
arXiv:2207.05273
12 July 2022
Yufan Liu, Jiajiong Cao, Bing Li, Weiming Hu, Jin-Fei Ding, Liang Li
Papers citing "Cross-Architecture Knowledge Distillation" (7 papers shown)
HDKD: Hybrid Data-Efficient Knowledge Distillation Network for Medical Image Classification
  Omar S. El-Assiouti, Ghada Hamed, Dina Khattab, H. M. Ebied
  10 Jul 2024 · 52 / 1 / 0
Cross-Architecture Auxiliary Feature Space Translation for Efficient Few-Shot Personalized Object Detection
  F. Barbato, Umberto Michieli, J. Moon, Pietro Zanuttigh, Mete Ozay
  01 Jul 2024 · 49 / 2 / 0
Choosing Wisely and Learning Deeply: Selective Cross-Modality Distillation via CLIP for Domain Generalization
  Jixuan Leng, Yijiang Li, Haohan Wang
  Tags: VLM
  26 Nov 2023 · 37 / 0 / 0
Cross Architecture Distillation for Face Recognition
  Weisong Zhao, Xiangyu Zhu, Zhixiang He, Xiaoyu Zhang, Zhen Lei
  Tags: CVBM
  26 Jun 2023 · 30 / 6 / 0
Distilling Knowledge via Knowledge Review
  Pengguang Chen, Shu Liu, Hengshuang Zhao, Jiaya Jia
  19 Apr 2021 · 155 / 424 / 0
CelebA-Spoof: Large-Scale Face Anti-Spoofing Dataset with Rich Annotations
  Yuanhan Zhang, Zhen-fei Yin, Yidong Li, Guojun Yin, Junjie Yan, Jing Shao, Ziwei Liu
  Tags: CVBM
  24 Jul 2020 · 57 / 162 / 0
ImageNet Large Scale Visual Recognition Challenge
  Olga Russakovsky, Jia Deng, Hao Su, J. Krause, S. Satheesh, ..., A. Karpathy, A. Khosla, Michael S. Bernstein, Alexander C. Berg, Li Fei-Fei
  Tags: VLM, ObjD
  01 Sep 2014 · 313 / 39,252 / 0