Knowledge Distillation in Vision Transformers: A Critical Review
arXiv: 2302.02108 · 4 February 2023
Gousia Habib, Tausifa Jan Saleem, Brejesh Lall
Papers citing "Knowledge Distillation in Vision Transformers: A Critical Review" (7 papers shown)
Federated Distillation for Medical Image Classification: Towards Trustworthy Computer-Aided Diagnosis
Sufen Ren, Yule Hu, Shengchao Chen, Guanjun Wang · 02 Jul 2024

One-for-All: Bridge the Gap Between Heterogeneous Architectures in Knowledge Distillation
Zhiwei Hao, Jianyuan Guo, Kai Han, Yehui Tang, Han Hu, Yunhe Wang, Chang Xu · 30 Oct 2023

Distilling Knowledge via Knowledge Review
Pengguang Chen, Shu-Lin Liu, Hengshuang Zhao, Jiaya Jia · 19 Apr 2021

A Simple and Effective Positional Encoding for Transformers
Pu-Chin Chen, Henry Tsai, Srinadh Bhojanapalli, Hyung Won Chung, Yin-Wen Chang, Chun-Sung Ferng · 18 Apr 2021

Video Transformer Network
Daniel Neimark, Omri Bar, Maya Zohar, Dotan Asselmann · 01 Feb 2021

Transformers in Vision: A Survey
Salman Khan, Muzammal Naseer, Munawar Hayat, Syed Waqas Zamir, F. Khan, M. Shah · 04 Jan 2021

The Lottery Ticket Hypothesis for Pre-trained BERT Networks
Tianlong Chen, Jonathan Frankle, Shiyu Chang, Sijia Liu, Yang Zhang, Zhangyang Wang, Michael Carbin · 23 Jul 2020