iDAT: inverse Distillation Adapter-Tuning
arXiv 2403.15750 · 23 March 2024
Authors: Jiacheng Ruan, Jingsheng Gao, Mingye Xie, Daize Dong, Suncheng Xiang, Ting Liu, Yuzhuo Fu

Papers citing "iDAT: inverse Distillation Adapter-Tuning" (4 of 4 papers shown)

GIST: Improving Parameter Efficient Fine Tuning via Knowledge Interaction
Jiacheng Ruan, Jingsheng Gao, Mingye Xie, Suncheng Xiang, Zefang Yu, Ting Liu, Yuzhuo Fu
Tags: MoE
12 Dec 2023

AdaptFormer: Adapting Vision Transformers for Scalable Visual Recognition
Shoufa Chen, Chongjian Ge, Zhan Tong, Jiangliu Wang, Yibing Song, Jue Wang, Ping Luo
26 May 2022

A Survey of Visual Transformers
Yang Liu, Yao Zhang, Yixin Wang, Feng Hou, Jin Yuan, Jiang Tian, Yang Zhang, Zhongchao Shi, Jianping Fan, Zhiqiang He
Tags: 3DGS, ViT
11 Nov 2021

Learning Student-Friendly Teacher Networks for Knowledge Distillation
D. Park, Moonsu Cha, C. Jeong, Daesin Kim, Bohyung Han
12 Feb 2021