Model ensemble instead of prompt fusion: a sample-specific knowledge transfer method for few-shot prompt tuning
arXiv: 2210.12587 · 23 October 2022
Xiangyu Peng, Chen Xing, Prafulla Kumar Choubey, Chien-Sheng Wu, Caiming Xiong
Tags: VLM
Papers citing "Model ensemble instead of prompt fusion: a sample-specific knowledge transfer method for few-shot prompt tuning" (7 papers shown)

1. Plug-and-Play Transformer Modules for Test-Time Adaptation (06 Jan 2024)
   Xiangyu Chang, Sk. Miraj Ahmed, S. Krishnamurthy, Başak Güler, A. Swami, Samet Oymak, A. Roy-Chowdhury

2. PANDA: Prompt Transfer Meets Knowledge Distillation for Efficient Model Adaptation (22 Aug 2022)
   Qihuang Zhong, Liang Ding, Juhua Liu, Bo Du, Dacheng Tao · Tags: VLM, CLL

3. SPoT: Better Frozen Model Adaptation through Soft Prompt Transfer (15 Oct 2021)
   Tu Vu, Brian Lester, Noah Constant, Rami Al-Rfou, Daniel Cer · Tags: VLM, LRM

4. P-Tuning v2: Prompt Tuning Can Be Comparable to Fine-tuning Universally Across Scales and Tasks (14 Oct 2021)
   Xiao Liu, Kaixuan Ji, Yicheng Fu, Weng Lam Tam, Zhengxiao Du, Zhilin Yang, Jie Tang · Tags: VLM

5. The Power of Scale for Parameter-Efficient Prompt Tuning (18 Apr 2021)
   Brian Lester, Rami Al-Rfou, Noah Constant · Tags: VPVLM

6. Exploiting Cloze Questions for Few Shot Text Classification and Natural Language Inference (21 Jan 2020)
   Timo Schick, Hinrich Schütze

7. GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding (20 Apr 2018)
   Alex Wang, Amanpreet Singh, Julian Michael, Felix Hill, Omer Levy, Samuel R. Bowman · Tags: ELM