2212.01032
Cited By
Systematic Analysis for Pretrained Language Model Priming for Parameter-Efficient Fine-tuning
2 December 2022
Shih-Cheng Huang, Shi Wang, Min-Han Shih, Saurav Sahay, Hung-yi Lee
Papers citing "Systematic Analysis for Pretrained Language Model Priming for Parameter-Efficient Fine-tuning" (6 of 6 papers shown):
Know Where You're Going: Meta-Learning for Parameter-Efficient Fine-Tuning
Mozhdeh Gheini, Xuezhe Ma, Jonathan May · 25 May 2022 · 42 / 5 / 0

SPoT: Better Frozen Model Adaptation through Soft Prompt Transfer
Tu Vu, Brian Lester, Noah Constant, Rami Al-Rfou, Daniel Cer · VLM, LRM · 15 Oct 2021 · 137 / 277 / 0

P-Tuning v2: Prompt Tuning Can Be Comparable to Fine-tuning Universally Across Scales and Tasks
Xiao Liu, Kaixuan Ji, Yicheng Fu, Weng Lam Tam, Zhengxiao Du, Zhilin Yang, Jie Tang · VLM · 14 Oct 2021 · 238 / 806 / 0

CrossFit: A Few-shot Learning Challenge for Cross-task Generalization in NLP
Qinyuan Ye, Bill Yuchen Lin, Xiang Ren · 18 Apr 2021 · 214 / 180 / 0

The Power of Scale for Parameter-Efficient Prompt Tuning
Brian Lester, Rami Al-Rfou, Noah Constant · VPVLM · 18 Apr 2021 · 280 / 3,848 / 0

Model-Agnostic Meta-Learning for Fast Adaptation of Deep Networks
Chelsea Finn, Pieter Abbeel, Sergey Levine · OOD · 9 Mar 2017 · 359 / 11,684 / 0