WARP: Word-level Adversarial ReProgramming
arXiv:2101.00121
1 January 2021
Karen Hambardzumyan
Hrant Khachatrian
Jonathan May
Papers citing "WARP: Word-level Adversarial ReProgramming" (8 of 208 papers shown)
- Cross-Attention is All You Need: Adapting Pretrained Transformers for Machine Translation
  Mozhdeh Gheini, Xiang Ren, Jonathan May (18 Apr 2021)
- The Power of Scale for Parameter-Efficient Prompt Tuning
  Brian Lester, Rami Al-Rfou, Noah Constant (18 Apr 2021)
- Surface Form Competition: Why the Highest Probability Answer Isn't Always Right
  Ari Holtzman, Peter West, Vered Shwartz, Yejin Choi, Luke Zettlemoyer (16 Apr 2021)
- KnowPrompt: Knowledge-aware Prompt-tuning with Synergistic Optimization for Relation Extraction
  Xiang Chen, Ningyu Zhang, Xin Xie, Shumin Deng, Yunzhi Yao, Chuanqi Tan, Fei Huang, Luo Si, Huajun Chen (15 Apr 2021)
- Structural Adapters in Pretrained Language Models for AMR-to-text Generation
  Leonardo F. R. Ribeiro, Yue Zhang, Iryna Gurevych (16 Mar 2021)
- Prefix-Tuning: Optimizing Continuous Prompts for Generation
  Xiang Lisa Li, Percy Liang (01 Jan 2021)
- Reflective Decoding: Beyond Unidirectional Generation with Off-the-Shelf Language Models
  Peter West, Ximing Lu, Ari Holtzman, Chandra Bhagavatula, Jena D. Hwang, Yejin Choi (16 Oct 2020)
- Pre-trained Models for Natural Language Processing: A Survey
  Xipeng Qiu, Tianxiang Sun, Yige Xu, Yunfan Shao, Ning Dai, Xuanjing Huang (18 Mar 2020)