WARP: Word-level Adversarial ReProgramming
Annual Meeting of the Association for Computational Linguistics (ACL), 2021
1 January 2021
Karen Hambardzumyan, Hrant Khachatrian, Jonathan May
[AAML]

Papers citing "WARP: Word-level Adversarial ReProgramming" (9 of 209 papers shown)
• Compacter: Efficient Low-Rank Hypercomplex Adapter Layers. Neural Information Processing Systems (NeurIPS), 2021. Rabeeh Karimi Mahabadi, James Henderson, Sebastian Ruder. 08 Jun 2021. [MoE]
• Cross-Attention is All You Need: Adapting Pretrained Transformers for Machine Translation. Conference on Empirical Methods in Natural Language Processing (EMNLP), 2021. Mozhdeh Gheini, Xiang Ren, Jonathan May. 18 Apr 2021. [LRM]
• The Power of Scale for Parameter-Efficient Prompt Tuning. Conference on Empirical Methods in Natural Language Processing (EMNLP), 2021. Brian Lester, Rami Al-Rfou, Noah Constant. 18 Apr 2021. [VPVLM]
• Surface Form Competition: Why the Highest Probability Answer Isn't Always Right. Conference on Empirical Methods in Natural Language Processing (EMNLP), 2021. Ari Holtzman, Peter West, Vered Shwartz, Yejin Choi, Luke Zettlemoyer. 16 Apr 2021. [LRM]
• KnowPrompt: Knowledge-aware Prompt-tuning with Synergistic Optimization for Relation Extraction. The Web Conference (WWW), 2021. Xiang Chen, Ningyu Zhang, Xin Xie, Shumin Deng, Yunzhi Yao, Chuanqi Tan, Fei Huang, Luo Si, Huajun Chen. 15 Apr 2021.
• Structural Adapters in Pretrained Language Models for AMR-to-text Generation. Conference on Empirical Methods in Natural Language Processing (EMNLP), 2021. Leonardo F. R. Ribeiro, Yue Zhang, Iryna Gurevych. 16 Mar 2021.
• Prefix-Tuning: Optimizing Continuous Prompts for Generation. Annual Meeting of the Association for Computational Linguistics (ACL), 2021. Xiang Lisa Li, Percy Liang. 01 Jan 2021.
• Reflective Decoding: Beyond Unidirectional Generation with Off-the-Shelf Language Models. Annual Meeting of the Association for Computational Linguistics (ACL), 2021. Peter West, Ximing Lu, Ari Holtzman, Chandra Bhagavatula, Jena D. Hwang, Yejin Choi. 16 Oct 2020. [OffRL]
• Pre-trained Models for Natural Language Processing: A Survey. Science China Technological Sciences (Sci China Technol Sci), 2020. Xipeng Qiu, Tianxiang Sun, Yige Xu, Yunfan Shao, Ning Dai, Xuanjing Huang. 18 Mar 2020. [LM&MAVLM]