PATS: Sensitivity-aware Noisy Learning for Pretrained Language Models
Yupeng Zhang, Hongzhi Zhang, Sirui Wang, Wei Yu Wu, Zhoujun Li
arXiv: 2210.12403 · 22 October 2022 · AAML
Papers citing "PATS: Sensitivity-aware Noisy Learning for Pretrained Language Models" (5 papers)
Bold but Cautious: Unlocking the Potential of Personalized Federated Learning through Cautiously Aggressive Collaboration
Xinghao Wu, Xuefeng Liu, Jianwei Niu, Guogang Zhu, Shaojie Tang
FedML · 20 Sep 2023
Pre-trained Models for Natural Language Processing: A Survey
Xipeng Qiu, Tianxiang Sun, Yige Xu, Yunfan Shao, Ning Dai, Xuanjing Huang
LM&MA, VLM · 18 Mar 2020
FreeLB: Enhanced Adversarial Training for Natural Language Understanding
Chen Zhu, Yu Cheng, Zhe Gan, S. Sun, Tom Goldstein, Jingjing Liu
AAML · 25 Sep 2019
Mixout: Effective Regularization to Finetune Large-scale Pretrained Language Models
Cheolhyoung Lee, Kyunghyun Cho, Wanmo Kang
MoE · 25 Sep 2019
GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding
Alex Jinpeng Wang, Amanpreet Singh, Julian Michael, Felix Hill, Omer Levy, Samuel R. Bowman
ELM · 20 Apr 2018