PATS: Sensitivity-aware Noisy Learning for Pretrained Language Models
arXiv:2210.12403
22 October 2022
Yupeng Zhang, Hongzhi Zhang, Sirui Wang, Wei Wu, Zhoujun Li
Topic: AAML

Papers citing "PATS: Sensitivity-aware Noisy Learning for Pretrained Language Models"

5 / 5 papers shown
1. Bold but Cautious: Unlocking the Potential of Personalized Federated Learning through Cautiously Aggressive Collaboration
   Xinghao Wu, Xuefeng Liu, Jianwei Niu, Guogang Zhu, Shaojie Tang
   FedML · 21 / 17 / 0 · 20 Sep 2023

2. Pre-trained Models for Natural Language Processing: A Survey
   Xipeng Qiu, Tianxiang Sun, Yige Xu, Yunfan Shao, Ning Dai, Xuanjing Huang
   LM&MA, VLM · 243 / 1,452 / 0 · 18 Mar 2020

3. FreeLB: Enhanced Adversarial Training for Natural Language Understanding
   Chen Zhu, Yu Cheng, Zhe Gan, S. Sun, Tom Goldstein, Jingjing Liu
   AAML · 223 / 438 / 0 · 25 Sep 2019

4. Mixout: Effective Regularization to Finetune Large-scale Pretrained Language Models
   Cheolhyoung Lee, Kyunghyun Cho, Wanmo Kang
   MoE · 249 / 205 / 0 · 25 Sep 2019

5. GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding
   Alex Wang, Amanpreet Singh, Julian Michael, Felix Hill, Omer Levy, Samuel R. Bowman
   ELM · 297 / 6,959 / 0 · 20 Apr 2018