arXiv: 2006.16205
Composed Fine-Tuning: Freezing Pre-Trained Denoising Autoencoders for Improved Generalization
29 June 2020
Sang Michael Xie, Tengyu Ma, Percy Liang

Papers citing "Composed Fine-Tuning: Freezing Pre-Trained Denoising Autoencoders for Improved Generalization" (5 papers)

Do we really have to filter out random noise in pre-training data for language models?
Jinghan Ru, Yuxin Xie, Xianwei Zhuang, Yuguo Yin, Zhihui Guo, Zhiming Liu, Qianli Ren, Yuexian Zou
10 Feb 2025

Open Domain Generalization with a Single Network by Regularization Exploiting Pre-trained Features
Inseop Chung, Kiyoon Yoo, Nojun Kwak
08 Dec 2023

Backdoor Learning for NLP: Recent Advances, Challenges, and Future Research Directions
Marwan Omar
14 Feb 2023

The Power of Scale for Parameter-Efficient Prompt Tuning
Brian Lester, Rami Al-Rfou, Noah Constant
18 Apr 2021

Mixout: Effective Regularization to Finetune Large-scale Pretrained Language Models
Cheolhyoung Lee, Kyunghyun Cho, Wanmo Kang
25 Sep 2019