arXiv: 2309.06526
Exploring the Benefits of Differentially Private Pre-training and Parameter-Efficient Fine-tuning for Table Transformers
12 September 2023
Xilong Wang, Chia-Mu Yu, Pin-Yu Chen
Papers citing "Exploring the Benefits of Differentially Private Pre-training and Parameter-Efficient Fine-tuning for Table Transformers" (3 of 3 papers shown)
1. Differentially Private Fine-tuning of Language Models
   Da Yu, Saurabh Naik, A. Backurs, Sivakanth Gopi, Huseyin A. Inan, ..., Y. Lee, Andre Manoel, Lukas Wutschitz, Sergey Yekhanin, Huishuai Zhang
   13 Oct 2021
2. Federated Learning with Local Differential Privacy: Trade-offs between Privacy, Utility, and Communication
   Muah Kim, Onur Gunlu, Rafael F. Schaefer
   09 Feb 2021
3. TabTransformer: Tabular Data Modeling Using Contextual Embeddings
   Xin Huang, A. Khetan, Milan Cvitkovic, Zohar Karnin
   11 Dec 2020