Exploring the Benefits of Differentially Private Pre-training and Parameter-Efficient Fine-tuning for Table Transformers
Xilong Wang, Chia-Mu Yu, Pin-Yu Chen
arXiv:2309.06526, 12 September 2023

Papers citing "Exploring the Benefits of Differentially Private Pre-training and Parameter-Efficient Fine-tuning for Table Transformers" (3 papers)

Differentially Private Fine-tuning of Language Models
Da Yu, Saurabh Naik, A. Backurs, Sivakanth Gopi, Huseyin A. Inan, ..., Y. Lee, Andre Manoel, Lukas Wutschitz, Sergey Yekhanin, Huishuai Zhang
13 Oct 2021

Federated Learning with Local Differential Privacy: Trade-offs between Privacy, Utility, and Communication
Muah Kim, Onur Gunlu, Rafael F. Schaefer
09 Feb 2021

TabTransformer: Tabular Data Modeling Using Contextual Embeddings
Xin Huang, A. Khetan, Milan Cvitkovic, Zohar Karnin
11 Dec 2020