arXiv:2301.13273

Near Optimal Private and Robust Linear Regression

30 January 2023
Xiyang Liu, Prateek Jain, Weihao Kong, Sewoong Oh, A. Suggala
Abstract

We study the canonical statistical estimation problem of linear regression from $n$ i.i.d. examples under $(\varepsilon,\delta)$-differential privacy when some response variables are adversarially corrupted. We propose a variant of the popular differentially private stochastic gradient descent (DP-SGD) algorithm with two innovations: a full-batch gradient descent to improve sample complexity and a novel adaptive clipping to guarantee robustness. When there is no adversarial corruption, this algorithm improves upon the existing state-of-the-art approach and achieves a near optimal sample complexity. Under label corruption, this is the first efficient linear regression algorithm to guarantee both $(\varepsilon,\delta)$-DP and robustness. Synthetic experiments confirm the superiority of our approach.
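The two ingredients named in the abstract, full-batch private gradient descent and adaptive clipping, can be sketched in a few lines. The following is a minimal illustration only: the squared loss, the quantile-based clipping rule, the advanced-composition-style noise calibration, and all function names and hyperparameters are assumptions made for the sketch, not the paper's actual algorithm or privacy accounting.

```python
import numpy as np

def dp_robust_linear_regression(X, y, epsilon, delta, T=50, lr=0.5,
                                clip_quantile=0.9, rng=None):
    """Sketch of full-batch DP gradient descent with adaptive per-example
    clipping for linear regression. Hypothetical illustration; the paper's
    adaptive clipping and privacy accounting differ."""
    rng = np.random.default_rng() if rng is None else rng
    n, d = X.shape
    theta = np.zeros(d)
    # Crude noise calibration with advanced-composition-style scaling over T
    # full-batch steps (an assumption; not the paper's accounting).
    sigma = np.sqrt(2.0 * T * np.log(1.25 / delta)) / epsilon
    for _ in range(T):
        # Per-example gradients of the squared loss 0.5 * (x^T theta - y)^2.
        residuals = X @ theta - y                  # shape (n,)
        grads = residuals[:, None] * X             # shape (n, d)
        norms = np.linalg.norm(grads, axis=1)
        # "Adaptive clipping" stand-in: set the threshold to a quantile of the
        # gradient norms so a few corrupted labels cannot dominate the average.
        # In a real DP algorithm this quantile must itself be computed privately.
        C = np.quantile(norms, clip_quantile)
        scale = np.minimum(1.0, C / np.maximum(norms, 1e-12))
        clipped_mean = (grads * scale[:, None]).mean(axis=0)
        # Gaussian noise scaled to the per-step sensitivity of the clipped mean.
        noisy_grad = clipped_mean + rng.normal(0.0, sigma * C / n, size=d)
        theta -= lr * noisy_grad
    return theta

# Usage on synthetic data with a small fraction of corrupted labels.
rng = np.random.default_rng(0)
n, d = 5000, 10
X = rng.normal(size=(n, d))
theta_star = rng.normal(size=d)
y = X @ theta_star + 0.1 * rng.normal(size=n)
y[:100] += 50.0  # adversarially corrupt a few responses
theta_hat = dp_robust_linear_regression(X, y, epsilon=1.0, delta=1e-5, rng=rng)
print(np.linalg.norm(theta_hat - theta_star))
```

The point of the sketch is that clipping at a data-dependent norm threshold bounds the influence of any single response on the averaged gradient, which serves both goals at once: bounded sensitivity for the privacy noise, and bounded per-example influence for robustness to corrupted labels.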
