
Improved Sample Complexity for Private Nonsmooth Nonconvex Optimization

Main: 18 pages
Bibliography: 4 pages
Appendix: 4 pages
1 table
Abstract

We study differentially private (DP) optimization algorithms for stochastic and empirical objectives which are neither smooth nor convex, and propose methods that return a Goldstein-stationary point with sample complexity bounds that improve on existing works. We start by providing a single-pass $(\epsilon,\delta)$-DP algorithm that returns an $(\alpha,\beta)$-stationary point as long as the dataset is of size $\widetilde{\Omega}(\sqrt{d}/\alpha\beta^{3}+d/\epsilon\alpha\beta^{2})$, which is $\Omega(\sqrt{d})$ times smaller than the algorithm of Zhang et al. [2024] for this task, where $d$ is the dimension. We then provide a multi-pass polynomial time algorithm which further improves the sample complexity to $\widetilde{\Omega}\left(d/\beta^{2}+d^{3/4}/\epsilon\alpha^{1/2}\beta^{3/2}\right)$, by designing a sample efficient ERM algorithm, and proving that Goldstein-stationary points generalize from the empirical loss to the population loss.
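For reference, the notion of $(\alpha,\beta)$-Goldstein stationarity used in the abstract is standardly defined as follows (a sketch of the usual definition, with notation matched to the abstract: $\beta$ is the ball radius and $\alpha$ the tolerance; $\partial f$ denotes the Clarke subdifferential):

```latex
% x is an (\alpha,\beta)-stationary (Goldstein-stationary) point of f if
% some convex combination of subgradients taken within a \beta-ball of x
% has norm at most \alpha:
\min_{g \,\in\, \operatorname{conv}\bigl(\bigcup_{y \in B_{\beta}(x)} \partial f(y)\bigr)} \|g\| \;\le\; \alpha
```

Intuitively, for nonsmooth nonconvex objectives exact Clarke stationarity is unattainable in finite time, so algorithms instead target this relaxed notion.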

@article{kornowski2025_2410.05880,
  title={Improved Sample Complexity for Private Nonsmooth Nonconvex Optimization},
  author={Guy Kornowski and Daogao Liu and Kunal Talwar},
  journal={arXiv preprint arXiv:2410.05880},
  year={2025}
}
