We study differentially private (DP) optimization algorithms for stochastic and empirical objectives which are neither smooth nor convex, and propose methods that return a Goldstein-stationary point with sample complexity bounds that improve on existing works. We start by providing a single-pass $(\epsilon,\delta)$-DP algorithm that returns an $(\alpha,\beta)$-stationary point with a sample complexity that is $\sqrt{d}$ times smaller than that of the algorithm of Zhang et al. [2024] for this task, where $d$ is the dimension. We then provide a multi-pass polynomial-time algorithm which further improves the sample complexity, by designing a sample-efficient ERM algorithm and proving that Goldstein-stationary points generalize from the empirical loss to the population loss.
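For context, the stationarity notion referenced above is standard in nonsmooth nonconvex optimization; a sketch of the usual definition follows, with the symbols $\alpha$ (gradient-norm accuracy) and $\beta$ (ball radius) following common usage rather than notation confirmed by this abstract:

```latex
% Goldstein subdifferential of a Lipschitz function f at x, with radius \beta:
%   the convex hull of all Clarke subgradients taken within a \beta-ball around x.
\partial_{\beta} f(x) \;=\; \mathrm{conv}\Big( \textstyle\bigcup_{y : \|y - x\| \le \beta} \partial f(y) \Big)

% A point x is an (\alpha,\beta)-Goldstein-stationary point if some element of
% this set has norm at most \alpha:
\min_{g \in \partial_{\beta} f(x)} \|g\| \;\le\; \alpha
```

Intuitively, since a nonsmooth nonconvex function may have no point with a small (sub)gradient reachable in finite time, one instead asks for a point near which a short convex combination of subgradients is small; this is the relaxation the returned point satisfies.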
@article{kornowski2025_2410.05880,
  title={Improved Sample Complexity for Private Nonsmooth Nonconvex Optimization},
  author={Guy Kornowski and Daogao Liu and Kunal Talwar},
  journal={arXiv preprint arXiv:2410.05880},
  year={2025}
}