
Private Non-smooth Empirical Risk Minimization and Stochastic Convex Optimization in Subquadratic Steps

Abstract

We study the differentially private Empirical Risk Minimization (ERM) and Stochastic Convex Optimization (SCO) problems for non-smooth convex functions. We obtain a (nearly) optimal bound on the excess empirical risk and excess population loss with subquadratic gradient complexity. More precisely, our differentially private algorithm requires $O(\frac{N^{3/2}}{d^{1/8}} + \frac{N^2}{d})$ gradient queries for optimal excess empirical risk, which is achieved with the help of subsampling and smoothing the function via convolution. This is the first subquadratic algorithm for the non-smooth case when $d$ is super-constant. As a direct application, using the iterative localization approach of Feldman et al. \cite{fkt20}, we achieve the optimal excess population loss for the stochastic convex optimization problem, with $O(\min\{N^{5/4}d^{1/8}, \frac{N^{3/2}}{d^{1/8}}\})$ gradient queries. Our work makes progress towards resolving a question raised by Bassily et al. \cite{bfgt20}, giving the first algorithms for private ERM and SCO with subquadratic steps. We note that, independently, Asi et al. \cite{afkt21} gave other algorithms for private ERM and SCO with subquadratic steps.
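The smoothing referred to above is convolution with the uniform distribution over a small Euclidean ball, a standard construction that turns a non-smooth Lipschitz loss into a smooth one whose gradient can be estimated by averaging subgradients of the original function at perturbed points. The following Python sketch illustrates that standard randomized-smoothing estimator only; it is not the paper's private algorithm, and the helper names and parameters (`sample_unit_ball`, `smoothed_grad`, `beta`, `m`) are hypothetical choices for illustration.

```python
import numpy as np

def sample_unit_ball(d, rng):
    """Draw a point uniformly from the d-dimensional Euclidean unit ball."""
    v = rng.standard_normal(d)
    v /= np.linalg.norm(v)          # uniform direction on the sphere
    r = rng.uniform() ** (1.0 / d)  # radius with density proportional to r^(d-1)
    return r * v

def smoothed_grad(grad_f, x, beta, m, rng):
    """Monte Carlo estimate of the gradient of the beta-smoothed function
    f_beta(x) = E_{u ~ U(B_2)}[f(x + beta * u)], obtained by averaging m
    (sub)gradients of the original non-smooth f at perturbed points."""
    g = np.zeros_like(x)
    for _ in range(m):
        g += grad_f(x + beta * sample_unit_ball(x.size, rng))
    return g / m

# Hypothetical usage with the non-smooth loss f(x) = ||x||_1,
# whose subgradient is sign(x).
rng = np.random.default_rng(0)
x = np.ones(10)
g = smoothed_grad(np.sign, x, beta=0.1, m=100, rng=rng)
```

In a DP-SGD-style method, such averaged gradients over a subsampled minibatch would typically be combined with Gaussian noise to obtain the differential privacy guarantee; the specific combination of subsampling, smoothing radius, and noise is what the paper analyzes.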

