
Faster Rates of Private Stochastic Convex Optimization

Abstract

In this paper, we revisit the problem of Differentially Private Stochastic Convex Optimization (DP-SCO) and provide excess population risk bounds for some special classes of functions that are faster than the previous results for general convex and strongly convex functions. In the first part of the paper, we study the case where the population risk function satisfies the Tsybakov Noise Condition (TNC) with some parameter $\theta>1$. Specifically, we first show that, under some mild assumptions on the loss functions, there is an algorithm whose output achieves an upper bound of $\tilde{O}\big(\big(\frac{1}{\sqrt{n}}+\frac{\sqrt{d\log\frac{1}{\delta}}}{n\epsilon}\big)^{\frac{\theta}{\theta-1}}\big)$ for $(\epsilon,\delta)$-DP when $\theta\geq 2$, where $n$ is the sample size and $d$ is the dimension of the space. We then address the inefficiency issue, improve the upper bounds by $\text{Poly}(\log n)$ factors, and extend the result to the case where $\theta\geq\bar{\theta}>1$ for some known $\bar{\theta}$. Next we show that the excess population risk of population functions satisfying TNC with parameter $\theta\geq 2$ is always lower bounded by $\Omega\big(\big(\frac{d}{n\epsilon}\big)^{\frac{\theta}{\theta-1}}\big)$ and $\Omega\big(\big(\frac{\sqrt{d\log\frac{1}{\delta}}}{n\epsilon}\big)^{\frac{\theta}{\theta-1}}\big)$ for $\epsilon$-DP and $(\epsilon,\delta)$-DP, respectively. In the second part, we focus on a special case where the population risk function is strongly convex. Unlike previous studies, here we assume the loss function is {\em non-negative} and {\em the optimal value of the population risk is sufficiently small}. With these additional assumptions, we propose a new method whose output achieves an upper bound of $O\big(\frac{d\log\frac{1}{\delta}}{n^2\epsilon^2}+\frac{1}{n^{\tau}}\big)$ for any $\tau\geq 1$ in the $(\epsilon,\delta)$-DP model if the sample size $n$ is sufficiently large.
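The abstract does not define the TNC itself. In the DP-SCO literature it is commonly stated as a growth condition of the following form; this is a sketch of the standard formulation, where the growth constant $\lambda$, the $\ell_2$ norm, and the notation $\mathcal{W}^*$ for the set of population minimizers are assumptions here rather than taken from the paper:

```latex
% Tsybakov Noise Condition (growth condition) with parameter \theta > 1:
% for every w in the constraint set \mathcal{W},
F(w) - \min_{w' \in \mathcal{W}} F(w')
  \;\geq\; \lambda \cdot \min_{w^* \in \mathcal{W}^*} \lVert w - w^* \rVert_2^{\theta}.
% Larger \theta means weaker growth around the minimizers; \theta = 2 is
% quadratic growth (implied, e.g., by strong convexity), which is why the
% \theta \geq 2 regime interpolates toward the strongly convex rates.
```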
