
Differentially Private Nonparametric Regression Under a Growth Condition

Abstract

Given a real-valued hypothesis class $\mathcal{H}$, we investigate under what conditions there is a differentially private algorithm which learns an optimal hypothesis from $\mathcal{H}$ given i.i.d. data. Inspired by recent results for the related setting of binary classification (Alon et al., 2019; Bun et al., 2020), where it was shown that online learnability of a binary class is necessary and sufficient for its private learnability, Jung et al. (2020) showed that in the setting of regression, online learnability of $\mathcal{H}$ is necessary for private learnability. Here online learnability of $\mathcal{H}$ is characterized by the finiteness of its $\eta$-sequential fat shattering dimension, ${\rm sfat}_\eta(\mathcal{H})$, for all $\eta > 0$. In terms of sufficient conditions for private learnability, Jung et al. (2020) showed that $\mathcal{H}$ is privately learnable if $\lim_{\eta \downarrow 0} {\rm sfat}_\eta(\mathcal{H})$ is finite, which is a fairly restrictive condition. We show that under the relaxed condition $\liminf_{\eta \downarrow 0} \eta \cdot {\rm sfat}_\eta(\mathcal{H}) = 0$, $\mathcal{H}$ is privately learnable, establishing the first nonparametric private learnability guarantee for classes $\mathcal{H}$ with ${\rm sfat}_\eta(\mathcal{H})$ diverging as $\eta \downarrow 0$. Our techniques involve a novel filtering procedure to output stable hypotheses for nonparametric function classes.
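
To illustrate the gap between the two sufficient conditions, consider a hypothetical class whose dimension grows logarithmically, say ${\rm sfat}_\eta(\mathcal{H}) = C \log(1/\eta)$ for some constant $C > 0$ (this specific growth rate is our own example, not one taken from the paper). Then $\lim_{\eta \downarrow 0} {\rm sfat}_\eta(\mathcal{H}) = \infty$, so the finite-limit condition of Jung et al. (2020) fails, yet
\[
  \liminf_{\eta \downarrow 0} \eta \cdot {\rm sfat}_\eta(\mathcal{H})
  = \lim_{\eta \downarrow 0} C\,\eta \log(1/\eta) = 0,
\]
so the relaxed growth condition holds and such a class would be privately learnable under the present result.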
