Differentially Private Nonparametric Regression Under a Growth Condition

Given a real-valued hypothesis class $\mathcal{H}$, we investigate under what conditions there is a differentially private algorithm which learns an optimal hypothesis from $\mathcal{H}$ given i.i.d. data. Inspired by recent results for the related setting of binary classification (Alon et al., 2019; Bun et al., 2020), where it was shown that online learnability of a binary class is necessary and sufficient for its private learnability, Jung et al. (2020) showed that in the setting of regression, online learnability of $\mathcal{H}$ is necessary for private learnability. Here online learnability of $\mathcal{H}$ is characterized by the finiteness of its $\eta$-sequential fat shattering dimension, ${\rm sfat}_\eta(\mathcal{H})$, for all $\eta > 0$. In terms of sufficient conditions for private learnability, Jung et al. (2020) showed that $\mathcal{H}$ is privately learnable if $\lim_{\eta \downarrow 0} {\rm sfat}_\eta(\mathcal{H})$ is finite, which is a fairly restrictive condition. We show that under the relaxed growth condition $\lim_{\eta \downarrow 0} \eta \cdot {\rm sfat}_\eta(\mathcal{H}) = 0$, $\mathcal{H}$ is privately learnable, establishing the first nonparametric private learnability guarantee for classes $\mathcal{H}$ with ${\rm sfat}_\eta(\mathcal{H})$ diverging as $\eta \downarrow 0$. Our techniques involve a novel filtering procedure to output stable hypotheses for nonparametric function classes.
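As an illustrative note (not part of the abstract): any class whose sequential fat-shattering dimension grows only logarithmically, say ${\rm sfat}_\eta(\mathcal{H}) = \Theta(\log \tfrac{1}{\eta})$, fails the finiteness condition of Jung et al. (2020) yet satisfies the relaxed growth condition, since
\[
\lim_{\eta \downarrow 0} {\rm sfat}_\eta(\mathcal{H}) = \infty
\qquad\text{while}\qquad
\lim_{\eta \downarrow 0} \eta \cdot {\rm sfat}_\eta(\mathcal{H}) \;=\; \lim_{\eta \downarrow 0} \eta \log \tfrac{1}{\eta} \;=\; 0 .
\]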