We study the canonical statistical task of computing the principal component from $n$ i.i.d.~data points in $d$ dimensions under $(\varepsilon,\delta)$-differential privacy. Although extensively studied in the literature, existing solutions fall short on two key aspects: $(i)$ even for Gaussian data, existing private algorithms require the number of samples $n$ to scale super-linearly with $d$, i.e., $n=\Omega(d^{3/2})$, to obtain non-trivial results, while non-private PCA requires only $n=O(d)$, and $(ii)$ existing techniques suffer from a non-vanishing error even when the randomness in each data point is arbitrarily small. We propose DP-PCA, a single-pass algorithm that overcomes both limitations. It is based on a private minibatch gradient ascent method that relies on {\em private mean estimation}, which adds the minimal noise required to ensure privacy by adapting to the variance of a given minibatch of gradients. For sub-Gaussian data, we provide nearly optimal statistical error rates even for $n=\tilde{O}(d)$. Furthermore, we provide a lower bound showing that a sub-Gaussian-style assumption is necessary for obtaining the optimal error rate.
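To make the variance-adaptive private gradient ascent idea concrete, here is a minimal sketch in Python/NumPy. It is not the paper's algorithm: the function name `private_minibatch_pca`, the robust center/spread estimates, the clipping multiplier `clip_mult`, and the basic-composition noise multiplier are all hypothetical simplifications. In particular, DP-PCA estimates the minibatch gradient mean and range privately, whereas this sketch calibrates the Gaussian noise to a non-private spread estimate purely for illustration.

```python
import numpy as np

def private_minibatch_pca(X, epsilon, delta, batch_size=64, lr=0.1,
                          clip_mult=3.0, rng=None):
    """Single-pass sketch: minibatch gradient ascent on v^T Sigma v / 2,
    with Gaussian noise whose scale adapts to the spread of each
    minibatch of gradients. Hypothetical simplification of DP-PCA."""
    rng = np.random.default_rng() if rng is None else rng
    n, d = X.shape
    v = rng.standard_normal(d)
    v /= np.linalg.norm(v)
    num_batches = n // batch_size
    # Gaussian-mechanism multiplier under basic composition (illustrative only).
    sigma_mult = np.sqrt(2 * np.log(1.25 / delta)) * num_batches / epsilon
    for t in range(num_batches):
        B = X[t * batch_size:(t + 1) * batch_size]
        grads = (B @ v)[:, None] * B           # per-sample gradients (x^T v) x
        center = np.median(grads, axis=0)      # robust center of the batch
        scale = np.median(np.abs(grads - center))  # robust spread estimate
        radius = clip_mult * scale * np.sqrt(d)
        # Clip each gradient to a ball of data-adaptive radius around the center,
        # so the noise needed for privacy shrinks with the gradient variance.
        diffs = grads - center
        norms = np.linalg.norm(diffs, axis=1, keepdims=True)
        clipped = center + diffs * np.minimum(1.0, radius / np.maximum(norms, 1e-12))
        g = clipped.mean(axis=0)
        # Sensitivity of the clipped mean is 2 * radius / batch_size.
        noise = rng.standard_normal(d) * (2 * radius / batch_size) * sigma_mult
        v = v + lr * (g + noise)
        v /= np.linalg.norm(v)                 # project back to the unit sphere
    return v
```

On synthetic sub-Gaussian data, a call such as `private_minibatch_pca(X, epsilon=1.0, delta=1e-6)` returns a unit vector approximating the top eigenvector of the covariance; because the clipping radius tracks the batch spread, the injected noise vanishes as the per-point randomness shrinks, which is the behavior the abstract highlights.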