
On the hardness of the Learning with Errors problem with a discrete reproducible error distribution

Abstract

In this work we show that the hardness of the Learning with Errors problem with errors drawn from the discrete Gaussian distribution implies the hardness of the Learning with Errors problem with errors drawn from the symmetric Skellam distribution. By the sample-preserving search-to-decision reduction of Micciancio and Mol, the same result applies to the decisional version of the problem. Thus, we obtain a variant of the Learning with Errors problem that is hard based on conjecturally hard lattice problems and uses a discrete error distribution that, like the continuous Gaussian distribution, is closed under convolution. As an application of this result, we construct a post-quantum cryptographic protocol for differentially private data analysis in the distributed model. The security of this protocol is based on the hardness of the new variant of the Decisional Learning with Errors problem. A notable feature of this protocol is that the same noise serves both for security and for differential privacy, resulting in an efficiency gain.
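To make the error distribution concrete: a symmetric Skellam(μ) random variable is the difference of two i.i.d. Poisson(μ) variables, and sums of independent Skellam variables are again Skellam (closed under convolution). The sketch below is illustrative and not taken from the paper; the function names (`sample_skellam`, `lwe_sample`) and the small parameters are hypothetical, and the Poisson sampler uses Knuth's method, which is only suitable for small μ.

```python
import math
import random

def sample_poisson(lam, rng=random):
    # Knuth's method: multiply uniforms until the product drops below e^{-lam}.
    # Simple and exact, but only efficient for small lam.
    L = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def sample_skellam(mu, rng=random):
    # Symmetric Skellam(mu): difference of two i.i.d. Poisson(mu) draws.
    # Mean 0, variance 2*mu. Closure under convolution:
    # Skellam(mu1) + Skellam(mu2) is distributed as Skellam(mu1 + mu2).
    return sample_poisson(mu, rng) - sample_poisson(mu, rng)

def lwe_sample(s, q, mu, rng=random):
    # One LWE sample (a, b) with Skellam error:
    # a uniform in Z_q^n, b = <a, s> + e mod q.
    a = [rng.randrange(q) for _ in range(len(s))]
    e = sample_skellam(mu, rng)
    b = (sum(ai * si for ai, si in zip(a, s)) + e) % q
    return a, b
```

Because the noise is closed under convolution, aggregating several parties' noisy contributions again yields Skellam-distributed noise of a predictable variance, which is what lets the same noise serve both the LWE security argument and the differential-privacy guarantee.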
