Fast Learning Requires Good Memory: A Time-Space Lower Bound for Parity Learning

Abstract

We prove that any algorithm for learning parities requires either a memory of quadratic size or an exponential number of samples. This proves a recent conjecture of Steinhardt, Valiant and Wager and shows that for some learning problems a large storage space is crucial.

More formally, in the problem of parity learning, an unknown string $x \in \{0,1\}^n$ was chosen uniformly at random. A learner tries to learn $x$ from a stream of samples $(a_1, b_1), (a_2, b_2), \ldots$, where each $a_t$ is uniformly distributed over $\{0,1\}^n$ and $b_t$ is the inner product of $a_t$ and $x$, modulo 2. We show that any algorithm for parity learning that uses less than $\frac{n^2}{25}$ bits of memory requires an exponential number of samples. Previously, there was no non-trivial lower bound on the number of samples needed for any learning problem, even if the allowed memory size is $O(n)$ (where $n$ is the space needed to store one sample).

We also give an application of our result in the field of bounded-storage cryptography. We show an encryption scheme that requires a private key of length $n$, as well as time complexity of $n$ per encryption/decryption of each bit, and is provably and unconditionally secure as long as the attacker uses less than $\frac{n^2}{25}$ memory bits and the scheme is used at most an exponential number of times. Previous works on bounded-storage cryptography assumed that the memory size used by the attacker is at most linear in the time needed for encryption/decryption.
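As an illustrative sketch (not part of the paper), the sample stream in the parity-learning problem can be generated as follows; the function name `parity_samples` and the choice of Python are our own:

```python
import secrets

def parity_samples(x, num_samples):
    """Yield samples (a_t, b_t): each a_t is a uniform n-bit vector and
    b_t is the inner product of a_t and x, modulo 2."""
    n = len(x)
    for _ in range(num_samples):
        a = [secrets.randbelow(2) for _ in range(n)]
        b = sum(ai * xi for ai, xi in zip(a, x)) % 2
        yield a, b

# The unknown string x is chosen uniformly at random from {0,1}^n.
n = 8
x = [secrets.randbelow(2) for _ in range(n)]
stream = list(parity_samples(x, 4))
```

With unbounded memory, a learner can recover $x$ from roughly $n$ such samples by Gaussian elimination over GF(2); the theorem above says that a learner restricted to fewer than $\frac{n^2}{25}$ memory bits instead needs exponentially many samples.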
