Fast Learning Requires Good Memory: A Time-Space Lower Bound for Parity Learning

We prove that any algorithm for learning parities requires either a memory of quadratic size or an exponential number of samples. This proves a recent conjecture of Steinhardt, Valiant and Wager and shows that for some learning problems a large storage space is crucial.

More formally, in the problem of parity learning, an unknown string $x \in \{0,1\}^n$ was chosen uniformly at random. A learner tries to learn $x$ from a stream of samples $(a_1, b_1), (a_2, b_2), \ldots$, where each $a_t$ is uniformly distributed over $\{0,1\}^n$ and $b_t$ is the inner product of $a_t$ and $x$, modulo 2.

We show that any algorithm for parity learning that uses less than $n^2/25$ bits of memory requires an exponential number of samples. Previously, there was no non-trivial lower bound on the number of samples needed, for any learning problem, even if the allowed memory size is $O(n)$ (where $n$ is the space needed to store one sample).

We also give an application of our result in the field of bounded-storage cryptography. We show an encryption scheme that requires a private key of length $n$, as well as time complexity of $n$ per encryption/decryption of each bit, and is provably and unconditionally secure as long as the attacker uses less than $n^2/25$ memory bits and the scheme is used at most an exponential number of times. Previous works on bounded-storage cryptography assumed that the memory size used by the attacker is at most linear in the time needed for encryption/decryption.
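As a concrete illustration (not taken from the paper), the following minimal Python sketch generates the parity sample stream and implements the obvious memory-heavy learner: it stores up to $n$ linearly independent equations, i.e. about $n^2$ bits of state, and recovers $x$ by Gaussian elimination over GF(2) after roughly $n$ samples. The function names are illustrative; the point of the theorem is that this quadratic memory footprint is essentially unavoidable for any learner that does not consume exponentially many samples.

```python
import random

def parity_samples(x, rng=random):
    """Stream of parity samples (a_t, b_t): a_t uniform over {0,1}^n,
    b_t = <a_t, x> mod 2."""
    n = len(x)
    while True:
        a = [rng.randint(0, 1) for _ in range(n)]
        b = sum(ai & xi for ai, xi in zip(a, x)) % 2
        yield a, b

def learn_parity(samples, n):
    """Memory-heavy baseline: keep n linearly independent equations
    (about n^2 bits of state) and recover x by elimination over GF(2)."""
    basis = {}  # pivot position -> (equation as bitmask, right-hand side)
    for a, b in samples:
        v = sum(bit << i for i, bit in enumerate(a))
        # Reduce the new equation against the stored basis, highest pivot first.
        for p in sorted(basis, reverse=True):
            if (v >> p) & 1:
                row, rhs = basis[p]
                v ^= row
                b ^= rhs
        if v == 0:
            continue  # linearly dependent on what we already have
        basis[v.bit_length() - 1] = (v, b)
        if len(basis) == n:
            break
    # Back-substitute from the lowest pivot upwards to read off x.
    x = [0] * n
    for p in sorted(basis):
        row, rhs = basis[p]
        val = rhs
        for j in range(p):
            if (row >> j) & 1:
                val ^= x[j]
        x[p] = val
    return x

if __name__ == "__main__":
    rng = random.Random(0)
    secret = [rng.randint(0, 1) for _ in range(64)]
    assert learn_parity(parity_samples(secret, rng), 64) == secret
```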
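The bounded-storage application can be sketched in the same spirit. Assuming the scheme follows the natural parity-based construction (each ciphertext is a fresh random $a$ together with $\langle a, x\rangle$ XOR the message bit), the private key has length $n$ and each encryption/decryption costs time linear in $n$, matching the parameters stated above; the exact scheme in the paper may differ in its details.

```python
import random

def keygen(n, rng=random):
    """Private key: a uniformly random n-bit string x (key length n)."""
    return [rng.randint(0, 1) for _ in range(n)]

def encrypt_bit(x, m, rng=random):
    """Encrypt one bit m: send a fresh random a and <a, x> XOR m.
    One inner product, i.e. O(n) time per encrypted bit."""
    a = [rng.randint(0, 1) for _ in range(len(x))]
    pad = sum(ai & xi for ai, xi in zip(a, x)) % 2
    return a, pad ^ m

def decrypt_bit(x, ciphertext):
    """Decrypt by recomputing the inner product and XORing it off."""
    a, c = ciphertext
    pad = sum(ai & xi for ai, xi in zip(a, x)) % 2
    return c ^ pad
```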