
Gradient Descent Fails to Learn High-frequency Functions and Modular Arithmetic

Abstract

Classes of target functions containing a large number of approximately orthogonal elements are known to be hard to learn by Statistical Query algorithms. Recently, this classical fact re-emerged in the theory of gradient-based optimization of neural networks. In this framework, the hardness of a class is usually quantified by the variance of the gradient with respect to a random choice of a target function. The set of functions of the form $x \to ax \bmod p$, where $a$ is taken from ${\mathbb Z}_p$, has recently attracted attention from deep learning theorists and cryptographers. This class can be understood as a subset of $p$-periodic functions on ${\mathbb Z}$ and is tightly connected with a class of high-frequency periodic functions on the real line. We present a mathematical analysis of the limitations and challenges of using gradient-based techniques to learn a high-frequency periodic function or modular multiplication from examples. We show that the variance of the gradient is negligibly small in both cases when either the frequency or the prime base $p$ is large. This in turn prevents gradient-based learning algorithms from succeeding.
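As an illustration of the hardness measure described above, the following numerical sketch estimates the variance, over a random choice of the target $a \in {\mathbb Z}_p$, of the gradient of the squared loss for a small fixed-initialization network fitted to $x \to ax \bmod p$. The architecture, input scaling, and the values of $p$ below are illustrative assumptions, not the paper's construction; the sketch only shows how one could probe the gradient-variance quantity empirically.

```python
# Minimal sketch (illustrative assumptions, not the paper's exact setup):
# estimate the variance over the random target a of the gradient of the
# population squared loss, at a fixed initialization, for targets
# f_a(x) = (a * x mod p) / p on inputs x / p.
import numpy as np

rng = np.random.default_rng(0)

def gradient_for_target(a, p, W, v):
    """Gradient of the population squared loss w.r.t. the output weights v,
    evaluated at the fixed initialization (W, v), for the target f_a."""
    x = np.arange(p) / p                      # inputs scaled to [0, 1)
    y = (a * np.arange(p) % p) / p            # target values, scaled to [0, 1)
    h = np.tanh(np.outer(x, W))               # hidden activations, shape (p, width)
    pred = h @ v                              # network output at initialization
    return 2.0 * h.T @ (pred - y) / p         # d/dv of the mean squared error

for p in [11, 101, 1009]:
    width = 64
    W = rng.normal(size=width)                # fixed first-layer weights
    v = rng.normal(size=width) / np.sqrt(width)
    grads = np.stack([gradient_for_target(a, p, W, v) for a in range(1, p)])
    # Variance of each gradient coordinate over the random target a, averaged
    # over coordinates; smallness of this quantity is the hardness signal
    # discussed in the abstract.
    var = grads.var(axis=0).mean()
    print(f"p = {p:5d}   mean per-coordinate gradient variance ~ {var:.3e}")
```

Only the output-layer gradient is inspected here for simplicity; the same estimate can be repeated for any parameter block of a larger model.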
