Improved Approximation Properties of Dictionaries and Applications to Neural Networks

This article addresses the problem of approximating a function $f$ in a Hilbert space $H$ by an expansion over a dictionary $\mathbb{D}$. We introduce the notion of a smoothly parameterized dictionary and give upper bounds on the approximation rates, metric entropy and $n$-widths of the absolute convex hull, which we denote $B_1(\mathbb{D})$, of such dictionaries. The upper bounds depend upon the order of smoothness of the parameterization, and improve upon existing results in many cases. The main applications of these results are to the dictionaries corresponding to shallow neural networks with activation function $\sigma$, and to the dictionary of decaying Fourier modes corresponding to the spectral Barron space. This improves upon existing approximation rates for shallow neural networks when $\sigma(x) = \max(0,x)^k$ for $k \geq 2$, sharpens bounds on the metric entropy, and provides the first bounds on the Gelfand $n$-widths of the Barron space and spectral Barron space.
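For orientation, the central objects can be written out in standard notation. The following is a generic sketch of the usual definitions from dictionary approximation, not quoted from the paper itself; in particular the symbols $B_1(\mathbb{D})$, the rate exponent $r$, and the entropy and width notation $\epsilon_n$, $d^n$ are the common conventions and are used here illustratively.

\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
% Illustrative sketch of the standard objects; H is a Hilbert space and
% \mathbb{D} \subset H a dictionary normalized so that \|d\|_H \le 1 for all d.

% Absolute convex hull of the dictionary: the closure of all symmetric
% convex combinations of dictionary elements.
\[
  B_1(\mathbb{D}) = \overline{\Bigl\{ \sum_{j=1}^{n} a_j d_j : n \in \mathbb{N},\ d_j \in \mathbb{D},\ \sum_{j=1}^{n} |a_j| \le 1 \Bigr\}}
\]

% n-term approximation rate bounds for f in B_1(\mathbb{D}) take the form
\[
  \inf_{a_j \in \mathbb{R},\, d_j \in \mathbb{D}} \Bigl\| f - \sum_{j=1}^{n} a_j d_j \Bigr\|_H \lesssim n^{-r},
\]
% with the exponent r governed by the order of smoothness of the parameterization.

% Metric entropy: \epsilon_n(K) is the smallest \epsilon > 0 such that the set K
% can be covered by 2^n balls of radius \epsilon in H.

% Gelfand n-width of a set K in a Banach space X: the best worst-case error
% achievable from n linear measurements,
\[
  d^n(K)_X = \inf_{\substack{V \subseteq X \\ \operatorname{codim} V \le n}} \ \sup_{x \in K \cap V} \|x\|_X .
\]
\end{document}

Under this normalization, the classical Maurey–Jones–Barron argument already gives the rate $n^{-1/2}$ for the convex hull of any bounded dictionary in a Hilbert space; the point of the smooth-parameterization bounds is to improve the exponent beyond $1/2$.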