
Tractability of approximation by general shallow networks

Abstract

In this paper, we present a sharper version of the results in the paper "Dimension independent bounds for general shallow networks", Neural Networks, \textbf{123} (2020), 142-152. Let $\mathbb{X}$ and $\mathbb{Y}$ be compact metric spaces. We consider approximation of functions of the form $x\mapsto\int_{\mathbb{Y}} G(x, y)\,d\tau(y)$, $x\in\mathbb{X}$, by $G$-networks of the form $x\mapsto \sum_{k=1}^n a_k G(x, y_k)$, $y_1,\dots, y_n\in\mathbb{Y}$, $a_1,\dots, a_n\in\mathbb{R}$. Defining the dimensions of $\mathbb{X}$ and $\mathbb{Y}$ in terms of covering numbers, we obtain dimension independent bounds on the degree of approximation in terms of $n$, where the constants involved depend at most polynomially on the dimensions. Applications include approximation by power rectified linear unit networks, zonal function networks, certain radial basis function networks, as well as the important problem of function extension to higher dimensional spaces.
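
As a rough illustration of the kind of approximation described above (not the paper's construction or its rates), the sketch below takes a hypothetical power-ReLU kernel $G(x,y)=\max(\langle x,y\rangle,0)^r$ on a sphere, and approximates the target $x\mapsto\int_{\mathbb{Y}} G(x,y)\,d\tau(y)$ by an $n$-term $G$-network whose centers $y_k$ are Monte Carlo samples from $\tau$ with equal coefficients $a_k=1/n$. All names, the choice of kernel, measure, and sampling scheme are illustrative assumptions.

```python
# Illustrative sketch (assumptions, not the paper's method): approximate
# F(x) = \int_Y G(x, y) dtau(y) by an n-term G-network
# x |-> sum_k a_k G(x, y_k), with y_k ~ tau and a_k = 1/n.
import numpy as np

rng = np.random.default_rng(0)

def G(x, y, r=2):
    """Hypothetical power rectified linear unit kernel: max(<x, y>, 0)^r."""
    return np.maximum(x @ y.T, 0.0) ** r

def sample_sphere(n, d, rng):
    """Draw n points uniformly from the unit sphere in R^d."""
    v = rng.standard_normal((n, d))
    return v / np.linalg.norm(v, axis=1, keepdims=True)

d = 10                                   # ambient dimension (assumed)
y_dense = sample_sphere(20000, d, rng)   # fine sample standing in for tau
x_test = sample_sphere(500, d, rng)      # evaluation points in X

# Reference values of F(x) = \int G(x, y) dtau(y), tau = uniform measure,
# approximated by a very fine average over y_dense.
F = G(x_test, y_dense).mean(axis=1)

# n-term G-network: centers y_k sampled from tau, coefficients a_k = 1/n.
for n in (10, 100, 1000):
    y_k = sample_sphere(n, d, rng)
    F_n = G(x_test, y_k).mean(axis=1)    # sum_k (1/n) G(x, y_k)
    print(f"n = {n:5d}   sup-norm error ~ {np.max(np.abs(F - F_n)):.4f}")
```

The observed error of this naive Monte Carlo choice decays only at the generic $n^{-1/2}$ rate; the point of the paper is to obtain sharper, dimension independent bounds with constants depending at most polynomially on the dimensions.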
