
Neural network integral representations with the ReLU activation function

Abstract

In this effort, we derive a formula for the integral representation of a shallow neural network with the ReLU activation function. We assume that the outer weights admit a finite $L_1$-norm with respect to the Lebesgue measure on the sphere. For univariate target functions we further provide a closed-form formula for all possible representations. In this case, our formula additionally allows one to explicitly solve for the least $L_1$-norm neural network representation of a given function.
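To fix ideas, a shallow ReLU network integral representation is commonly written in the following form (a sketch of the standard convention only; the symbols $c$, $w$, and $b$ are illustrative and may differ from the paper's notation):

\[
f(x) \;=\; \int_{\mathbb{S}^{d-1} \times \mathbb{R}} \bigl(\langle w, x\rangle - b\bigr)_+ \, c(w, b)\, dw\, db,
\]

where the outer-weight density $c$ is assumed to satisfy $\int |c(w,b)|\, dw\, db < \infty$, corresponding to the finite $L_1$-norm condition stated above.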
