On minimal representations of shallow ReLU networks

The realization function of a shallow ReLU network is a continuous piecewise affine function $f \colon \mathbb{R}^d \to \mathbb{R}$, where the domain $\mathbb{R}^d$ is partitioned by a set of $n$ hyperplanes into cells on which $f$ is affine. We show that the minimal representation for $f$ uses either $n$, $n+1$ or $n+2$ neurons, and we characterize each of the three cases. In the particular case where the input layer is one-dimensional, minimal representations always use at most $n+1$ neurons, but in all higher-dimensional settings there are functions for which $n+2$ neurons are needed. Then we show that the set of minimal networks representing $f$ forms a $C^\infty$-submanifold $\mathcal{M}$, and we derive the dimension and the number of connected components of $\mathcal{M}$. Additionally, we give a criterion for the hyperplanes that guarantees that all continuous piecewise affine functions are realization functions of appropriate ReLU networks.
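To illustrate the opening claim, the following sketch (with hypothetical weights, not taken from the paper) evaluates a one-dimensional shallow ReLU network and checks that its realization function is affine on each cell; in one dimension the "hyperplanes" are the breakpoints $-b_i/w_i$ of the hidden neurons:

```python
import numpy as np

# Hypothetical parameters of a shallow ReLU network with 3 hidden neurons.
W1 = np.array([1.0, -2.0, 0.5])   # hidden weights
b1 = np.array([0.0, 1.0, -1.5])   # hidden biases
W2 = np.array([1.0, 0.5, -1.0])   # output weights
c = 0.25                          # output bias

def realize(x):
    """Realization function f(x) = W2 . relu(W1 x + b1) + c."""
    return W2 @ np.maximum(W1[:, None] * x + b1[:, None], 0.0) + c

# In 1D, each hidden neuron's "hyperplane" is the breakpoint -b_i / w_i.
breakpoints = np.sort(-b1 / W1)

# Between consecutive breakpoints f is affine: second-order finite
# differences vanish on an equally spaced grid inside each cell.
for left, right in zip(breakpoints[:-1], breakpoints[1:]):
    xs = np.linspace(left + 1e-6, right - 1e-6, 50)
    ys = realize(xs)
    assert np.allclose(np.diff(ys, 2), 0.0, atol=1e-9)
```

The check passes because each ReLU is affine away from its own breakpoint, so a sum of ReLUs is affine on every cell of the induced partition.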