
Quasi-Duality of Width and Depth of Neural Networks

Abstract

Here we report that the width and depth of a neural network are quasi-dual to each other; i.e., they are intrinsically connected and can, to a good degree, be converted into each other. First, we estimate the width and depth required for a network to represent a partially separable function, revealing an interchangeability between width and depth. Then, inspired by De Morgan's laws, we formulate transformations from a general ReLU network into a wide network and into a deep network, respectively, without compromising the function of the original network, thereby elaborating a quasi-duality of width and depth. We also discuss the effects of width and depth on optimization and generalization.
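The width–depth trade-off described above can be illustrated with a standard textbook construction (not the paper's own transformation): a depth-$k$ composition of a two-unit ReLU "hat" module produces a sawtooth with $2^{k-1}$ teeth, and the same piecewise-linear function can be represented by a single hidden layer with $2^k$ ReLU units. The sketch below, with hypothetical function names, checks numerically that the deep and wide representations coincide on $[0, 1]$:

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def deep_sawtooth(x, k):
    """Depth-k composition of the hat module h(y) = 2*relu(y) - 4*relu(y - 0.5).

    Uses 2 ReLU units per layer, so 2k units in total.
    """
    y = x
    for _ in range(k):
        y = 2 * relu(y) - 4 * relu(y - 0.5)
    return y

def wide_sawtooth(x, k):
    """One hidden layer with 2**k ReLU units computing the same sawtooth on [0, 1].

    The sawtooth is piecewise linear with breakpoints at j / 2**k and slopes
    alternating between +2**k and -2**k, so it is a sum of shifted ReLUs whose
    coefficients are the slope changes at each breakpoint.
    """
    n = 2 ** k
    breakpoints = np.arange(n) / n            # 0, 1/n, ..., (n-1)/n
    slope = float(n)                          # each linear piece has slope +/- 2**k
    deltas = np.empty(n)
    deltas[0] = slope                         # initial slope at x = 0
    # interior breakpoints flip the slope sign: changes of -2n, +2n, alternating
    deltas[1:] = 2 * slope * np.where(np.arange(1, n) % 2 == 1, -1.0, 1.0)
    return sum(d * relu(x - b) for d, b in zip(deltas, breakpoints))

xs = np.linspace(0.0, 1.0, 1001)
assert np.allclose(deep_sawtooth(xs, 4), wide_sawtooth(xs, 4))
```

The exponential gap (2k units in depth versus 2^k units in width for this function) shows why such conversions preserve the function exactly but not the parameter count, which is the sense in which the duality is only "quasi."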
