
Complexity, Statistical Risk, and Metric Entropy of Deep Nets Using Total Path Variation

Abstract

For any ReLU network there is a representation in which the sum of the absolute values of the weights into each node is exactly $1$, and the input layer variables are multiplied by a value $V$ coinciding with the total variation of the path weights. Implications are given for Gaussian complexity, Rademacher complexity, statistical risk, and metric entropy, all of which are shown to be proportional to $V$. There is no dependence on the number of nodes per layer, except for the number of inputs $d$. For estimation with sub-Gaussian noise, the mean square generalization error bounds that can be obtained are of order $V \sqrt{L + \log d}/\sqrt{n}$, where $L$ is the number of layers and $n$ is the sample size.
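As a rough illustration of the rescaling behind the first claim, the sketch below (a minimal NumPy example assuming a bias-free network with a scalar output; the names forward, path_variation, and normalize are illustrative, not from the paper) normalizes each node's incoming weights to l1 norm 1 by pushing scales forward through the positive homogeneity of ReLU, and checks numerically that the single accumulated factor equals the total path variation $V$. Since a bias-free ReLU network is itself positively homogeneous, carrying $V$ at the output is equivalent to multiplying the input variables by $V$, as in the abstract's statement.

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def forward(weights, x):
    # Bias-free ReLU network: ReLU on hidden layers, linear output layer.
    h = x
    for W in weights[:-1]:
        h = relu(W @ h)
    return weights[-1] @ h

def path_variation(weights):
    # Sum over all input-to-output paths of the product of absolute weights,
    # computed via products of the entrywise-absolute weight matrices.
    P = np.abs(weights[0])
    for W in weights[1:]:
        P = np.abs(W) @ P
    return float(P.sum())

def normalize(weights):
    # Rescale so each node's incoming weights have l1 norm exactly 1.
    # ReLU is positively homogeneous (ReLU(c*z) = c*ReLU(z) for c >= 0),
    # so each node's scale can be pushed forward into the next layer
    # without changing the network function.
    new, carry = [], None
    for W in weights:
        W = W.astype(float) if carry is None else W * carry  # absorb scales from the previous layer
        d = np.abs(W).sum(axis=1)                            # incoming l1 norm of each node
        d = np.where(d > 0, d, 1.0)                          # guard against all-zero rows
        new.append(W / d[:, None])                           # rows now have l1 norm exactly 1
        carry = d
    V = float(carry.sum())  # for a scalar output this equals the total path variation
    return new, V

rng = np.random.default_rng(0)
dims = [4, 8, 6, 1]  # toy depth-3 network with d = 4 inputs
weights = [rng.normal(size=(dims[i + 1], dims[i])) for i in range(len(dims) - 1)]

normed, V = normalize(weights)
x = rng.normal(size=dims[0])
print("total path variation :", path_variation(weights))  # same value as V below
print("accumulated factor V :", V)
print("original network f(x):", forward(weights, x))
print("V * normalized f(x)  :", V * forward(normed, x))   # matches the original output
```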
