A Law of Robustness beyond Isoperimetry

Abstract

We study the robust interpolation problem for arbitrary data distributions supported on a bounded space and propose a two-fold law of robustness. Robust interpolation refers to the problem of interpolating $n$ noisy training data points in $\mathbb{R}^d$ by a Lipschitz function. Although this problem has been well understood when the samples are drawn from a distribution satisfying isoperimetry, much remains unknown concerning its behavior under generic or even worst-case distributions. We prove a Lipschitzness lower bound of $\Omega(\sqrt{n/p})$ for interpolating neural networks with $p$ parameters on arbitrary data distributions. With this result, we validate the law-of-robustness conjecture in prior work by Bubeck, Li, and Nagaraj for two-layer neural networks with polynomial weights. We then extend our result to arbitrary interpolating approximators and prove a Lipschitzness lower bound of $\Omega(n^{1/d})$ for robust interpolation. Our results demonstrate a two-fold law of robustness: i) we show the potential benefit of overparametrization for smooth data interpolation when $n = \mathrm{poly}(d)$, and ii) we disprove the potential existence of an $O(1)$-Lipschitz robust interpolating function when $n = \exp(\omega(d))$.
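The arithmetic behind the two regimes can be sketched directly from the stated bounds; the following is an illustration derived only from the abstract's formulas, not a derivation from the paper itself:

```latex
% Regime i): n = poly(d). The network lower bound \Omega(\sqrt{n/p})
% shrinks as the parameter count p grows, so overparametrization
% p = \omega(n) is compatible with an O(1)-Lipschitz interpolant:
\sqrt{n/p} \longrightarrow 0 \quad \text{as } p \to \infty .

% Regime ii): n = \exp(\omega(d)). The distribution-free lower bound
% \Omega(n^{1/d}) diverges, ruling out any O(1)-Lipschitz robust
% interpolating function:
n^{1/d} = \exp\!\bigl(\omega(d)/d\bigr) = \exp\bigl(\omega(1)\bigr) \to \infty .
```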
