A law of robustness for two-layers neural networks

30 September 2020
Sébastien Bubeck
Yuanzhi Li
Dheeraj M. Nagaraj
arXiv:2009.14444
Abstract

We initiate the study of the inherent tradeoffs between the size of a neural network and its robustness, as measured by its Lipschitz constant. We make a precise conjecture that, for any Lipschitz activation function and for most datasets, any two-layers neural network with $k$ neurons that perfectly fits the data must have its Lipschitz constant larger (up to a constant) than $\sqrt{n/k}$, where $n$ is the number of datapoints. In particular, this conjecture implies that overparametrization is necessary for robustness, since it means that one needs roughly one neuron per datapoint to ensure an $O(1)$-Lipschitz network, while mere data fitting of $d$-dimensional data requires only one neuron per $d$ datapoints. We prove a weaker version of this conjecture when the Lipschitz constant is replaced by an upper bound on it based on the spectral norm of the weight matrix. We also prove the conjecture in the high-dimensional regime $n \approx d$ (which we also refer to as the undercomplete case, since only $k \leq d$ is relevant here). Finally, we prove the conjecture for polynomial activation functions of degree $p$ when $n \approx d^p$. We complement these findings with experimental evidence supporting the conjecture.
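
To make the quantities in the abstract concrete, here is a minimal NumPy sketch (illustrative only, not the paper's experiments; the sizes $n$, $d$, $k$ and the random weights are placeholder assumptions). It compares the conjectured floor $\sqrt{n/k}$ against a Monte-Carlo lower estimate of the Lipschitz constant of a two-layer ReLU network, obtained from gradient norms at random inputs, and against a product-of-norms upper bound of the kind the weaker, spectral-norm version of the statement works with (the paper's exact spectral surrogate may be defined differently).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: n datapoints in dimension d, and a two-layer ReLU network
# f(x) = sum_i a_i * relu(<w_i, x>) with k neurons. The weights below are random
# stand-ins; in the setting of the conjecture they would come from a network
# that perfectly fits the data.
n, d, k = 1000, 50, 100
W = rng.normal(size=(k, d))           # inner-layer weights, one row per neuron
a = rng.normal(size=k) / np.sqrt(k)   # outer-layer weights

def grad_norm(x):
    # For ReLU, a (sub)gradient of f at x is W^T diag(1{Wx > 0}) a; its
    # Euclidean norm lower-bounds the Lipschitz constant of f.
    active = (W @ x > 0).astype(float)
    return np.linalg.norm(W.T @ (active * a))

# Monte-Carlo lower estimate of Lip(f): largest gradient norm at random inputs.
lip_lower = max(grad_norm(rng.normal(size=d)) for _ in range(2000))

# Product-of-norms upper bound, Lip(f) <= ||a||_2 * ||W||_op, valid for any
# 1-Lipschitz activation such as ReLU.
lip_upper = np.linalg.norm(a) * np.linalg.norm(W, 2)

print(f"conjectured floor sqrt(n/k):             {np.sqrt(n / k):.2f}")
print(f"gradient-based lower estimate of Lip(f): {lip_lower:.2f}")
print(f"spectral-norm upper bound on Lip(f):     {lip_upper:.2f}")
```

For a network that actually interpolates the $n$ datapoints, the conjecture predicts the lower estimate should sit above roughly $\sqrt{n/k}$; with the random weights here the three printed numbers only illustrate how each quantity is computed.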
