ResearchTrend.AI

arXiv:2410.21053
Computable Lipschitz Bounds for Deep Neural Networks

28 October 2024
Moreno Pintore
Bruno Després
Abstract

Deriving sharp and computable upper bounds on the Lipschitz constant of deep neural networks is crucial to formally guarantee the robustness of neural-network-based models. We analyse three existing upper bounds written for the ℓ² norm. We highlight the importance of working with the ℓ¹ and ℓ∞ norms, and we propose two novel bounds for both feed-forward fully-connected neural networks and convolutional neural networks. We handle the technical difficulties specific to convolutional neural networks with two different methods, called explicit and implicit. Several numerical tests empirically confirm the theoretical results, help to quantify the relationship between the presented bounds, and establish the better accuracy of the new bounds. Four numerical tests are studied: two where the output is derived from an analytical closed form; one with random matrices; and one with convolutional neural networks trained on the MNIST dataset. We observe that one of our bounds is optimal, in the sense that it is exact for the first test, which has the simplest analytical form, and it is tighter than the other bounds for the remaining tests.
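The paper's own bounds are not reproduced on this page, but the classical baseline they refine is easy to state: for a feed-forward network with 1-Lipschitz activations (e.g. ReLU), the product of the induced operator norms of the layer weight matrices upper-bounds the Lipschitz constant, in any of the ℓ¹, ℓ², or ℓ∞ norms. A minimal NumPy sketch of this baseline (the function name and the random layer shapes are illustrative, not from the paper):

```python
import numpy as np

def product_norm_bound(weights, ord=2):
    """Classical Lipschitz upper bound for a feed-forward network:
    the product of the induced operator norms of the weight matrices.
    Valid for 1-Lipschitz activations such as ReLU. The bounds in the
    paper refine this standard baseline; it is shown only for context.
    """
    bound = 1.0
    for W in weights:
        # np.linalg.norm with ord in {1, 2, np.inf} returns the
        # corresponding induced matrix norm (max column sum, largest
        # singular value, max row sum).
        bound *= np.linalg.norm(W, ord)
    return bound

# Two random layers mapping R^4 -> R^8 -> R^1.
rng = np.random.default_rng(0)
Ws = [rng.standard_normal((8, 4)), rng.standard_normal((1, 8))]
print(product_norm_bound(Ws, 2))       # spectral-norm product (ℓ² bound)
print(product_norm_bound(Ws, np.inf))  # induced ∞-norm product (ℓ∞ bound)
```

The ℓ¹ and ℓ∞ induced norms are cheap to evaluate exactly (column and row sums), which is one practical motivation for working in those norms rather than the ℓ² norm, whose induced norm requires a singular-value computation.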
