Provable Memorization via Deep Neural Networks using Sub-linear Parameters

26 October 2020
Sejun Park
Jaeho Lee
Chulhee Yun
Jinwoo Shin
arXiv:2010.13363
Abstract

It is known that $O(N)$ parameters are sufficient for neural networks to memorize arbitrary $N$ input-label pairs. By exploiting depth, we show that $O(N^{2/3})$ parameters suffice to memorize $N$ pairs, under a mild condition on the separation of input points. In particular, deeper networks (even with width $3$) are shown to memorize more pairs than shallow networks, which also agrees with the recent line of work on the benefits of depth for function approximation. We also provide empirical results that support our theoretical findings.
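The paper's result is an explicit, provable construction; the snippet below is only a minimal empirical sketch of the memorization setup the abstract describes, not the authors' method. It trains a deep, narrow (width-3) ReLU network by gradient descent on $N$ random input-label pairs and reports how many it fits; every hyperparameter here (N, input dimension, depth, optimizer, step count) is an illustrative assumption.

```python
# Sketch: memorization of N random input-label pairs by a deep, width-3 MLP.
# Illustrative only; plain gradient training on such narrow nets may not reach
# 100% fit, whereas the paper gives an explicit construction with guarantees.
import torch
import torch.nn as nn

torch.manual_seed(0)

N, d, num_classes = 200, 16, 2     # N pairs, input dimension d (assumed values)
depth, width = 30, 3               # deep but very narrow: hidden width 3

# Random, well-separated inputs with arbitrary labels to memorize.
X = torch.randn(N, d)
y = torch.randint(0, num_classes, (N,))

layers = [nn.Linear(d, width), nn.ReLU()]
for _ in range(depth - 1):
    layers += [nn.Linear(width, width), nn.ReLU()]
layers += [nn.Linear(width, num_classes)]
model = nn.Sequential(*layers)

n_params = sum(p.numel() for p in model.parameters())
print(f"parameters: {n_params}  (N = {N})")

opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for step in range(3000):
    opt.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    opt.step()

acc = (model(X).argmax(dim=1) == y).float().mean().item()
print(f"fraction of pairs memorized: {acc:.3f}")
```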
