An Exponential Improvement on the Memorization Capacity of Deep Threshold Networks

14 June 2021
Shashank Rajput, Kartik K. Sreenivasan, Dimitris Papailiopoulos, Amin Karbasi
ArXiv · PDF · HTML

Papers citing "An Exponential Improvement on the Memorization Capacity of Deep Threshold Networks"

11 / 11 papers shown
  1. Memorization Capacity for Additive Fine-Tuning with Small ReLU Networks
     Jy-yong Sohn, Dohyun Kwon, Seoyeon An, Kangwook Lee (01 Aug 2024)
  2. Memorization with neural nets: going beyond the worst case
     S. Dirksen, Patrick Finke, Martin Genzel (30 Sep 2023)
  3. Are Transformers with One Layer Self-Attention Using Low-Rank Weight Matrices Universal Approximators?
     T. Kajitsuka, Issei Sato (26 Jul 2023)
  4. Memorization Capacity of Multi-Head Attention in Transformers
     Sadegh Mahdavi, Renjie Liao, Christos Thrampoulidis (03 Jun 2023)
  5. Memorization Capacity of Neural Networks with Conditional Computation
     Erdem Koyuncu (20 Mar 2023)
  6. Sharp Lower Bounds on Interpolation by Deep ReLU Neural Networks at Irregularly Spaced Data
     Jonathan W. Siegel (02 Feb 2023)
  7. How Does a Deep Learning Model Architecture Impact Its Privacy? A Comprehensive Study of Privacy Attacks on CNNs and Transformers
     Guangsheng Zhang, B. Liu, Huan Tian, Tianqing Zhu, Ming Ding, Wanlei Zhou (20 Oct 2022) [PILM, MIACV]
  8. Size and depth of monotone neural networks: interpolation and approximation
     Dan Mikulincer, Daniel Reichman (12 Jul 2022)
  9. Why Robust Generalization in Deep Learning is Difficult: Perspective of Expressive Power
     Binghui Li, Jikai Jin, Han Zhong, J. Hopcroft, Liwei Wang (27 May 2022) [OOD]
  10. On the Optimal Memorization Power of ReLU Neural Networks
      Gal Vardi, Gilad Yehudai, Ohad Shamir (07 Oct 2021)
  11. VC dimension of partially quantized neural networks in the overparametrized regime
      Yutong Wang, Clayton D. Scott (06 Oct 2021)