arXiv:2106.07724
An Exponential Improvement on the Memorization Capacity of Deep Threshold Networks
14 June 2021
Shashank Rajput, Kartik K. Sreenivasan, Dimitris Papailiopoulos, Amin Karbasi
Papers citing "An Exponential Improvement on the Memorization Capacity of Deep Threshold Networks" (11 papers)

1. Memorization Capacity for Additive Fine-Tuning with Small ReLU Networks
   Jy-yong Sohn, Dohyun Kwon, Seoyeon An, Kangwook Lee
   01 Aug 2024

2. Memorization with neural nets: going beyond the worst case
   S. Dirksen, Patrick Finke, Martin Genzel
   30 Sep 2023

3. Are Transformers with One Layer Self-Attention Using Low-Rank Weight Matrices Universal Approximators?
   T. Kajitsuka, Issei Sato
   26 Jul 2023

4. Memorization Capacity of Multi-Head Attention in Transformers
   Sadegh Mahdavi, Renjie Liao, Christos Thrampoulidis
   03 Jun 2023

5. Memorization Capacity of Neural Networks with Conditional Computation
   Erdem Koyuncu
   20 Mar 2023

6. Sharp Lower Bounds on Interpolation by Deep ReLU Neural Networks at Irregularly Spaced Data
   Jonathan W. Siegel
   02 Feb 2023

7. How Does a Deep Learning Model Architecture Impact Its Privacy? A Comprehensive Study of Privacy Attacks on CNNs and Transformers
   Guangsheng Zhang, B. Liu, Huan Tian, Tianqing Zhu, Ming Ding, Wanlei Zhou
   20 Oct 2022

8. Size and depth of monotone neural networks: interpolation and approximation
   Dan Mikulincer, Daniel Reichman
   12 Jul 2022

9. Why Robust Generalization in Deep Learning is Difficult: Perspective of Expressive Power
   Binghui Li, Jikai Jin, Han Zhong, J. Hopcroft, Liwei Wang
   27 May 2022

10. On the Optimal Memorization Power of ReLU Neural Networks
    Gal Vardi, Gilad Yehudai, Ohad Shamir
    07 Oct 2021

11. VC dimension of partially quantized neural networks in the overparametrized regime
    Yutong Wang, Clayton D. Scott
    06 Oct 2021