A representer theorem for deep neural networks
M. Unser
26 February 2018 · arXiv:1802.09210
Papers citing "A representer theorem for deep neural networks" (22 of 22 papers shown):
1. Kolmogorov-Arnold Networks in Low-Data Regimes: A Comparative Study with Multilayer Perceptrons · Farhad Pourkamali-Anaraki · 16 Sep 2024
2. Controlled Learning of Pointwise Nonlinearities in Neural-Network-Like Architectures · Michael Unser, Alexis Goujon, Stanislas Ducotterd · 23 Aug 2024
3. Parseval Convolution Operators and Neural Networks · Michael Unser, Stanislas Ducotterd · 19 Aug 2024
4. On the Geometry of Deep Learning · Randall Balestriero, Ahmed Imtiaz Humayun, Richard G. Baraniuk · [AI4CE] · 09 Aug 2024
5. Learning with Norm Constrained, Over-parameterized, Two-layer Neural Networks · Fanghui Liu, L. Dadi, V. Cevher · 29 Apr 2024
6. When Deep Learning Meets Polyhedral Theory: A Survey · Joey Huchette, Gonzalo Muñoz, Thiago Serra, Calvin Tsay · [AI4CE] · 29 Apr 2023
7. A Neural-Network-Based Convex Regularizer for Inverse Problems · Alexis Goujon, Sebastian Neumayer, Pakshal Bohra, Stanislas Ducotterd, M. Unser · 22 Nov 2022
8. Duality for Neural Networks through Reproducing Kernel Banach Spaces · L. Spek, T. J. Heeringa, Felix L. Schwenninger, C. Brune · 09 Nov 2022
9. Real Image Super-Resolution using GAN through modeling of LR and HR process · Rao Muhammad Umer, C. Micheloni · 19 Oct 2022
10. Sparse Deep Neural Network for Nonlinear Partial Differential Equations · Yuesheng Xu, T. Zeng · 27 Jul 2022
11. Approximation of Lipschitz Functions using Deep Spline Neural Networks · Sebastian Neumayer, Alexis Goujon, Pakshal Bohra, M. Unser · 13 Apr 2022
12. Fully-Connected Network on Noncompact Symmetric Space and Ridgelet Transform based on Helgason-Fourier Analysis · Sho Sonoda, Isao Ishikawa, Masahiro Ikeda · 03 Mar 2022
13. A Data-Augmentation Is Worth A Thousand Samples: Exact Quantification From Analytical Augmented Sample Moments · Randall Balestriero, Ishan Misra, Yann LeCun · 16 Feb 2022
14. Measuring Complexity of Learning Schemes Using Hessian-Schatten Total Variation · Shayan Aziznejad, Joaquim Campos, M. Unser · 12 Dec 2021
15. Training Neural Networks for Solving 1-D Optimal Piecewise Linear Approximation · Hangcheng Dong, Jing-Xiao Liao, Yan Wang, Yixin Chen, Bingguo Liu, Dong Ye, Guodong Liu · 14 Oct 2021
16. What Kinds of Functions do Deep Neural Networks Learn? Insights from Variational Spline Theory · Rahul Parhi, Robert D. Nowak · [MLT] · 07 May 2021
17. Fast Jacobian-Vector Product for Deep Networks · Randall Balestriero, Richard Baraniuk · 01 Apr 2021
18. Deep Neural Networks Are Effective At Learning High-Dimensional Hilbert-Valued Functions From Limited Data · Ben Adcock, Simone Brugiapaglia, N. Dexter, S. Moraga · 11 Dec 2020
19. Native Banach spaces for splines and variational inverse problems · M. Unser, Julien Fageot · 24 Apr 2019
20. A unifying representer theorem for inverse problems and machine learning · M. Unser · 02 Mar 2019
21. On the Spectral Bias of Neural Networks · Nasim Rahaman, A. Baratin, Devansh Arpit, Felix Dräxler, Min-Bin Lin, Fred Hamprecht, Yoshua Bengio, Aaron Courville · 22 Jun 2018
22. Mad Max: Affine Spline Insights into Deep Learning · Randall Balestriero, Richard Baraniuk · [AI4CE] · 17 May 2018