A representer theorem for deep neural networks
M. Unser
26 February 2018 · arXiv:1802.09210

Papers citing "A representer theorem for deep neural networks"

22 / 22 papers shown
  1. Kolmogorov-Arnold Networks in Low-Data Regimes: A Comparative Study with Multilayer Perceptrons
     Farhad Pourkamali-Anaraki · 16 Sep 2024
  2. Controlled Learning of Pointwise Nonlinearities in Neural-Network-Like Architectures
     Michael Unser, Alexis Goujon, Stanislas Ducotterd · 23 Aug 2024
  3. Parseval Convolution Operators and Neural Networks
     Michael Unser, Stanislas Ducotterd · 19 Aug 2024
  4. On the Geometry of Deep Learning
     Randall Balestriero, Ahmed Imtiaz Humayun, Richard G. Baraniuk · AI4CE · 09 Aug 2024
  5. Learning with Norm Constrained, Over-parameterized, Two-layer Neural Networks
     Fanghui Liu, L. Dadi, V. Cevher · 29 Apr 2024
  6. When Deep Learning Meets Polyhedral Theory: A Survey
     Joey Huchette, Gonzalo Muñoz, Thiago Serra, Calvin Tsay · AI4CE · 29 Apr 2023
  7. A Neural-Network-Based Convex Regularizer for Inverse Problems
     Alexis Goujon, Sebastian Neumayer, Pakshal Bohra, Stanislas Ducotterd, M. Unser · 22 Nov 2022
  8. Duality for Neural Networks through Reproducing Kernel Banach Spaces
     L. Spek, T. J. Heeringa, Felix L. Schwenninger, C. Brune · 09 Nov 2022
  9. Real Image Super-Resolution using GAN through modeling of LR and HR process
     Rao Muhammad Umer, C. Micheloni · 19 Oct 2022
  10. Sparse Deep Neural Network for Nonlinear Partial Differential Equations
      Yuesheng Xu, T. Zeng · 27 Jul 2022
  11. Approximation of Lipschitz Functions using Deep Spline Neural Networks
      Sebastian Neumayer, Alexis Goujon, Pakshal Bohra, M. Unser · 13 Apr 2022
  12. Fully-Connected Network on Noncompact Symmetric Space and Ridgelet Transform based on Helgason-Fourier Analysis
      Sho Sonoda, Isao Ishikawa, Masahiro Ikeda · 03 Mar 2022
  13. A Data-Augmentation Is Worth A Thousand Samples: Exact Quantification From Analytical Augmented Sample Moments
      Randall Balestriero, Ishan Misra, Yann LeCun · 16 Feb 2022
  14. Measuring Complexity of Learning Schemes Using Hessian-Schatten Total Variation
      Shayan Aziznejad, Joaquim Campos, M. Unser · 12 Dec 2021
  15. Training Neural Networks for Solving 1-D Optimal Piecewise Linear Approximation
      Hangcheng Dong, Jing-Xiao Liao, Yan Wang, Yixin Chen, Bingguo Liu, Dong Ye, Guodong Liu · 14 Oct 2021
  16. What Kinds of Functions do Deep Neural Networks Learn? Insights from Variational Spline Theory
      Rahul Parhi, Robert D. Nowak · MLT · 07 May 2021
  17. Fast Jacobian-Vector Product for Deep Networks
      Randall Balestriero, Richard Baraniuk · 01 Apr 2021
  18. Deep Neural Networks Are Effective At Learning High-Dimensional Hilbert-Valued Functions From Limited Data
      Ben Adcock, Simone Brugiapaglia, N. Dexter, S. Moraga · 11 Dec 2020
  19. Native Banach spaces for splines and variational inverse problems
      M. Unser, Julien Fageot · 24 Apr 2019
  20. A unifying representer theorem for inverse problems and machine learning
      M. Unser · 02 Mar 2019
  21. On the Spectral Bias of Neural Networks
      Nasim Rahaman, A. Baratin, Devansh Arpit, Felix Dräxler, Min-Bin Lin, Fred Hamprecht, Yoshua Bengio, Aaron Courville · 22 Jun 2018
  22. Mad Max: Affine Spline Insights into Deep Learning
      Randall Balestriero, Richard Baraniuk · AI4CE · 17 May 2018