Deep Neural Networks Are Effective At Learning High-Dimensional Hilbert-Valued Functions From Limited Data

Ben Adcock, Simone Brugiapaglia, N. Dexter, S. Moraga
arXiv:2012.06081, 11 December 2020

Papers citing "Deep Neural Networks Are Effective At Learning High-Dimensional Hilbert-Valued Functions From Limited Data"

18 papers shown:
1. Operator Learning Using Random Features: A Tool for Scientific Computing. Nicholas H. Nelsen, Andrew M. Stuart. 12 Aug 2024.
2. Physics-informed deep learning and compressive collocation for high-dimensional diffusion-reaction equations: practical existence theory and numerics. Simone Brugiapaglia, N. Dexter, Samir Karam, Weiqi Wang. 03 Jun 2024.
3. Learning smooth functions in high dimensions: from sparse polynomials to deep neural networks. Ben Adcock, Simone Brugiapaglia, N. Dexter, S. Moraga. 04 Apr 2024.
4. Response Theory via Generative Score Modeling. L. T. Giorgini, Katherine Deck, Tobias Bischoff, Andre N. Souza. 01 Feb 2024.
5. A practical existence theorem for reduced order models based on convolutional autoencoders. N. R. Franco, Simone Brugiapaglia. 01 Feb 2024.
6. A unified framework for learning with nonlinear model classes from arbitrary linear samples. Ben Adcock, Juan M. Cardenas, N. Dexter. 25 Nov 2023.
7. Neural Snowflakes: Universal Latent Graph Inference via Trainable Latent Geometries. Haitz Sáez de Ocáriz Borde, Anastasis Kratsios. 23 Oct 2023.
8. Active Learning for Single Neuron Models with Lipschitz Non-Linearities. Aarshvi Gajjar, C. Hegde, Christopher Musco. 24 Oct 2022.
9. CAS4DL: Christoffel Adaptive Sampling for function approximation via Deep Learning. Ben Adcock, Juan M. Cardenas, N. Dexter. 25 Aug 2022.
10. Compressive Fourier collocation methods for high-dimensional diffusion equations with periodic boundary conditions. Weiqi Wang, Simone Brugiapaglia. 02 Jun 2022.
11. Optimal Learning. P. Binev, A. Bonito, Ronald A. DeVore, G. Petrova. 30 Mar 2022.
12. On efficient algorithms for computing near-best polynomial approximations to high-dimensional, Hilbert-valued functions from limited samples. Ben Adcock, Simone Brugiapaglia, N. Dexter, S. Moraga. 25 Mar 2022.
13. A phase transition for finding needles in nonlinear haystacks with LASSO artificial neural networks. Xiaoyu Ma, S. Sardy, N. Hengartner, Nikolai Bobenko, Yen Ting Lin. 21 Jan 2022.
14. Convergence Rates for Learning Linear Operators from Noisy Data. Maarten V. de Hoop, Nikola B. Kovachki, Nicholas H. Nelsen, Andrew M. Stuart. 27 Aug 2021.
15. Neural Network Training Using $\ell_1$-Regularization and Bi-fidelity Data. Subhayan De, Alireza Doostan. 27 May 2021.
16. The Random Feature Model for Input-Output Maps between Banach Spaces. Nicholas H. Nelsen, Andrew M. Stuart. 20 May 2020.
17. Numerical Solution of the Parametric Diffusion Equation by Deep Neural Networks. Moritz Geist, P. Petersen, Mones Raslan, R. Schneider, Gitta Kutyniok. 25 Apr 2020.
18. The troublesome kernel -- On hallucinations, no free lunches and the accuracy-stability trade-off in inverse problems. N. Gottschling, Vegard Antun, A. Hansen, Ben Adcock. 05 Jan 2020.