arXiv:2205.07764
On the inability of Gaussian process regression to optimally learn compositional functions
M. Giordano, Kolyan Ray, Johannes Schmidt-Hieber
16 May 2022
Papers citing "On the inability of Gaussian process regression to optimally learn compositional functions" (11 papers shown):
1. Support Collapse of Deep Gaussian Processes with Polynomial Kernels for a Wide Regime of Hyperparameters, by Daryna Chernobrovkina, Steffen Grünewälder (15 Mar 2025)
2. On strong posterior contraction rates for Besov-Laplace priors in the white noise model, by Emanuele Dolera, Stefano Favaro, Matteo Giordano (11 Nov 2024)
3. How DNNs break the Curse of Dimensionality: Compositionality and Symmetry Learning, by Arthur Jacot, Seok Hoan Choi, Yuxiao Wen (08 Jul 2024) [AI4CE]
4. Deep Horseshoe Gaussian Processes, by Ismael Castillo, Thibault Randrianarisoa (04 Mar 2024) [BDL, UQCV]
5. Deep Gaussian Process Priors for Bayesian Inference in Nonlinear Inverse Problems, by Kweku Abraham, Neil Deo (21 Dec 2023)
6. Deep Latent Force Models: ODE-based Process Convolutions for Bayesian Deep Learning, by Thomas Baldwin-McDonald, Mauricio A. Álvarez (24 Nov 2023)
7. Besov-Laplace priors in density estimation: optimal posterior contraction rates and adaptation, by M. Giordano (30 Aug 2022)
8. What Can Be Learnt With Wide Convolutional Neural Networks?, by Francesco Cagnetta, Alessandro Favero, M. Wyart (01 Aug 2022) [MLT]
9. Posterior contraction rates for constrained deep Gaussian processes in density estimation and classification, by F. Bachoc, A. Lagnoux (14 Dec 2021)
10. Posterior contraction for deep Gaussian process priors, by G. Finocchio, Johannes Schmidt-Hieber (16 May 2021)
11. Benefits of depth in neural networks, by Matus Telgarsky (14 Feb 2016)