Learning with convolution and pooling operations in kernel methods
Theodor Misiakiewicz, Song Mei
16 November 2021 · arXiv:2111.08308 · MLT

Papers citing "Learning with convolution and pooling operations in kernel methods" (16 papers shown)
1. Locality defeats the curse of dimensionality in convolutional teacher-student scenarios
   Alessandro Favero, Francesco Cagnetta, Matthieu Wyart · 16 Jun 2021

2. Learning with invariances in random features and kernel models
   Song Mei, Theodor Misiakiewicz, Andrea Montanari · 25 Feb 2021 · OOD

3. Approximation and Learning with Deep Convolutional Models: a Kernel Perspective
   A. Bietti · 19 Feb 2021

4. Generalization error of random features and kernel methods: hypercontractivity and kernel matrix concentration
   Song Mei, Theodor Misiakiewicz, Andrea Montanari · 26 Jan 2021

5. The Unreasonable Effectiveness of Patches in Deep Convolutional Kernels Methods
   L. Thiry, Michael Arbel, Eugene Belilovsky, Edouard Oyallon · 19 Jan 2021 · AAML

6. Why Are Convolutional Nets More Sample-Efficient than Fully-Connected Nets?
   Zhiyuan Li, Yi Zhang, Sanjeev Arora · 16 Oct 2020 · BDL, MLT

7. Computational Separation Between Convolutional and Fully-Connected Networks
   Eran Malach, Shai Shalev-Shwartz · 03 Oct 2020

8. When Do Neural Networks Outperform Kernel Methods?
   Behrooz Ghorbani, Song Mei, Theodor Misiakiewicz, Andrea Montanari · 24 Jun 2020

9. Enhanced Convolutional Neural Tangent Kernels
   Zhiyuan Li, Ruosong Wang, Dingli Yu, S. Du, Wei Hu, Ruslan Salakhutdinov, Sanjeev Arora · 03 Nov 2019

10. Linearized two-layers neural networks in high dimension
    Behrooz Ghorbani, Song Mei, Theodor Misiakiewicz, Andrea Montanari · 27 Apr 2019 · MLT

11. On Lazy Training in Differentiable Programming
    Lénaïc Chizat, Edouard Oyallon, Francis R. Bach · 19 Dec 2018

12. Stochastic Gradient Descent Optimizes Over-parameterized Deep ReLU Networks
    Difan Zou, Yuan Cao, Dongruo Zhou, Quanquan Gu · 21 Nov 2018 · ODL

13. Gradient Descent Provably Optimizes Over-parameterized Neural Networks
    S. Du, Xiyu Zhai, Barnabás Póczós, Aarti Singh · 04 Oct 2018 · MLT, ODL

14. End-to-End Kernel Learning with Supervised Convolutional Kernel Networks
    Julien Mairal · 20 May 2016 · SSL

15. The spectrum of kernel random matrices
    N. Karoui · 04 Jan 2010

16. Testing for Homogeneity with Kernel Fisher Discriminant Analysis
    Zaïd Harchaoui, Francis R. Bach, Eric Moulines · 07 Apr 2008