Analyzing the discrepancy principle for kernelized spectral filter learning algorithms

17 April 2020
Alain Celisse
Martin Wahl

Papers citing "Analyzing the discrepancy principle for kernelized spectral filter learning algorithms"

Spectral Algorithms on Manifolds through Diffusion
Weichun Xia, Lei Shi
06 Mar 2024

On the Optimality of Misspecified Spectral Algorithms
Hao Zhang, Yicheng Li, Qian Lin
27 Mar 2023

Learning Lipschitz Functions by GD-trained Shallow Overparameterized ReLU Neural Networks
Ilja Kuzborskij, Csaba Szepesvári
28 Dec 2022

From inexact optimization to learning via gradient concentration
Bernhard Stankewitz, Nicole Mücke, Lorenzo Rosasco
09 Jun 2021

Minimum discrepancy principle strategy for choosing $k$ in $k$-NN regression
Yaroslav Averyanov, Alain Celisse
20 Aug 2020