ResearchTrend.AI
© 2025 ResearchTrend.AI, All rights reserved.

Parallelizing Spectral Algorithms for Kernel Learning

24 October 2016
Gilles Blanchard
Nicole Mücke
Abstract

We consider a distributed learning approach in supervised learning for a large class of spectral regularization methods in an RKHS framework. The data set of size $n$ is partitioned into $m = O(n^\alpha)$ disjoint subsets. On each subset, some spectral regularization method (belonging to a large class, including in particular Kernel Ridge Regression, $L^2$-boosting and spectral cut-off) is applied. The regression function $f$ is then estimated via simple averaging, leading to a substantial reduction in computation time. We show that minimax optimal rates of convergence are preserved if $m$ grows sufficiently slowly (corresponding to an upper bound for $\alpha$) as $n \to \infty$, depending on the smoothness assumptions on $f$ and the intrinsic dimensionality. In spirit, our approach is classical.
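The divide-and-average scheme described in the abstract can be sketched in plain NumPy. This is a minimal illustration only, using Kernel Ridge Regression as the spectral regularization method; the Gaussian kernel, the regularization parameter, and all function names here are illustrative choices, not the paper's own implementation:

```python
import numpy as np

def gaussian_kernel(A, B, gamma=1.0):
    """Gaussian (RBF) kernel matrix between rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def krr_fit(X, y, lam, kernel):
    """Kernel Ridge Regression on one partition: solve (K + n*lam*I) alpha = y."""
    K = kernel(X, X)
    n = len(y)
    return np.linalg.solve(K + n * lam * np.eye(n), y)

def krr_predict(X_train, alpha, X_test, kernel):
    """Evaluate the fitted KRR estimator at the test points."""
    return kernel(X_test, X_train) @ alpha

def distributed_krr(X, y, m, lam, kernel, X_test):
    """Partition the data into m disjoint subsets, fit KRR on each subset
    independently (parallelizable), and average the m local predictions."""
    n = len(y)
    partitions = np.array_split(np.random.permutation(n), m)
    preds = [
        krr_predict(X[idx], krr_fit(X[idx], y[idx], lam, kernel), X_test, kernel)
        for idx in partitions
    ]
    return np.mean(preds, axis=0)
```

Each local solve costs $O((n/m)^3)$ instead of $O(n^3)$, which is the source of the computational saving; the paper's result is that, for $m$ growing slowly enough, this averaging loses nothing in the minimax rate.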

View on arXiv: 1610.07487