ResearchTrend.AI
arXiv:2304.12465
Robust, randomized preconditioning for kernel ridge regression

24 April 2023
Mateo Díaz
Ethan N. Epperly
Zachary Frangella
J. Tropp
R. Webber
Abstract

This paper investigates two randomized preconditioning techniques for solving kernel ridge regression (KRR) problems with a medium to large number of data points ($10^4 \leq N \leq 10^7$), and it introduces two new methods with state-of-the-art performance. The first method, RPCholesky preconditioning, accurately solves the full-data KRR problem in $O(N^2)$ arithmetic operations, assuming sufficiently rapid polynomial decay of the kernel matrix eigenvalues. The second method, KRILL preconditioning, offers an accurate solution to a restricted version of the KRR problem involving $k \ll N$ selected data centers at a cost of $O((N + k^2) k \log k)$ operations. The proposed methods solve a broad range of KRR problems, making them ideal for practical applications.
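The general recipe behind this style of preconditioning is to build a rank-$r$ partial Cholesky approximation $K \approx FF^\top$ of the kernel matrix and use $(FF^\top + \lambda I)^{-1}$, applied via the Woodbury identity, as a preconditioner for conjugate gradient on $(K + \lambda I)\alpha = y$. The sketch below illustrates that idea with a simplified greedy (diagonal-pivoted) partial Cholesky rather than the paper's randomly pivoted RPCholesky; the toy RBF kernel, the rank `r`, and the regularization `lam` are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.sparse.linalg import cg, LinearOperator

rng = np.random.default_rng(0)

# Toy problem: N points in 2-D, RBF kernel (illustration only).
N, r, lam = 500, 50, 1e-2
X = rng.standard_normal((N, 2))
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-sq)                      # PSD kernel matrix, unit diagonal
y = rng.standard_normal(N)

# Partial Cholesky K ~= F F^T via greedy diagonal pivoting.
# (RPCholesky instead samples each pivot p with probability
# proportional to the residual diagonal d[p].)
F = np.zeros((N, r))
d = np.diag(K).copy()                # residual diagonal
for i in range(r):
    p = int(np.argmax(d))            # greedy stand-in for random pivoting
    g = K[:, p] - F @ F[p]           # residual column at pivot p
    F[:, i] = g / np.sqrt(g[p])
    d = np.clip(d - F[:, i] ** 2, 0.0, None)

# Preconditioner: apply (F F^T + lam I)^{-1} via the Woodbury identity,
# costing O(N r) per application instead of O(N^2).
M_small = np.linalg.inv(F.T @ F + lam * np.eye(r))
def prec(v):
    return (v - F @ (M_small @ (F.T @ v))) / lam

A = K + lam * np.eye(N)
alpha, info = cg(A, y, M=LinearOperator((N, N), matvec=prec))
```

With an accurate low-rank factor, the preconditioned system is well conditioned and CG converges in a small, dimension-independent number of iterations, which is what yields the $O(N^2)$ overall cost (dominated by matrix-vector products with $K$).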

View on arXiv