ResearchTrend.AI

arXiv:2401.08260
Fast Kernel Summation in High Dimensions via Slicing and Fourier Transforms

16 January 2024
Johannes Hertrich
Abstract

Kernel-based methods are heavily used in machine learning. However, they suffer from O(N^2) complexity in the number N of considered data points. In this paper, we propose an approximation procedure which reduces this complexity to O(N). Our approach is based on two ideas. First, we prove that any radial kernel with an analytic basis function can be represented as a sliced version of some one-dimensional kernel, and we derive an analytic formula for the one-dimensional counterpart. It turns out that the relation between the one- and d-dimensional kernels is given by a generalized Riemann-Liouville fractional integral. Hence, we can reduce the d-dimensional kernel summation to a one-dimensional setting. Second, to solve these one-dimensional problems efficiently, we apply fast Fourier summation on non-equispaced data, a sorting algorithm, or a combination of both. Due to its practical importance, we pay special attention to the Gaussian kernel, for which we show a dimension-independent error bound and represent its one-dimensional counterpart via a closed-form Fourier transform. We provide a run-time comparison and error estimates for our fast kernel summations.
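To illustrate the slicing idea, the sketch below uses the negative distance kernel K(x, y) = -||x - y||, a special case where the sliced representation is exact with a standard constant c_d = E[|<e_1, xi>|] (xi uniform on the unit sphere), and where the one-dimensional summations can be done in O(N log N) by sorting and prefix sums, as the abstract's "sorting algorithm" suggests. This is not the paper's implementation — the paper treats general radial kernels with analytic basis functions and derives the exact one-dimensional counterparts — and all function names here are illustrative.

```python
import math
import random

def negative_distance_sums_1d(t):
    """For 1-D points t, return s_i = sum_j |t_i - t_j| for every i
    in O(N log N) using sorting and prefix sums (instead of O(N^2))."""
    order = sorted(range(len(t)), key=lambda i: t[i])
    ts = [t[i] for i in order]
    n = len(ts)
    prefix = [0.0]
    for v in ts:
        prefix.append(prefix[-1] + v)
    out = [0.0] * n
    for r, i in enumerate(order):
        # points to the left of rank r contribute ts[r] - ts[j], those to
        # the right contribute ts[j] - ts[r]; both are prefix-sum expressions
        left = r * ts[r] - prefix[r]
        right = (prefix[n] - prefix[r + 1]) - (n - 1 - r) * ts[r]
        out[i] = left + right
    return out

def sliced_negative_distance_kernel_sums(xs, num_projections=2000, seed=0):
    """Monte Carlo sliced approximation of s_i = -sum_j ||x_i - x_j||:
    project onto random directions, solve the 1-D problem by sorting,
    and rescale by the slicing constant c_d."""
    rng = random.Random(seed)
    n, d = len(xs), len(xs[0])
    # c_d = E[|<e_1, xi>|] for xi uniform on the unit sphere in R^d
    c_d = math.gamma(d / 2) / (math.sqrt(math.pi) * math.gamma((d + 1) / 2))
    acc = [0.0] * n
    for _ in range(num_projections):
        g = [rng.gauss(0, 1) for _ in range(d)]
        norm = math.sqrt(sum(v * v for v in g))
        xi = [v / norm for v in g]
        proj = [sum(x[k] * xi[k] for k in range(d)) for x in xs]
        for i, s in enumerate(negative_distance_sums_1d(proj)):
            acc[i] += s
    return [-(a / num_projections) / c_d for a in acc]
```

Each projection costs O(Nd + N log N), so P projections give O(PN(d + log N)) total — linear in N, in contrast to the O(N^2 d) naive double loop. For smooth kernels such as the Gaussian, the sorting step would be replaced by the fast Fourier summation on non-equispaced data mentioned in the abstract.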
